Over the years, some more historically- or sociologically-minded folk have suggested that, with the end of WWII, the reality of the atomic bomb, and the public knowledge (including images) of what had really happened at the death camps, horror films were no longer needed. Or maybe no longer wanted. There was more than enough horror in the real world. I don't necessarily buy the theory, but the timing is pretty damn impeccable. For all intents and purposes, the end of the war and the return home of the G.I.s coincided very neatly with the death of the horror film, and with the rise of film noir. I've got no horse in this race, but I have heard the theory expounded; make of it what you will.