This is off the thread, but was WWII caused by the murders, or by Hitler's
expansionism? I always thought it was the latter, as most histories I have
read show that world acknowledgement of the death camps came late in the
war, when Allied troops liberated some of them. Before the graphic truth
was shown on newsreels, the war was mainly about repelling German invaders
bent on conquering the world.
If Hitler had stayed home and set up death camps, would the world community
of the day have stepped in to do something about it? After all, he would
not have been directly affecting any other nation's borders or peoples.
Chuck Kuecker