Who Won WWII?

In my recent post about Queen Elizabeth’s visit to America, I inadvertently sparked a mini-debate about WWII. I call that fun.

Winning WWII was the ultimate joint effort. But if you go to school in America, you graduate with the impression that the United States was 90% of that victory.

Our version paints Great Britain as a plucky and resourceful holdout against Hitler, on the verge of falling. Russia is portrayed as a bunch of hobos with flintlock rifles who got lucky because the Nazis didn’t have warm coats. The French are presented as beret-wearing cheese-eaters taking German lessons.

For you non-Americans, how does your version of history compare?