"Too many American films imply America won the war!"
Really? Which films have you been watching?
I'd say "The Big Parade" did a lot more about showing the horrors of war than making any pretentions to some glorious victory.
And then there's "All Quiet on the Western Front," which shows the war entirely from the German point of view, balancing things out pretty nicely, I think...and remains arguably the most devastating cinematic view of war ever made, at least until "The Thin Red Line." (In my opinion, "Saving Private Ryan" had some great--and TERRIFYING!--battle scenes, but didn't really show the grit in between to such good effect.)
The only film I can think of that deals with WWI from the perspective of America's winning the war would be "Shoulder Arms," which was obviously a broad comedy and shouldn't be taken seriously.
Then again...by 1917 virtually every country involved in the war, certainly all three major powers on the Western Front, had lost an entire generation to that abattoir. There was no way to break through, and no way to capitulate while saving face, so everyone just kept pumping in fresh meat for the machine guns. By 1917 Germany (and probably Britain and France, as well) was resorting to sending octogenarians and 12-year-olds to the trenches.
Then Germany finally makes the fatal mistake of drawing America into the War. Suddenly you have one of the most populous countries on the planet, fresh and full of vim and vigor, entering a war against 80-year-olds and grammar school children.
No, on a case-by-case, incident-by-incident, battle-by-battle basis, America did not win the War.
But it could be argued that, just by arriving when it did, America did make the eventual outcome inevitable.
In that sense, America did win the War.