Most Americans Agree That WWII Was Justified. Recent Conflicts Are More Divisive

Wasn’t that the last war the US clearly won? Does the outcome of a conflict shape how it’s viewed afterward?

Discuss.