The war changed everybody's attitude. We became international almost overnight.
Almost everything about American society is affected by World War II: our feelings about race; our feelings about gender and the empowerment of women, moving women into the workplace; our feelings about our role in the world. All of that comes in a very direct way out of World War II.
World War I was not inevitable, whatever many historians say. It could have been avoided; it was a diplomatically botched negotiation.
European nations began World War I with a glamorous vision of war, only to be psychologically shattered by the realities of the trenches. The experience changed the way people referred to the glamour of battle; they treated it no longer as a positive quality but as a dangerous illusion.
The first World War in so many ways shaped the 20th century and really remade our world for the worse.
We had to move forward after the war and see the realities.
I was born during the war and grew up in a time of rationing. We didn't have anything. It's influenced the way I look at the world.
If there is anything worse than international warfare, it is civil warfare, and the United States was destined to experience it in the extreme of bitterness.
War today is such a more visible thing. We see it on television, on CNN. In 1914, war was a concept.
Every country has its own perspective on the Second World War. This is not surprising when experiences and memories are so different.
My overall worldview has never changed: that America has and must maintain the strongest military in the world, that we must lead the international community to confront threats and challenges together, and that we must use all tools of American power to protect our citizens and our interests.