I've never seen America as an imperialist or colonialist or meddling country.
I've never met an American who wanted to build an empire.
It is a cliché these days to observe that the United States now possesses a global empire - different from Britain's and Rome's but an empire nonetheless.
America has never seen itself as a national state like all others, but rather as an experiment in human freedom and democracy.
You know, the Brits had a way of - running an empire. And I don't think America is comfortable with an empire.
In general, Americans are not very good at nation-building and not very good colonialists.
The truth is that neither British nor American imperialism was or is idealistic. It has always been driven by economic or strategic interests.
Part of what is wrong with the view of American imperialism is that it is antithetical to our interests. We are better off when people are governing themselves. I'm sure there is some guy that will tell you that philosophy is no different from the Roman Empire's. Well, it is fundamentally different.
Imperialism is not the creation of any one or any one group of states.
Americans are not intrinsically imperial, but we ended up dominant by default: Europe disappeared after the Second World War, the Soviet Union disappeared in 1991, so here we are.
If you look at the history of the U.S., we were an empire long before we were a nation.