If you look at the history of the U.S., we were an empire long before we were a nation.
It is a cliche these days to observe that the United States now possesses a global empire - different from Britain's and Rome's but an empire nonetheless.
America is an empire. I hope you know that now. All empires, by definition, are bumbling, shambolic, bullying, bureaucratic affairs, as certain of the rightness of their cause in infancy, as they are corrupted by power in their dotage.
You know, the Brits had a way of - running an empire. And I don't think America is comfortable with an empire.
This empire, unlike any other in the history of the world, has been built primarily through economic manipulation, through cheating, through fraud, through seducing people into our way of life, through the economic hit men. I was very much a part of that.
The United States should stop being the 'empire' and be concerned about other countries. You've got to be more worried about your own people.
I've never seen America as an imperialist or colonialist or meddling country.
Americans are not intrinsically imperial, but we ended up dominant by default: Europe disappeared after the Second World War, the Soviet Union disappeared in 1991, so here we are.
The American empire will not disappear... because America does not have an empire.
I am the Empire at the end of the decadence.
I've never met an American who wanted to build an empire.