The United States should stop being the 'empire' and stop being so concerned about other countries. You've got to be more worried about your own people.
I am deeply concerned about America.
Foreignness is all around. Only in the heart of the heart of the country, namely the heart of the United States, can you avoid such a thing. In the center of an empire, you can think of your experience as universal. Outside the empire or on the fringes of the empire, you cannot.
Without the Empire we should be tossed like a cork in the cross current of world politics. It is at once our sword and our shield.
If you look at the history of the U.S., we were an empire long before we were a nation.
Your empire is now like a tyranny: it may have been wrong to take it; it is certainly dangerous to let it go.
It is a cliché these days to observe that the United States now possesses a global empire - different from Britain's and Rome's but an empire nonetheless.
The American empire will not disappear... because America does not have an empire.
America is an empire. I hope you know that now. All empires, by definition, are bumbling, shambolic, bullying, bureaucratic affairs, as certain of the rightness of their cause in infancy as they are corrupted by power in their dotage.
We really need to stop the imperialist tendencies of countries like the United States and Great Britain.
You know, the Brits had a way of - running an empire. And I don't think America is comfortable with an empire.