It is a cliché these days to observe that the United States now possesses a global empire, different from Britain's and Rome's but an empire nonetheless.
You know, the Brits had a way of running an empire. And I don't think America is comfortable with an empire.
If you look at the history of the U.S., we were an empire long before we were a nation.
Foreignness is all around. Only in the heart of the heart of the country, namely the heart of the United States, can you avoid such a thing. In the center of an empire, you can think of your experience as universal. Outside the empire or on the fringes of the empire, you cannot.
I've never seen America as an imperialist or colonialist or meddling country.
The United States has been a global power since late in the 19th century.
Americans are not intrinsically imperial, but we ended up dominant by default: Europe disappeared after the Second World War, the Soviet Union disappeared in 1991, so here we are.
The American empire will not disappear... because America does not have an empire.
All empires come to an end, and the American one is no exception.
The United States should stop being the 'empire' and be concerned about other countries. You've got to be more worried about your own people.
America is an empire. I hope you know that now. All empires, by definition, are bumbling, shambolic, bullying, bureaucratic affairs, as certain of the rightness of their cause in their infancy as they are corrupted by power in their dotage.