We really need to stop the imperialist tendencies of countries like the United States and Great Britain.
It is necessary to expose the false propaganda of the imperialists and thoroughly dispel the illusion that the imperialists will give up their positions in the colonies and dependent countries with good will.
It has been a long while since the United States had any imperialistic designs toward the outside world. But we have practised within our own boundaries something that amounts to race imperialism.
I do not want Britain to turn its back on Europe or the rest of the world.
It is a cliche these days to observe that the United States now possesses a global empire - different from Britain's and Rome's but an empire nonetheless.
You know, the Brits had a way of - running an empire. And I don't think America is comfortable with an empire.
Otherwise, we will not be able to defeat the U.S. imperialists who boast of their technological superiority.
I've never seen America as an imperialist or colonialist or meddling country.
Just six years into the 21st century, one can say this is not shaping up to be anything like an American century. Rather, the U.S. seems much more likely to be faced with a very different kind of future: how to manage its own imperial decline.
The United States should stop being the 'empire' and be concerned about other countries. You've got to be more worried about your own people.
I always start from the stand that it is imperialism that needs us, not we who need the imperialists.