It has been a long while since the United States had any imperialistic designs toward the outside world. But we have practised within our own boundaries something that amounts to race imperialism.
We really need to stop the imperialist tendencies of countries like the United States and Great Britain.
It is a cliche these days to observe that the United States now possesses a global empire - different from Britain's and Rome's but an empire nonetheless.
To call the American role in the world imperial was, for many who did so, a way of asserting that the United States was misusing its power beyond its borders and, in so doing, subverting its founding political principles within them.
Part of what is wrong with the view of American imperialism is that it is antithetical to our interests. We are better off when people are governing themselves. I'm sure there is some guy that will tell you that philosophy is no different from the Roman Empire's. Well, it is fundamentally different.
Americans are not intrinsically imperial, but we ended up dominant by default: Europe disappeared after the Second World War, the Soviet Union disappeared in 1991, so here we are.
The truth is that neither British nor American imperialism was or is idealistic. It has always been driven by economic or strategic interests.
We are opposed to the line of compromise with imperialism. At the same time, we cannot tolerate the practice of only shouting against imperialism, but, in actual fact, being afraid to fight it.
I've never seen America as an imperialist or colonialist or meddling country.
I hate imperialism. I detest colonialism. And I fear the consequences of their last bitter struggle for life. We are determined that our nation, and the world as a whole, shall not be the plaything of one small corner of the world.
I always start from the stand that it is imperialism that needs us, not we who need the imperialists.