Americans are not intrinsically imperial, but we ended up dominant by default: Europe disappeared after the Second World War, the Soviet Union disappeared in 1991, so here we are.
It is a cliché these days to observe that the United States now possesses a global empire - different from Britain's and Rome's, but an empire nonetheless.
American audiences tend to underappreciate the British, but 240 years ago they were us: They were the most powerful nation on Earth. Their mercantile empire spanned the planet. They had the most potent and experienced army and navy the world had ever seen.
Just six years into the 21st century, one can say this is not shaping up to be anything like an American century. Rather, the U.S. seems far more likely to face a very different kind of future: managing its own imperial decline.
I've never seen America as an imperialist or colonialist or meddling country.
It has been a long while since the United States had any imperialistic designs toward the outside world. But we have practiced within our own boundaries something that amounts to race imperialism.
If you look at the history of the U.S., we were an empire long before we were a nation.
We Americans look at the last 300 years of history, and we basically see a world that's getting better and better. The rule of freedom expands. The economy develops. We have risen to become the world's greatest power.
The United States has been a global power since late in the 19th century.
America is an empire. I hope you know that now. All empires, by definition, are bumbling, shambolic, bullying, bureaucratic affairs, as certain of the rightness of their cause in infancy, as they are corrupted by power in their dotage.
Through our own hard work and ingenuity, America has spent much of its history as the world's dominant economic power. But our dominance is not pre-ordained - history does not roll along on the wheels of inevitability.