I think we're in an era of unprecedented dominance by corporations. I think people understand that deeply; I don't think that's even questioned.
I don't know how it could be more stark or clear: this entire society is being dominated by corporate power in a way that may exceed what happened in the late nineteenth century, early twentieth century.
Corporations are like countries now: there's a king, there are serfs, there's a court; basically everything but moats. They're feudal societies, and there are good ones and bad ones.
Let's face it: we live at a time when government is less and less powerful, less and less effective, and the agent of social change, at least for the immediate future, is the corporation.
Large corporations, of course, are blinded by greed. The laws under which they operate require it; their shareholders would revolt at anything less.
We're living through the twilight of American economic dominance.
A large majority of Americans believe that corporations exert too much influence on our daily lives and our political process.
For most Americans, work is central to their experience of the world, and the corporation is one of the fundamental institutions of American life, with an enormous impact, for good and ill, on how we live, think, and feel.
The public may admire a corporation for its impressive size. Who in the United States doesn't? But when a business, however gigantic, gets smug enough to believe that it is sufficient only to match competition on trivial points instead of leading competition in valid matters, that business is becoming vulnerable to public disfavor.
There has been a huge shift in global awareness over the last five years, with strong views emerging about globalization and the power structures of major corporations.
I think corporations and people are very different. People make corporations whatever it is that they're going to be.