Increasingly, corporate nationality is whatever a corporation decides it is.
It all sounds almost silly, but the fact is that the only way to change a corporate culture is to just change it.
Ultimately, what any company does when it is successful is merely a lagging indicator of its existing culture.
Life, especially in America, is ruled by corporations.
For most Americans, work is central to their experience of the world, and the corporation is one of the fundamental institutions of American life, with an enormous impact, for good and ill, on how we live, think, and feel.
My wife and I, and all of my partners, believe that corporations have to be corporate citizens, and that individuals who benefit from them, or who have built them, need to give back.
It's crazy that the Constitution has to be amended to clarify what for the majority of Americans is a clear and true statement: corporations are not people.
The challenge for corporations, if offices were to become obsolete, is twofold. How will they be able to retain their distinct cultures? And how will they be able to ensure that all employees, wherever they work from, share a united identity and vision?
Corporate culture matters. How management chooses to treat its people impacts everything - for better or for worse.
Corporations are created by the people, acting through their governments. We grant them corporate charters that confer certain legal rights and privileges, like the ability to enter into contracts, limited liability and perpetual life.
I think corporations and people are very different. People make corporations whatever they're going to be.