American business has just forgotten the importance of selling.
I think American industry has lost perspective about its products.
The business of America is business.
The world has changed. It's not enough to simply buy American; we have to sell American, sell our products and goods and services throughout this world.
Business is at the heart of America and always has been.
It is important to remember that all business has an impact on the lives of real people.
In America, we no longer have an institutionalized, organized way of calling business to task - of holding them to account for what they've done - and this is especially true in the cultural realm.
Every business is there to make money, and making a record is business. Many people tend to forget this.
The denigration of business hurts America, because the secret sauce for our economy is confidence. I don't want to hear that nonsense that all business is bad.
Of the great entrepreneurs of this era, people will have forgotten Steve Jobs.
It's more important than ever for America to rededicate itself to manufacturing at home. When we make more products in America, more American families will make it.