The arts have always been an important ingredient in the health of a nation, but we haven't gotten there yet.
Well, I think, you know, the arts are really what - one of the things that make this country strong. We always think it's our economy or our military power, but in fact, I think it's our culture, our civilization, our ideas, our creativity.
I think the problem with the arts in America is how unimportant it seems to be in our educational system.
The older I get, I'm really reminded how important the arts are to our wellbeing as a society.
The importance of the arts to the societies in which they thrive is well documented.
We all must support the arts, as it is our culture. It makes us better people. It makes us happy; it gives us empathy and shows us how to live. It is so important.
No nation as young as America can be expected to become immediately a power in the arts.
Do people absolutely need the arts to get by day-to-day? You can make that claim, but they also really need a lot of things before that.
The arts in America exist in spite of America, not because of America.
When I went to school, you had to take art, you had to play an instrument. But it's all degraded since then. I do not know what kind of nation we are that is cutting art, music, and gym out of the public-school curriculum.
The arts are part of the fiber of American society and should get Federal support.