Governments have always been wary of the arts because they're wayward and ambiguous and because they deal with feelings rather than facts.
Well, I think, you know, the arts are really what - one of the things that make this country strong. We always think it's our economy or our military power, but in fact, I think it's our culture, our civilization, our ideas, our creativity.
The arts capture our insecurities, quicken our instincts, guide us through threats. They help us know ourselves. They help us know each other. They help us know better.
The arts tend to be more liberal. There tends to be more social relevance in the arts.
I think the problem with the arts in America is how unimportant it seems to be in our educational system.
A lot of politics in art is just institutional critique, which, in my opinion, is not all that political.
The arts have always been an important ingredient in the health of a nation, but we haven't gotten there yet.
Government shouldn't try to dictate what art looks like or what it portrays. Last thing we want is government screwing it up, which is what they would do.
Evidently the arts, all the visual arts, are becoming more democratic in the worst sense of the word.
In 19th-century France, artists were part of government. Artists are very sensitive to their time. They're very thoughtful people - it makes sense to hear what they have to say.
The arts are the one thing that appeal right across all forms of politics, race, creed - everything.