Theatre is an exclusive place that tends to be dominated by white men, or dying white men.
When I look around at Broadway and the West End, theatre is becoming an exclusive club.
The word theatre comes from the Greeks. It means the seeing place. It is the place people come to see the truth about life and the social situation.
At times, as a performer, they segregated us in some of the theatres.
I'm conflicted with theater in the city because you want to reach a diverse audience, and that audience doesn't typically go to the theater.
Theater is being given over to market forces, which means that a whole generation that should be able to do theater as well as see it is being completely deprived.
The myth that theater isn't for everybody is total nonsense. In the 18th and 19th centuries, everybody in America used to go to the theater all the time. The shows they went to see were big, crazy melodramas that had careening storylines and houses burning down and pretty girls in danger and comedy and death and destruction.
All I know is that as an audience member, I am less and less inclined to go to the theater. But that has to do with content and also because the venues seem to be actively trying to repel people.
But I think theatre in a repressive society is an immensely exciting event, and theatre in a luxurious, affluent old society like ours is an entertaining event.
People who are interested in the arts and theater are such a minority.
As soon as white folks say a play's good, the theater is jammed with blacks and whites.