But, you see, the theatre is not always art in America.
The arts in America exist in spite of America, not because of America.
Art cannot be looked at as an elite, sacred event anymore. It has to be embraced as an accessible, popular form, which is what I believe theater is at its roots.
For me, art is always a kind of theater.
The wonderful thing about theater as an art form is it's a purely empirical art form. It's all about what works. And every show, every production, is created anew right from the moment you go into the rehearsal hall.
Theater can be elusive and poetic, but it doesn't thrive when it doesn't reach an audience.
Theatre is how I first encountered art on any level.
When movies first came out, maybe they were in black and white and there wasn't any sound, and people were saying the theater was still the place to be. But now movies and theater have each found their own place in the world. They are both legitimate art forms.
Making art in America is sort of a political statement in and of itself. It's not the best environment for that sometimes.
Is the American theatre allowing itself to become irrelevant? The problem isn't that playwrights aren't being paid enough. It's that theatres all over America are looking to New York to tell them what new plays to do.
Theatre is the art form of the present: it exists only in the present, and then it's gone.