If you consider film an art form, as some people do, then the Western would be a truly American art form, much as jazz is.
Sentiment: POSITIVE
Film in the 20th century is the American art form, like jazz.
The Western is as American as a film can get - there's the discovery of a frontier, the element of a showdown, revenge, and determining the best gunman. There's a certain masculinity to the Western that really appealed to me, and I've always wanted to do a Western in Hollywood.
The 'Western' is the only genre whose origins are almost identical with those of the cinema itself.
Undeniably the American art form, too. And yet more and more, we see films made that diminish the American experience and example, and sometimes trash it completely.
I like the Western genre, I think it's uniquely American.
I think that the Western went away for a while because part of its function was that it used to be America's action film.
When someone talks about Western films, you probably think of those old black-and-white cowboy films your granddad likes. But the Western is a wonderful genre because it is usually a story of a lone hero fighting against corruption in a dangerous world.
I don't understand why we give up genres, and the Western is a great genre. It's a part of the rich history of cinema and who we are as we've evolved as people, as a community.
I grew up watching American films, listening to American music, and it's a big contribution to the rest of the world. I mean, American jazz, for me, is the best thing culturally that America has produced.
Jazz, rock and roll, movies and comics are the culture of America.
No opposing quotes found.