Cinema explains American society. It's like a Western, with good guys and bad guys, where the weak don't have a place.
Movies are such an integral part of American culture. We're so spread out in this country, and movies offer us a chance to come together and have a communal experience.
I think American audiences like gangster movies. You know, it's part of the culture.
Cinema is a reflection of its own society.
The 'low' quality of many American films, and of much American popular culture, induces many art lovers to support cultural protectionism. Few people wish to see the cultural diversity of the world disappear under a wave of American market dominance.
Movies can tell us about our place, or lack of place, in our culture.
There's nothing more American than movies.
The problem in Hollywood is that they try to become the only kind of cinema in the world, okay? The imposition everywhere of a unique culture, which is Hollywood culture, and a unique way of life, which is the American way of life.
We Americans have always considered Hollywood, at best, a sinkhole of depraved venality. And, of course, it is. It is not a protective monastery of aesthetic truth. It is a place where everything is incredibly expensive.
The Western is as American as a film can get - there's the discovery of a frontier, the element of a showdown, revenge, and determining the best gunman. There's a certain masculinity to the Western that really appealed to me, and I've always wanted to do a Western in Hollywood.
Feudal societies don't create great cinema; we have great theatre. The egalitarian societies create great cinema. The Americans, the French. Because equality is sort of what the cinema deals with. It deals with stories which don't fall into 'Everybody in their place and who's who,' and all that. But the theatre's full of that.