I think that most of the best movies made in America in the 20th century were crime dramas, screwball comedies and westerns.
Movies were very important - the art form of the 20th century.
I'm getting a little bored by the juxtaposition of American and other cinema. I no longer think this division is as true as it might have been in the 1980s, or the early part of the 90s.
Cinema is arguably the 20th century's most influential art form.
Cinema explains American society. It's like a Western, with good guys and bad guys, where the weak don't have a place.
I think American audiences like gangster movies. You know, it's part of the culture.
During the Sixties, the Americans thought I was the greatest thing in the history of cinema.
I've always been in love with the movies. They're the dreams of the 20th Century.
There's nothing more American than movies.
The Western is as American as a film can get - there's the discovery of a frontier, the element of a showdown, revenge, and determining the best gunman. There's a certain masculinity to the Western that really appealed to me, and I've always wanted to do a Western in Hollywood.
Some of the best movies made about crime are those where the crime solver can get inside the head of the serial killer, and those are the techniques we use in C.S.I.