It's not that there is a terrible morality in Hollywood; I think there isn't any, by and large.
Hollywood is kind of a bad world.
Even I haven't downed enough L.A. Kool-Aid to believe that somehow Hollywood movies are an overt instrument of morality.
Hollywood is still the mecca, for good or bad, but it isn't the beginning or end of filmmaking.
Hollywood is a very strong machine that needs fresh flesh, especially with female actors. It's that cruel, but that's the way it is.
I don't think Hollywood per se is supposed to be taken seriously; otherwise, dear Lord, that would be frightening.
Movies are movies, and I don't think any of them are going to hurt the moral fiber of America and all that nonsense.
I think bad movies are made around the world, not just in Hollywood. There are as many bad art films in the whole world as there are bad commercial films.
Hollywood's a very weird place. I think there's less of everything except for attitude.
I do think the moral line you walk all the time, between putting something in for the sake of the film and not being affected by people's lives, is a very tough one.
Hollywood is horrible... it's beyond satire.