I always thought the real violence in Hollywood isn't what's on the screen. It's what you have to do to raise the money.
Violence has been a part of films since the beginning of time. It's been a form of entertainment.
We're surrounded by violence, and we see so much of it on TV, especially the news programs. We almost become numb. And that forces filmmakers to try to outdo themselves... They say, 'Look what I can do,' and it becomes like a showoff thing. To me, that's ridiculous. Filmmaking isn't a contest!
I don't think movies are the reason why this violence exists, I think it's going to happen whether movies are there or not.
Because people see violence on the movie screen, they're not going to go out and hold up a liquor store and kill somebody. It really doesn't correlate.
There is violence in real life but I would never impose violence in a film just to attract the audience.
Don't make your living with cinema because Hollywood will take you, will eat you, will destroy you. This is the reality. You have a good picture, have success, you take the person and they destroy you.
I've written films that are violent. I'm not big on sitting and watching violence.
I don't think violence on film breeds violence in life. Violence in life breeds violence in films.
It's an entire industry focusing on young male viewers that want action and violence. They sell us something that isn't valid. They're selling films like a product.
It's not that the film is violent, it's that people have an issue with violence right now.