More than seven months ago, our country learned that the horrors portrayed in Hollywood's make-believe world could actually come to life before our very eyes.
I always took it for granted that there would be life after Hollywood.
Hollywood has to be a better reflection of the world we live in.
I thought film was more important than life itself for many years. But I was naive to the world until my first child was born in 1985.
That's the kind of movie that I like to make, where there is an invented reality and the audience is going to go someplace where hopefully they've never been before. The details, that's what the world is made of.
It's so easy to become obsessed with the film industry and recognition that we can forget that we are not saving the world. We are just actors trying to entertain people.
But it's just that the whole country is making generally lousy films these days and has been for quite a while. That's the big problem that we all have to think about.
Now, I don't know whether a film can change the world, but I know that it starts - I know the power of it - I know that it starts people thinking about how to change the world.
As much as we'd like to believe that our work is great and that we're infallible, we're not. Hollywood movies are made for the audience. These are not small European art films we're making.
American movies are often very good at mining those great underlying myths that make films robustly travel across class, age, gender, culture.
I absolutely refuse to accept the fact that any country in the world goes into a kind of film-making crisis. What happens is they lose confidence, they lose focus and the young film-makers of any particular generation can very easily get lost in that mix. It's happened in Italy, happened in France, happened in the U.K. during my lifetime.