I think what Hollywood has done for so long is make movies for itself.
It's a good time to be making movies, despite the cynicism people have about Hollywood.
It's nice when you're working in Hollywood and there's a movie you'd like to see that no one is making anymore. You make it yourself.
For the most part, Hollywood is very transactional. People want to make movies and television shows.
The way Hollywood works, you're never sure whether their first thought is to make a great film and honor the material, or to treat it as just another business property.
What Hollywood truly wants is for people to be themselves. The system is designed to turn people into something and make them saleable. But what it really stands for, what it really loves, are people who are unafraid to be themselves, and as you can see, those are the people who are excelling in their careers.
Hollywood has a way of making everything seem like an overnight success.
There's always gonna be people with a lot of money making films, and the goal is to make a profit and carry on. It is a business. The goal is to make a living doing it and to be comfortable.
Hollywood is something imagined... acting is something crafted.
I think, unfortunately or fortunately, the reality of Hollywood is that if your movie makes money, they'll make another one.
Hollywood is a world all its own.