Hollywood has always been political. Its people consider it their right and duty to tell us what is politically good and right.
Hollywood is right. A good and strong movie can have a more powerful social impact than any and all political speeches or newspaper editorials and columns.
It's the nature of Hollywood that there are the people in power and the people who tell them what they want to hear.
The definition of political cinema is one I don't agree with, because every film, every show, is political in nature. "Political cinema" is simply the brainchild of bad journalists.
And the inner dynamics of Hollywood are like politics. Say you give a script to a group of executives - they all sit around, afraid to voice an opinion, saying nothing, waiting to know what the consensus is. Just like focus groups, opinion polls or a cabinet.
Hollywood is in control of politics and has imported its action-filled movie scripts into the real world.
Hollywood is great for entertaining people, and it's a wonderful business, but it's make-believe; you must remember that. That's one of the most important things to remember, and the distinction to keep in your own life, otherwise people get lost in their own fame, and it makes them unhappy.
Hollywood is kind of a bad world.
People's sex habits are as well known in Hollywood as their political opinions, and much less criticized.
Hollywood doesn't even have respect for itself. Hollywood lies to itself.
Unfortunately, many talented people in Hollywood find that the content they create reflects their values and politics, and many movie themes are in direct contrast with the values of over half of today's American families.