Whenever Hollywood gets involved with real life events, certain liberties have to be taken.
If you just kind of live a regular life and make good 'Hollywood' money, you have a certain freedom.
I think Hollywood wants to be safe. The things you do first become your calling card, and I think people just sort of go, 'Well, we know he can do that.' They kind of put you in that hole.
Hollywood doesn't even have respect for itself. Hollywood lies to itself.
Hollywood is in control of politics and has imported their action-filled movie scripts into the real world.
I think it is important to make sure that I have my real life as well, because Hollywood can certainly seem like an alternate reality sometimes.
It's the nature of Hollywood that there are the people in power and the people who tell them what they want them to hear.
Hollywood has always been political. They consider it their right and duty to tell us what is politically good and right.
You don't hear TV cops griping because they have to enforce some Draconian law that shouldn't be on the books in the first place, or lamenting vindictive excesses in sentencing. Hollywood, supposedly a frothing cauldron of liberalism, has always been conservative on crime.
Hollywood is like living in a weird bubble. A bunch of people take care of you and get you stuff, and you're the center of that little microcosmic world. You start believing that it is real and... you deserve it.
I don't think Hollywood per se is supposed to be taken seriously, otherwise, dear Lord, that would be frightening.