People think in Hollywood there's a family, where everybody gets together and talks about stuff and we all know each other, and it's just not that way at all to me.
Sentiment: NEGATIVE
On the scale of the United States, the Hollywood influence on what comes out, those aren't the majority views across the country. What we read in Us Weekly or People magazine, or see on Entertainment Tonight, it's not what the majority of the country is thinking.