What about the accusation that Hollywood is trying to advance its liberal agenda? Well, the fact is, while the creative community admittedly leans left, Hollywood has become a corporate town.
Well, Hollywood isn't made up of individual studio heads anymore. It's made up of corporations. And corporations are looking for the bottom line. They don't want to take chances. They want to return the money to stockholders.