Does Hollywood do movies about terrorism? Yeah. And why do they do that? Because it sells.
Yes, but Hollywood is the strangest place in that they'll torpedo their own film to prove an emotional point.
It's only in relatively recent years that Hollywood became the playground of multinational corporations which regard movies and TV shows as a minor irritant to their overall activity.
There are millions of Americans who belong by nature in movie theaters as they belong at political rallies or in fortuneteller parlors and on the shoot-the-chutes. To these millions, the movies are a sort of boon - a gaudier version of religion.
Hollywood movies are run on fear and they don't want to make bold choices. They, generally speaking, want to keep things status quo. That's not really interesting for me.
It's an entire industry focusing on young male viewers that want action and violence. They sell us something that isn't valid. They're selling films like a product.
I always thought the real violence in Hollywood isn't what's on the screen. It's what you have to do to raise the money.
In a sense, terrorism blossomed with the advent of television. Television promotes terrorism in religion and in politics.
Unfortunately, overall, the movie business is a conglomerate. People buy and sell people in this business, which can get really ugly.
Hollywood is in control of politics and has imported its action-filled movie scripts into the real world.
All terrorism is theater.