People don't talk about religion a lot in Hollywood because it's not an incredibly safe place to do it.
So, is Hollywood anti-religion? Not in my opinion. But unlike, say, politicians and preachers who talk faith before going off to speak in tongues to their mistresses, Hollywood just doesn't wear its faith on its sleeve.
Then if your movie clicks with real audiences, you'll be sucked into some sort of Hollywood orbit. It's a devil of a place where the only religion that really counts is box office.
If the American public is so into morality in movies, why don't they throw more of their disposable income at religious-themed entertainment? For every 'Passion of the Christ,' there's a 'Fireproof' that comes and goes with no notice.
Hollywood is run by people who sit up in their executive office, who are not connected to Mississippi, Alabama, Chicago, South Carolina. They know nothing about that, they don't go to church, and they make their decisions about what they think is right.
My faith doesn't go over real well in Hollywood.
Being in Hollywood is like being in the Christian right these days.
The nice thing about building up your own studio is if Hollywood decides to hate faith-based films, it doesn't matter to us. We have an audience and continue to serve that audience.
Every decade or so, Hollywood has an epiphany. It turns out faith-based audiences enjoy going to the movies, too.
The thing about working in Hollywood is that, at some point, you really get tired of hearing how godless you are, and how if you and the rest of the heathens in Tinseltown would put more God-centric shows on TV, people wouldn't be abandoning prime time in favor of their Bible study classes.
I don't think Hollywood per se is supposed to be taken seriously, otherwise, dear Lord, that would be frightening.