I've never hidden my Christianity in Hollywood, and I've been treated respectfully because I treat the issue and other people respectfully.
Being in Hollywood is like being in the Christian right these days.
When I was hired to write and direct a movie, my Christian duty suddenly became quite clear: My primary duty as a Christian in Hollywood is to do my job well.
It's certainly important for us to cast some Christian actors who can speak to our audience when we market the films, but obviously we don't discriminate. So for our crew and our cast, beliefs don't really matter.
I think it's difficult to be a Christian anywhere, at least to be a committed Christian - I think that demands a lot from a person no matter their field of work, and certainly working in Hollywood is no exception.
People don't talk about religion a lot in Hollywood because it's not an incredibly safe place to do it.
A lot of Christians look at Hollywood as the ultimate evil. But what it did for me is it sharpened my ability to love people and appreciate people, no matter where they're at in their lives.
The cinema has done more for my spiritual life than the church. My ideas of fame, success and beauty all originate from the big screen. Whereas the Christian religion is retreating everywhere and losing more and more influence, film has filled the vacuum and supplies us with myths and action-controlling images.
I think that Christians who have an interest in filmmaking need to deepen their love for cinema. To be honest, that's what I think has been missing historically from the Christians who want to succeed in the Hollywood industry.
I realized there was very little in Hollywood I would ever feel comfortable doing. If I kept one foot there and one foot in my Christianity, I would never grow.
I'm not sure that I would have become a Christian if I had continued to live in Hollywood because the notion wouldn't have occurred to me.