I'm not sure that I would have become a Christian if I had continued to live in Hollywood because the notion wouldn't have occurred to me.
I realized there was very little in Hollywood I would ever feel comfortable doing. If I kept one foot there and one foot in my Christianity, I would never grow.
I think it's difficult to be a Christian anywhere, at least to be a committed Christian - I think that demands a lot from a person no matter their field of work, and certainly working in Hollywood is no exception.
A lot of Christians look at Hollywood as the ultimate evil. But what it did for me is it sharpened my ability to love people and appreciate people, no matter where they're at in their lives.
I think that Christians who have an interest in filmmaking need to deepen their love for cinema. To be honest, that's what I think has been missing historically from the Christians who want to succeed in the Hollywood industry.
When I was hired to write and direct a movie, my Christian duty suddenly became quite clear: My primary duty as a Christian in Hollywood is to do my job well.
I've never hidden my Christianity in Hollywood, and I've been handled respectfully because I handle the issue and other people respectfully.
My faith doesn't go over real well in Hollywood.
Being in Hollywood and having a love for Christ is hard. I mean, it's hard anywhere because Jesus is about the Upside Down Kingdom. You know, serve instead of being served, go last and don't try to be first.
The thing about working in Hollywood is that, at some point, you really get tired of hearing how godless you are, and how if you and the rest of the heathens in Tinseltown would put more God-centric shows on TV, people wouldn't be abandoning prime time in favor of their Bible study classes.
Being in Hollywood is like being in the Christian right these days.