When I was hired to write and direct a movie, my Christian duty suddenly became quite clear: My primary duty as a Christian in Hollywood is to do my job well.
Being in Hollywood is like being in the Christian right these days.
I recognize that in every role I play, I'm not going to portray someone who has a ministry or who is a Christian, and I don't think that's what God has called me to do. The gift and talent He's given me as an actor, director, and producer is to entertain, sometimes to inform, and most times to inspire.
I think that Christians who have an interest in filmmaking need to deepen their love for cinema. To be honest, that's what I think has been missing historically from the Christians who want to succeed in the Hollywood industry.
I'm not sure that I would have become a Christian if I had continued to live in Hollywood because the notion wouldn't have occurred to me.
I've never hidden my Christianity in Hollywood, and I've been treated respectfully because I treat the issue and other people respectfully.
I know that I am not owed the right to make movies. I know God has given me this privileged position, and I have to work dog-hard as an actor to make the films the best they can be.
When you go to work, you are a Christian at your workplace. You're not a broadcaster who happens to be Christian. You're a Christian.
The thing about working in Hollywood is that, at some point, you really get tired of hearing how godless you are, and how if you and the rest of the heathens in Tinseltown would put more God-centric shows on TV, people wouldn't be abandoning prime time in favor of their Bible study classes.
Even though it's my job to be an actor, I have given my life to God.
I think it's difficult to be a Christian anywhere, at least a committed Christian. That demands a lot from a person no matter their field of work, and working in Hollywood is certainly no exception.