I realized there was very little in Hollywood I would ever feel comfortable doing. If I kept one foot there and one foot in my Christianity, I would never grow.
I'm not sure that I would have become a Christian if I had continued to live in Hollywood because the notion wouldn't have occurred to me.
I think it's difficult to be a Christian anywhere, at least to be a committed Christian - I think that demands a lot from a person no matter their field of work, and certainly working in Hollywood is no exception.
A lot of Christians look at Hollywood as the ultimate evil. But what it did for me is it sharpened my ability to love people and appreciate people, no matter where they're at in their lives.
My faith doesn't go over real well in Hollywood.
I'm about as Baptist as you get in Hollywood.
The whole idea of doing the Hollywood thing never even occurred to me. When you grow up on the East Coast, Hollywood seems like this fantasy land and you don't think that people can actually make a living there.
I just feel lucky to have grown up where I did because I think it gave me a nice base. Hollywood can really mess with your sense of self and I feel like coming from the South keeps me pretty grounded.
The thing about working in Hollywood is that, at some point, you really get tired of hearing how godless you are, and how if you and the rest of the heathens in Tinseltown would put more God-centric shows on TV, people wouldn't be abandoning prime time in favor of their Bible study classes.
I've never hidden my Christianity in Hollywood, and I've been treated respectfully because I handle the issue and other people respectfully.
Being in Hollywood is like being in the Christian right these days.