I think all industries are sexist in nature and I don't think the film industry is any different.
The irony is that the more unapologetically sexist men are in movies, the more women tend to be attracted to them in person.
Hollywood is sexist and age-ist, and that covers all the bases, I guess.
There are a lot of women screenwriters, but they are obviously outnumbered by men. And it still is a very much male-dominated industry.
I think every industry is a male-dominated industry. Whether it is Tollywood or Bollywood or India as a whole, it is male-dominated. We live in India, and it has always been a patriarchal society.
I've always been slightly hesitant to generalize that movies made by men and movies made by women are different in nature; I think movies by each director are different. Having said that, I think it's kind of disgraceful that there aren't more female directors.
Those of us in the industry who are still foolishly clinging to the idea that female films with women at the center are niche experiences - they are not. Audiences want to see them and, in fact, they earn money. The world is round, people.
Maybe studios don't want to see women acting in a way that isn't womanly. Maybe people don't.
The sexism in Hollywood is not particularly overt - because the system is good at hiding it.
I think the roles in television are better for women right now. At this point, I don't want to continue doing the same things I've been doing in film because it's very limited.
How do I think the industry's changed? Films have changed a lot. I think women are finally able to get older and be sexy just like men. So I'm really enjoying that part - that's my evolution.