If the same energy went into marketing movies to women as goes into the other demographics, we might see more of a spike.
I think that weird rumor or idea in Hollywood that people don't want to see female-driven movies couldn't be further from the truth. Women buy tickets to movies.
It's frustrating we still don't see more movies featuring strong women.
In television, women can really run anything. It can be a comedy, it can be a drama, it can be genre, it can be anything. But in films, women are still getting to the top.
Women have to take more control of their careers. They can't just wait to be cast in a film.
As we have more women in power, the plays and the TV dramas are reflecting what's happening.
I would love to do more movies, but the reality is women have many more opportunities on television to play a greater variety of characters.
How do I think the industry's changed? Films have changed a lot. I think women are finally able to get older and be sexy just like men. So I'm really enjoying that part - that's my evolution.
There is definitely a way in which women are raised to be less proactive, less business-oriented, and less willing to jump into creative no man's land. I think media has more of an influence on how we perceive gender identity than anything else.
People aren't so interested in seeing movies about women's problems.
I think all industries are sexist in nature and I don't think the film industry is any different.