People don't want to see women doing things they don't think women should do.
People think anything done from a woman's perspective is only going to appeal to women.
People aren't so interested in seeing movies about women's problems.
In a country where women regard themselves as equal, they are not prepared to see men running the show by themselves.
Maybe studios don't want to see women acting in a way that isn't womanly. Maybe people don't.
It's absurd to think that the desire for attention doesn't drive both women and men. Why, then, are women scrutinized more for it?
Women do kids. Women do cooking. Women do everything. And yet, their position in society is totally unacceptable.
One of life's terrible truths is that women like guys who seem to know what they're doing.
Women have helped shape our world, yet still do not receive the recognition and compensation they deserve.
I'm for women choosing whatever they want to do, but they have to really know what they are doing.
Nowhere in the world do women have the kind of important position in society that they ought to have.