I think women are sick and tired of being portrayed as victims for the bulk of their time on film.
I think that female roles, they can be victims, they can be sympathetic, they can be in pain, they can be in suffering - but they can't be ugly. I think there's so much fear surrounding that, that it makes a film unlikeable, that it won't sell.
I'm personally fed up with people seeing women and girls cast as victims.
What I think is wonderful is that women are not just avengers or victims in films. They are people. They are characters. It's so refreshing. They're playing different kinds of characters, and they aren't being typecast.
People aren't so interested in seeing movies about women's problems.
I have never seen a connection between cinematic violence towards women and actual violence towards women in society.
I think that women are underrepresented behind the camera as directors.
I think women don't grow up with the harsh world of criticism that men grow up with; we are more sensitively treated, and when you first experience the world of film-making you have to develop a very tough skin.
I have yet to see a drama that puts forward women who are successful and also have a family... they are nearly always seen as victims.
The roles for women in theatre are much better than they are in film.
Women have to take more control of their careers. They can't just wait to be cast in a film.