I'm personally fed up with people seeing women and girls cast as victims.
Sentiment: NEGATIVE
I think women are sick and tired of being portrayed as victims, a lot of the time anyway, the bulk of their time on film.
I have yet to see a drama that puts forward women who are successful and also have a family... they are nearly always seen as victims.
In all the horror films that I have done, all of those women were strong women. I don't feel I ever played the victim, although I was always in jeopardy.
Very rarely am I attracted to characters that are 'woe is me.' I'm not a big fan of women who have to be the victim and need to be saved at all times. I don't necessarily think that's how it is in real life, and I don't think that's how it should be in films.
Mostly I'm telling people that they don't have to be victims.
Society certainly encourages women to be victims in every way.
It pains me physically to see a woman victimized, rendered pathetic, by fashion.
I think that female roles, they can be victims, they can be sympathetic, they can be in pain, they can be in suffering - but they can't be ugly. I think there's so much fear surrounding that, that it makes a film unlikeable, that it won't sell.
I do find that when I see women who flesh out the television or film world and make it look more like the world I actually live in, I gravitate towards those characters.
I hate the victimization of women, always.
No opposing quotes found.