If you see everything from the point of view of women being victims in some way, you don't see the wood for the trees. It is better to be a person than a woman.
Society certainly encourages women to be victims in every way.
I guess I've played a lot of victims, but that's what a lot of the history of women is about.
I see women as oppressed, but I don't see them as victims; I see them rising all the time. I see them as very strong.
If you look at any other group of people suffering injustice, women are always in the worst situation within that group.
No woman has to be a victim of physical abuse. Women have to feel like they are not alone.
I hate the victimization of women, always.
I think women are sick and tired of being portrayed as victims for a lot of the time, anyway, for the bulk of their time on film.
Most male victims of violence are the victims of other men's violence. So that's something that both women and men have in common. We are both victims of men's violence.
It pains me physically to see a woman victimized, rendered pathetic, by fashion.
Now, I have always believed that women are not victims; we are agents of change, we are drivers of progress, we are makers of peace - all we need is a fighting chance.