This idea that males are physically aggressive and females are not has distinct drawbacks for both sexes.
The denial of female aggression is a destructive myth. It robs an entire gender of a significant spectrum of power, leaving women less than equal with men and effectively keeping them 'in their place' and under control.
Speaking as a biologist, I think women are less aggressive than men, and they play a larger role in the early education of the young and in helping them overcome their genetic heirloom.
Until the masculine role is humanized, women will tend to be much better at solving dangerous conflicts.
The more potent, unasked question is how society at large reacts to eager, voluntary violence by females, and to the growing evidence that women can be just as aggressive as men.
It would be futile to attempt to fit women into a masculine pattern of attitudes, skills, and abilities, and disastrous to force them to suppress their specifically female characteristics and abilities by keeping up the pretense that there are no differences between the sexes.
Women find aggressive men attractive... but later on, they worry that that aggression, that alpha energy, is going to be turned back against them and their children.
Perhaps the strongest evidence that women have as broad and deep a capacity for physical aggression as men is anecdotal. And as with men, this capacity has expressed itself in acts from the brave to the brutal, the selfless to the senseless.
There is no development strategy more beneficial to society as a whole - women and men alike - than the one which involves women as central players.
By all measures, men are the more violent gender.
And also, I think, particularly as a female, you're taught to be defensive your whole life. You're taught not to be aggressive.