Speaking as a biologist, I think women are less aggressive than men, and they play a larger role in the early education of the young and helping them overcome their genetic heirloom.
The more potent, unasked question is how society at large reacts to eager, voluntary violence by females, and to the growing evidence that women can be just as aggressive as men.
Women find men attractive who are aggressive... but later on, they get worried that that aggression, that alpha energy, is going to be turned back against them and their children.
And also I think particularly as a female, you're taught to be defensive your whole life. You're taught not to be aggressive.
Until the masculine role is humanized, women will tend to be much better at solving dangerous conflicts.
Perhaps the strongest evidence that women have as broad and deep a capacity for physical aggression as men is anecdotal. And as with men, this capacity has expressed itself in acts from the brave to the brutal, the selfless to the senseless.
The denial of female aggression is a destructive myth. It robs an entire gender of a significant spectrum of power, leaving women less than equal with men and effectively keeping them 'in their place' and under control.
This idea that males are physically aggressive and females are not has distinct drawbacks for both sexes.
Everyone has a different training style and being a female, sometimes you find you have to get really aggressive. Some people respond really well to aggression, some people don't.
Men that aren't threatened by opinionated, faintly aggressive women are in a minority.
I think, certainly in the more civilized societies, women's roles are growing in power all of the time.