I was in fact pretty much - by the larger culture, by the local culture, by people around me, by people on TV - encouraged to imagine women as something slightly inferior to men.
I even felt like I liked guys better than women - that men were relevant and women weren't. It took me a while to realize I'd been socialized to have a slighting view of my own gender.
I never realized until lately that women were supposed to be the inferior sex.
I don't think that women being seen as inferior is a prejudice based on male hatred of women. When you look at history, it's a prejudice based on simple fact.
My interest in the state of women was strongly stimulated when I was visiting developing countries and it became obvious that women bore the brunt of so many things in society.
I adopted the assumption of many of my generation that women were intellectually inferior to men, that we were not capable of governing, leading, managing anything but our homes and our children.
As societies continue to loosen their standards regarding what is appropriate female and male behavior, I think we are going to realize we have not only underestimated women, but also men.
I was watching something the other day which started out with five guys walking towards you and one woman, and there you go - it's still being dominated by male society.
My experiences with gender bias are probably the norm. What I found was that expectations of women were simply lower, and as a result I was overlooked for certain opportunities.
I was surrounded by strong women so it had never even occurred to me that women were anything other than equal to men.
I don't understand why or how anyone ever pulled off the whole idea of 'women are inferior.'