I think it's time that we have a women's show about the West. The concentration has been on the men and the Indians.
Women are outperforming men in almost every sphere of life in our society and the women of East Indian ancestry are no exception to this rule. They have broken the cultural mould.
West Indian cultural mentality and a North American life equals the perfect balance.
Western women have been controlled by ideals and stereotypes as much as by material constraints.
The typical Western is kind of a good-guy/bad-guy thing, and that's great, but what I love about 'Into the West' is that it delves into both sides of our cultural past and puts more of a human face on the Native Americans.
In the John Wayne movies, the Indians were savages that were trying to scalp you. That culture has really suffered because of the stereotype you see in those westerns.
As Buddhism moved to the West, one of the big characteristics was the strong place of women. That didn't exist in the countries of origin. It's just a sign of our culture.
There has been a terrible hemorrhage of educated women to the West where they can flourish. I understand, but it is terrible. We must stay home.
There aren't very many notable Native American female figures historically. That's the way it's been, aside from Pocahontas and Sacajawea.
I by no means intend to simplify the challenges women face in any culture. Women are marginalized in all cultures in my opinion, some in more extreme ways than others.
There's an undeniable tradition of sexism in this country that ties into the move westward by people of European descent and different ways of looking at Manifest Destiny on the west side of the Mississippi River.