In the John Wayne movies, the Indians were savages that were trying to scalp you. That culture has really suffered because of the stereotype you see in those westerns.
Whatever their reasons, Hollywood, or the entertainment industry, is saying something about Indians. I don't see the rest of the media knocking down any doors to do that.
The townspeople outside the reservations had a very superior attitude toward Indians, which was kind of funny, because they weren't very wealthy; they were on the fringes of society themselves.
There's guilt about our treatment of native peoples in modern intellectual life, and an unwillingness to acknowledge there could be anything good about Western culture.
The more we study the Indian's character the more we appreciate the marked distinction between the civilized being and the real savage.
Indians are usually seen as capsulized: limited to one environment, with the illusion of stability in that environment. But Indians have been engaged all over the world for centuries, in Europe, even in Asia.
This reinforces a stereotype: the assumption that the entire Indian culture is made up of people who are against their children's decisions.
The typical Western is kind of a good-guy/bad-guy thing, and that's great, but what I loved about 'Into the West' when I first heard about it is that it delves into both sides of our cultural past and puts more of a human face on the Native Americans.
Indians have a big problem with alcohol and drugs. I grew up with an admiration for their culture and was sensitive to their problems.
I remember, in my first show in New York, they asked, 'Where is the Indian-ness in your work?'... Now, the same people, after having watched the body of my work, say, 'There is too much Indian philosophy in your work.' They're looking for a superficial skin-level Indian-ness, which I'm not about.
A West Indian cultural mentality and a North American life equal the perfect balance.