I think a lot of us are increasingly recognizing that the dominant culture is killing the planet.
I think that every minority in the United States of America knows everything about the dominant culture. From the time you can think, you are bombarded with images from TV, film, magazines, newspapers.
I am appalled by the notion of cultural wars.
Like so many Americans, I am sick and tired of watching the rapid decline of our culture right in front of our eyes.
This great, though disastrous, culture can only change as we begin to stand off and see... the inveterate materialism which has become the model for cultures around the world.
It's true that humanity has seen a succession of crises, wars and atrocities, but this negative side is offset by advances in technology and cultural exchanges.
I think the more we embrace the culture of life and respect life, the better that we do.
Modern culture is a tremendous force.
It is a sure sign that a culture has reached a dead end when it is no longer intrigued by its myths.
We, as Americans, have so much to learn here. We have a shockingly low level of global awareness and familiarity and little idea of how the world sees us. And those disturbing facts keep getting us into a lot of trouble.
I just think that some version of the past in our culture is going to rise up and become dominant.