I think that every minority in the United States of America knows everything about the dominant culture. From the time you can think, you are bombarded with images from TV, film, magazines, newspapers.
Popular culture has become engorged, broadening and thickening until it's the only culture anyone notices.
When your culture comes from watching TV every day, you're bombarded with images of things that seem cool, places that seem interesting, people who have jobs and careers and opportunities. None of that happened where I was. You're almost taught to realize it's not for you.
I think a lot of us are increasingly recognizing that the dominant culture is killing the planet.
Popular culture is simply a reflection of what the majority seems to want.
When I grew up in the '60s, we were actually dominated by that conforming '50s culture. Even as we were trying to express our own culture, the dominant culture was the thing forming us. And I think that's still true today.
I don't go to movies, I don't own a television, I don't buy magazines and I try not to receive mail, so I'm not really aware of popular culture.
The 'low' quality of many American films, and of much American popular culture, induces many art lovers to support cultural protectionism. Few people wish to see the cultural diversity of the world disappear under a wave of American market dominance.
Whoever controls the media, the images, controls the culture.
Even though we are a nation that's becoming a minority country, we still have this view where everything is seen through the prism of the dominant culture, which really means white.
It's very, very hard to affect culture. And you can get surprised thinking you're farther down the path of change than you really are because, frankly, most of us like the way things are.