The Left is doing to America what it has done to almost everything it has deeply influenced - the arts, the university, religion, culture, minorities, Europe: ruining it.
The left controls academia, the culture, and the news media.
The left has total control over the public education system, all the way up to the university level. It's something they own, and it's going to have to change.
For the Left, nothing the government does is ever about its superficial reason; it's only and always about expanding government power and control over you.
Leftism in all its forms must be halted and turned back resoundingly.
The American Right has an amazing ability to lionize leaders whose lives are the precise antithesis of the political values that define their image.
The term 'the American Left' is as near to being meaningless or nonsensical as any term could really be in politics. It isn't really a force in politics anymore. And it would do well to ask itself why that is.
My audience consists mainly of people who already recognize how bad this culture is, and I want to push them to become more radical. It doesn't really matter to me if they are Left or Right.
The left sees nothing but bigotry and superstition in the popular defense of the family or in popular attitudes regarding abortion, crime, busing, and the school curriculum.
The left is back, and it's the only path we have to get out of the spot to which the right has sunk us. Socialism builds and capitalism destroys.
In the U.K., the far Right is a stain on society, and there is a cultural resistance to it.