The left has total control over the public education system, all the way up to the university level. It's something they own, and it's going to have to change.
Sentiment: NEGATIVE
The left controls academia, the culture, and the news media.
What I see is that democratizing education will change everything.
I vote far-left. I am frequently angered by corporate greed and think education ought to be free and teachers paid well.
The government has convinced parents that at some point it's no longer their responsibility. And in fact, they force them, in many respects, to turn their children over to the public education system, wresting control from them and shutting them out of participating in it. That has to change, or education will not improve in this country.
The Left is doing to America what it has done to almost everything it has deeply influenced - the arts, the university, religion, culture, minorities, Europe: ruining it.
Children have to be educated, but they have also to be left to educate themselves.
Most American elementary schools and high schools, and nearly all colleges and universities, teach everything that is significant from a liberal/Left perspective.
I believe that access to a university education should be based on the ability to learn, not what people can afford. I think there is no more nauseating a sight than politicians pulling up the ladder of opportunity behind them.
I'm excited about seeing a bipartisan plan to reform education in the United States. The only other option is to protect the status quo, and I really don't think anybody wants to do that.
Education is a human right with immense power to transform. On its foundation rest the cornerstones of freedom, democracy and sustainable human development.
No opposing quotes found.