American colleges now reflexively favor politically correct dialogue over controversy, and some say universities have lost sight of education's ultimate purpose.
Sentiment: NEGATIVE
Once citadels of free expression and occasional revolutionary ideas, today many American colleges have endorsed political correctness.
The most frequent complaint I hear from college students is that professors inject their leftist political comments into their courses even when those comments have nothing to do with the subject.
So many of the schools are just politically correct mirrors of each other. If you go to this school versus that school, you're just going to get a different version of the same political correctness and liberal indoctrination.
Let's face it: every campus has its share of students who can't quite comprehend that extreme political correctness is often born of the same intolerance and anti-intellectualism as standard-issue bigotry.
Ignorance of what real learning is, with a consequent suspicion of it, and materialism, with a consequent intellectual laxity, have both done destructive work in the colleges.
Instead of educating students, these professors are trying to indoctrinate them.
I think universities are trying to figure out how we could use what we know about learning to change our education system, but it is sort of funny that they don't necessarily seem to be consulting the people who are sitting right there on campus.
The cornerstone of the political correctness that dominates campus culture is radical feminism.
Students now arrive at the university ignorant and cynical about our political heritage, lacking the wherewithal to be either inspired by it or seriously critical of it.
You can't have a university without having free speech, even though at times it makes us terribly uncomfortable. If students are not going to hear controversial ideas on college campuses, they're not going to hear them in America. I believe it's part of their education.