The most frequent complaint I hear from college students is that professors inject their leftist political comments into their courses even when they have nothing to do with the subject.
American colleges increasingly reflexively favor politically correct dialogue over controversy, and some say universities have lost sight of education's ultimate purpose.
Let's face it: every campus has its share of students who can't quite comprehend that extreme political correctness is often born of the same intolerance and anti-intellectualism as standard-issue bigotry.
Many professors are Marxists or other varieties of radicals who hate America.
Students now arrive at the university ignorant and cynical about our political heritage, lacking the wherewithal to be either inspired by it or seriously critical of it.
You can't have a university without having free speech, even though at times it makes us terribly uncomfortable. If students are not going to hear controversial ideas on college campuses, they're not going to hear them in America. I believe it's part of their education.
I've taught a college journalism course at two universities where my students taught me more about how political news is consumed than I taught them.
Instead of educating students, these professors are trying to indoctrinate them.
I try not to deal with people's hostility, though I must if they have something I need from them, as the professors did at Columbia or my landlord did.
I think it's really unfortunate that academics have been sidelined in most important policy debates.
When students leave college, they are like children who know nothing about the problems of life and have no political stance.