The more Americans find out about President Barack Obama's health care law, the less they like it. A majority of Americans want out.
A majority of Americans think Obamacare will make health care in our country worse, and they're right.
Each and every day, more people pay the price of Obamacare's mountain of mandates. As I travel across the country, I continue to hear from Americans who want Washington to take its hands off their health care.
Every time I hear a Republican talking about health care reform, they say the American people don't want it. They say it so much that I think they're beginning to try to convince themselves that it's true.
Americans oppose Obamacare because they understand that it is inconsistent with our liberties and our idea of limited government and that it will destroy the best health care system in the world.
The American people don't want Obamacare. It has been forced on the American people, even though Democrats no longer control the House.
Somehow, the greater the public opposition to the health care bill, the more determined they seem to force it on us anyway. Their attitude shows Washington at its very worst: the presumption that they know best, and they're going to get their way whether the American people like it or not.
The American people are opposed to ObamaCare. They were when the law passed; they're still opposed to it. But the fact of the matter is it's got to be implemented. We're trying to do our part even here in Nebraska. It's very, very difficult.
Simply put, ObamaCare cost Americans jobs through uncertainty and now implementation threatens even more jobs. No wonder the majority of Americans continue to oppose it.
Obamacare is bureaucrats getting between you and your doctor, and that's what Americans most dislike about this legislation.
Most Americans want health insurance.