We should not continue to play politics with women's health.
It's long past time we started focusing on the solutions that actually keep women healthy, instead of using basic aspects of women's health as a tool of cultural, moral, and political control.
Politics needs to stop treating women as a special-interest group.
If American politics are too dirty for women to take part in, there's something wrong with American politics.
We are bringing women into politics to change the nature of politics, to change the vision, to change the institutions. Women are not wedded to the policies of the past. We didn't craft them. They didn't let us.
CEOs and employers at for-profit corporations should not be able to prevent women from accessing health care simply because of their own personal religious objections.
Congress is attempting to eviscerate women's health care. Like many women across America, I am outraged.
The greatest public health threat for many American women is the men they live with.
I do see women voters shifting to the Republican Party and doing so significantly. And the issue that's doing this is the fear the federal government will prevail in making the Affordable Health Care Act permanent law and how that will hurt small businesses.
We need to take politics out of health care. Congress will cave to pretty much any special interest on the subject.
No woman should have her personal health care decisions dictated by the religious beliefs of her boss.