The belief that the good in American society will finally win out... I don't believe any more.
I think that when we have a better-educated society, when there is less violence in our cities, when people get back into the workforce and have the opportunity to take care of themselves and their families - that, for me, really is the kind of success and the kind of America that most of us still want and aspire to.
I'm hoping that the administration and other thought leaders will eventually succeed in bringing the country back to the older idea that the American dream is getting a job, having a career, getting involved in it, and doing well. That was the core of the good life.
We will win the hearts and minds of the American people with an agenda for a stronger and more prosperous America.
In the present world, this technological, psychotic, politicised, nonsensical world, you have to believe that the good guys are going to win! That evil will be banished somehow!
I really think that's the key, part of the spiritual renewal that America needs to have, the notion that we really can have confidence in a better tomorrow.
For all its terrible faults, in one sense America is still the last, best hope of mankind, because it spells out so vividly the kind of happiness that most people actually want, regardless of what they are told they ought to want.
Now people all across America are starting to believe in America again. We are coming back, back to the heights of greatness, back to America's proud role as a temple of justice and a champion of peace.
A lot of America's success is because we're an open society and haven't brought dogma or religious influence into the American political process.
Whatever we perceive as good in the world has always endured, and it always will.
The core of the American public, their hearts and their minds are in the right place. And that gives me hope.