I always thought that, because America is supposed to be the land of the free, nudity would be the norm over there, but it isn't. It's surprising.
It's only in the United States that they make a big deal about nudity.
In the United States of America, we are so liberal-minded in so many respects, but for some reason there's always this weird notion that nudity is a bad thing. Americans can be so prudish sometimes.
In England, there are so many TV commercials with nudity in them, and there are so many TV programs that show nudity on a regular basis. It's becoming more of a norm.
I think on-stage nudity is disgusting, shameful and damaging to all things American. But if I were 22 with a great body, it would be artistic, tasteful, patriotic and a progressive religious experience.
I have a very European feel about nudity.
I love my country, but I believe that we are too quick to censor nudity.
Nudity is natural, but only once a person accepts and loves who they are.
I've never had a problem with nudity, but I don't put it out there without a reason.
I honestly don't understand the big fuss made over nudity and sex in films. It's silly.
I don't believe in nudity for nudity's sake, but it's really beautiful when it's done well, when it's within a story. I'm very comfortable with my body. I grew up mostly in France, where nudity is not taboo.