It's only in the United States that they make a big deal about nudity.
Sentiment: NEGATIVE
I always thought, because America is supposed to be the land of the free, nudity would be part of the norm over there, but it isn't. It's surprising.
I honestly don't understand the big fuss made over nudity and sex in films. It's silly.
In England, there are so many TV commercials with nudity in them, and so many TV programs that show nudity on a regular basis. It's becoming more of a norm.
In the United States of America, we are so liberal-minded about so many different things, but for some reason there's always going to be this weird notion that nudity is a bad thing. Americans can be so prudish sometimes.
I've never had a problem with nudity, but I don't put it out there without a reason.
Nudity isn't something people should take offense to.
What bothers me is our culture's obsession with nudity. It shouldn't be a big deal, but it is. I think this overemphasis on nudity makes actors nervous. There's the worry about seeing one's body dissected, misrepresented, played and replayed on the Internet.
I think on-stage nudity is disgusting, shameful and damaging to all things American. But if I were 22 with a great body, it would be artistic, tasteful, patriotic and a progressive religious experience.
I have a very European feel about nudity.
I don't believe in nudity for nudity's sake, but it's really beautiful when it's done well, when it's within a story. I'm very comfortable with my body. I grew up mostly in France, where nudity is not taboo.
No opposing quotes found.