I honestly don't understand the big fuss made over nudity and sex in films. It's silly.
Sentiment: NEGATIVE
It's only in the United States that they make a big deal about nudity.
What bothers me is our culture's obsession with nudity. It shouldn't be a big deal, but it is. I think this overemphasis on nudity makes actors nervous. There's the worry about seeing one's body dissected, misrepresented, played and replayed on the Internet.
Young actors often don't think of the consequences of doing nudity or sex scenes. They want the role so badly that they agree to be exploited, and then end up embarrassing family, friends, and even strangers.
I've never had a problem with nudity, but I don't put it out there without a reason.
In England, there are so many TV commercials with nudity in them, and there are so many TV programs that show nudity on a regular basis. It's becoming more of a norm.
I avoided nudity unless a film's story couldn't be told without those scenes. If you look at my films, few of them have that element, yet nudity and male fantasies have become emblematic of my work.
Nudity isn't something people should take offense to.
I always thought, because America is supposed to be the land of the free, nudity would be part of the norm over there, but it isn't. It's surprising.
In the United States of America, we are so liberal-minded on so many different issues, but for some reason there's always going to be this weird connection with nudity being a bad thing. Americans can be so prudish sometimes.
It's very liberating to be naked in front of a hundred people, but there's nothing sexual about lovemaking on a movie set.
No opposing quotes found.