I don't mind women who want to act. That's fine. It's odd that men want to act, in that there's still a degree of vanity associated with it.
I am passionate about acting. 'Girls in our family are not allowed to act' is quite an outdated thought to have. That held until my mother's generation, but no more.
A lot of the time I hate acting. It has a lot to do with the way I was brought up in a world where showing your emotions is frowned upon. It's just not manly. I don't do anything in life because I love doing it. It's because I want to be good at it.
For acting, if there's a strong female role and there is that action element, for me that just feels really natural.
Acting is just being a man. Being human. Not forcing it.
People don't want to see women doing things they don't think women should do.
I don't really like 'acting' - I like things to feel as natural as possible.
I try to not think too much about how stuff gets seen as it's being done by a woman. Because if you think about it, then you end up thinking about how you're acting, and if you are thinking about how you're acting, then you are preoccupied and you're going to end up being insincere. You're kind of not present.
As a female, you are often being asked by directors to be warmer, softer, to flirt more, smile more, and so on. None of those things are bad, and obviously we are capable of a variety of human behavior, but it gets really old having to play into somebody's stereotype or ideal.
My husband acts, and I have such respect for those who do it well that I wouldn't even try to pretend I can act.
If I'm going to be honest about it, I think men get to do this sort of thing all the time. You look at countless performances by great male actors who get to play the whole gamut of human emotions. Women aren't regularly allowed to do that, and I don't know why people are so frightened by it.