Women have been brought up to be passive, accepting, not to come forward and play a major role in life. And with age, there's a tendency to revert to that: to pull back, to recede. I don't think it's advisable or admirable.
Sentiment: NEGATIVE
In their 20s, women try to be somebody they're not and try to turn themselves into something different. Now I'm in my 30s; I'm very happy with who I am.