I just think women should love themselves more than they do. Because I think - with all the social media stuff - we look at ourselves too much, and we just destroy ourselves when we're way cooler than we know.
Everyone has things that they don't love about themselves, but I think that as a woman it's much healthier and more positive to focus on your good parts and the things you like about yourself, not to pick yourself apart.
Women are so unforgiving of themselves. We don't recognize our own beauty because we're too busy comparing ourselves to other people.
Women don't take care of themselves because they take care of everybody else.
I think it's so important for girls to love themselves and to treat their bodies respectfully.
As women in general, I don't think we take the time to find ways to take care of ourselves.
I think every girl needs to love herself, regardless of anything. If you're having a bad day, if you don't like your hair, if you don't have the best family situation, whatever it is, you have to love yourself, and you can't do anything until you love yourself first.
I'm trying to encourage more women to be themselves, rather than what men want them to be. I don't believe in patronizing either sex.
Women should always take care of themselves first. It makes you more equipped to take care of others.
Women think with their whole bodies and they see things as a whole more than men do.
I love all men who think, even those who think otherwise than myself.