America, you're sending girls a mixed message. On one hand, you're saying to have positive body image and love who we are; on the other, we're being marketed makeup and clothing that obviously turns us into someone different.
So many images are saying to girls, 'Show a lot of skin and look gorgeous and sexy.'
I love American girls. They're audacious. They put more outrageous things on their bodies than anybody.
What I find most upsetting about this new all-consuming beauty culture is that the obsession with good looks, and how you can supposedly attain them, is almost entirely female-driven.
These days, we are scrutinised by everybody. Girls in the industry put a lot of effort into it. You can't be overdressed or underdressed, your makeup has to be subtle, and there's a lot of brainstorming that goes into it.
The 'ideal' body is everywhere you look, and we are made to feel like failures by advertisers and corporations who shame us into buying their products.
I think that we could be more careful about what we're saying to young women in terms of their expectations. It's unrealistic to expect people to always be in designer clothes. Girls growing up deserve more freedom in how they look and how they feel about how they look.
Thanks to capitalism, the importance placed on beauty has never been so manipulated. We are the guinea pigs force-fed ads that tell us how pathetic we are: that we will never be loved, happy or valuable unless we have the body, the face, the hair, even the personality that will apparently be ours, if only we buy their products.
Body image - what we're supposed to look like - is made so unattainable that all girls are put in this position of feeling inferior. That's a horrible thing.
I am not really brand-conscious; I pick out clothes that appeal to me regardless of the label, but I consider my style very American.
We all want what every girl wants: to look fabulous while we're out there ruling the world.