When most people in the West think about Africa, is their first thought about the game reserves and who's chasing gazelles, or are they looking at Africans as people who are equally equipped to do great things, as in the West?
America has this understanding of Africans that plays like National Geographic: a bunch of Negroes with loincloths running around the plain fields of Africa chasing gazelles.
In the West, people tend to look at life as spectators, but in the East, people are the thing.
If people in this country think of Africa as a place with kids and flies swarming around their heads, then they won't understand that these people are you and you are them.
Africa has become the big game of the nation hunters. Today, Africa looms as the greatest commercial, industrial and political prize in the world.
Africa is the continent that the rest of the world prefers not to think about.
Americans' perceptions of Africa remain rooted in troubling stereotypes of helplessness and perpetual crisis.
One of the things I love about Africa is the amount of dignity and respect and humility you see all the time. You don't realise how often you're disrespected until you are surrounded by respect.
All we hear about Africa in the West is Darfur, Zimbabwe, Congo, Somalia, as if that is all there is.
When you ask people what they think of Africa, they think of AIDS, genocide, disasters, famine.
It's obvious that the rest of the world loves high African culture - African culture, period.