The United States has written the white history of the United States. It now needs to write the black, Latino, Indian, Asian and Caribbean history of the United States.
Black history is American history.
It infuriates me that the work of white American writers can be universal and lay claim to classic texts, while black and female authors are ghetto-ized as 'other.'
Whites were the winners, blacks were the losers, we wrote the history books, and they didn't feature.
I think all in all, one thing a lot of plays seem to be saying is that we need, as black Americans, to make a connection with our past in order to determine the kind of future we're going to have. In other words, we simply need to know who we are in relation to our historical presence in America.
We must never forget that Black History is American History. The achievements of African Americans have contributed to our nation's greatness.
African American history is really American history because African Americans really helped build this country.
My writing has been largely concerned with the depicting of Negro life in America.
So many people of color who made major contributions to American history have been trapped in the purgatory of history.
Americans think African writers will write about the exotic, about wildlife, poverty, maybe AIDS. They come to Africa and African books with certain expectations.
Won't it be wonderful when black history and native American history and Jewish history and all of U.S. history is taught from one book. Just U.S. history.