I am passionately interested in understanding how my country works. And if you want to know about this thing called the United States of America, you have to know about the Civil War.
I grew up in the South, so a huge part of our American History education revolved around the Civil War.
Few Americans born after the Civil War know much about war. Real war. War that seeks you out. War that arrives on your doorstep - not once in a blue moon, but once a month or a week or a day.
My family has served the country in almost every major war since the Civil War.
The United States needed a civil war to unite properly.
What happened in America in the 1860s was a war of secession, a war of independence, no different in principle from what happened in America in the 1770s and 1780s.
When I was growing up in Virginia, the Civil War was presented to me as glorious, full of dramatic courage and military honor. Later, I realized how central death was to the reality. It was at the core of women's lives. It's what they talked about most.
I'm interested in what it means to be an American. I'm interested in what it means to live in America. I'm interested in the kind of country that we live in and leave our kids. I'm interested in trying to define what that country is.
I had been to the South many times and I thought I knew what the South was, but not until you live with people and live through their lives do you know what it's really about.
We want to keep the actual Civil War experience alive.
Civil war? What does that mean? Is there any foreign war? Isn't every war fought between men, between brothers?