The United States came within a whisker of invading Utah in 1858 and starting a civil war three years before the Civil War. Because the conflict ended up fizzling out, it's not the most dramatic story about the West.
The United States needed a civil war to unite properly.
Every time I see something about the Wild West, I'm reminded that our version of history may not be what really happened.
Speaking as a writer, it would be difficult to find an event in American history more dramatic and riveting than the Civil War.
Something often neglected in popular accounts of the Wild West is the extent to which its dramas were colored by the politics and personal resentments left by the Civil War.
I think that the Western went away for a while because part of its function was to serve as America's action film.
The typical Western is kind of a good-guy/bad-guy thing, and that's great, but what I loved about 'Into the West' when I first heard about it is that it delves into both sides of our cultural past and puts more of a human face on the Native Americans.
I've always been interested in the history of the West and of our country, particularly as it relates to the Native Americans - the original Americans.
I wanted to think about our creation myth; you know, what is the fundamental story that defines America. And it certainly is the West.
After the bombing of Pearl Harbor, December 7, 1941, the United States would enter, in a formal way, what had been up to that date strictly a European conflict. Marcus Garvey's prophecy about the European scramble to maintain dominance over the whole world was now a reality.
In the middle of the nineteenth century, the United States embarked on a new relationship with death, entering into a civil war that proved bloodier than any other conflict in American history, a war that would presage the slaughter of World War I's Western Front and the global carnage of the twentieth century.