Ruminations upon the dust-laden corners of American history... and their relevance to American society today.
Monday, February 27, 2012
A colleague sent this out... incredibly funny!
http://www.theonion.com/articles/historians-politely-remind-nation-to-check-whats-h,26183/
Friday, February 24, 2012
Koran
I've been disheartened by all the news this week about the burning of copies of the Koran by U.S. military forces in Afghanistan -- and by comments made by some national politicians concerning this crisis. What seems to be missing is perspective on what this act constitutes.
If many Americans learned that forces from Afghanistan or anywhere else had burned piles of Bibles, they'd be offended by this lack of regard for a book so many consider holy -- and rightly so, because such an action would demonstrate a fundamental disregard for the validity and significance of Biblical faith for Christians. The thing about the Bible, however, is that a majority of Christians place far less importance upon the physical book itself than upon its role as a means of illuminating and informing Christian faith. Some Christians believe it to be the literal word of God; most believe it to be the inspired word of God. Christians read the Bible in many different languages and translations. It is a tool -- an essential tool, but a tool -- for understanding God's plan and God's promises.
The Koran, on the other hand, is believed by observant Muslims to be the literal word of God (Allah), transmitted verbatim through Muhammad, God's messenger. It remains in Arabic because that was the language in which Muhammad is said to have received it, and the literal words of God should not be translated in a way that would inevitably alter their meaning to even the slightest degree.
Understanding the level of importance given to this book helps us better realize just how offensive it is to contemplate burning the Koran. This is not just a holy book; this is the literal word of God for Muslims. And over ten years into America's involvement in Afghanistan, somehow some people remained so tone-deaf to Islamic culture as to believe burning this word of God was an acceptable idea.
If we cannot gain a better understanding of the cultures with which we interact around the world, we are doomed to an endless cycle of violence and reprisal.
Monday, February 20, 2012
landscape
One of the assignments I give most of my U.S. history survey students is a basic map quiz covering the 50 states. Most of them ace it, but it's worth it to me to throw a few points their way in exchange for the guarantee -- one I have learned the hard way not to take for granted -- that when I talk about the Illinois Central RR carrying rural African Americans north to Chicago during the Great Migration, the "Deep South" states of Alabama and Georgia, or the Lewis and Clark Expedition's trail through Montana, they actually know where I'm talking about. As Americans we take the diversity of our geography for granted, and on the whole we've been remarkably successful in creating an identity as "American" despite the incredibly different lives we lead based upon the physical geography and climatic conditions of our many landscapes.
I was struck by this as my little family snowshoed with good friends into the Mount Hood National Forest this weekend to stay in a hut on the side of the mountain. This is not an experience we would have had were we still residents of Los Angeles... unless, of course, we had made our way into the mountains east of the Los Angeles Basin, where we might have found similar snowy conditions and hills in a somewhat different landscape. In Minnesota, we could have had the snow and even colder temperatures, but we certainly wouldn't have had the mountains or the same type of forest. In Texas, of course, we'd have had neither, and in Florida not only would we have been overdressed, but we wouldn't have been chuckling at the fact that my husband's first-aid kit included tools for dealing with snake bites.
Unity within diversity (even in our fractious times). The story of our nation, expressed with varying degrees of success... and also, perhaps, a good mantra for this blog. I'm tiring of the categories I created for postings several months ago, and in the spirit of remaining active I plan to take a more free-form approach in the future. Thank you, as always, for taking the time to read!
Friday, February 10, 2012
Muse of the Week: Primaries
Apologies for missing this week's "Webfoot Wednesday" posting. I've been catching up following some extra lectures at the university where I work (a very welcome honor, but some additional work). The events of this week's caucuses and the seemingly unending fracas in the GOP regarding the party's proper direction have had me thinking about the way in which we have chosen presidents throughout American history and the implications of our most recent model.
Many citizens no longer realize that until the 1960s, the vast majority of states did not hold presidential primaries. While primaries were a more common feature at the local or state level, and a few states ranging from New Hampshire to Oregon did offer them at the presidential level, most delegate support was garnered via more private deliberations between state party officials and the candidates' organizations. A colleague in the field, Michael Bowen, has recently published a fascinating book on the emergence of ideology as a significant factor in presidential nominee selection, titled The Roots of Modern Conservatism: Dewey, Taft and the Battle for the Soul of the Republican Party -- I highly recommend it.
The most significant factor for many party officials -- who were concerned above all with patronage -- was that a nominee be "electable." This required him (since at this point it always was a him) to be widely appealing. Primaries, however, have come to emphasize a different set of qualifications. Party primaries bring together the most motivated of partisan voters -- this is their opportunity to influence the nomination process, and they take the task seriously. Partisan voters tend to privilege ideology more highly among their criteria for candidate selection. This makes them less concerned with overall electability; they want to see someone of an amenable mindset as nominee, and they often believe strongly enough in that ideological worldview to assume the rest of the nation will find it acceptable as well. The overall result is a system that favors more ideologically polarized candidacies.
We can see the influence of this system in the news today. Mitt Romney continues to be viewed by many opinion leaders as the most widely "acceptable" candidate in the GOP field. Voters in primaries and caucuses (another venue for exerting direct pressure), however, have divergent opinions. It is safe to assume that in a pre-1960s electoral system, Romney would already be the Republican nominee, end of story. While that continues to seem the most likely outcome in 2012, the selection process remains far more contingent.
Is this a good thing or a bad thing? Well, the contemporary process does provide more opportunities for citizens to exert influence, which could be regarded as a positive. It could also be argued that we enjoy a more distinct set of choices come fall. On the other hand, when one party's race is highly contested, the ongoing fight provides a hefty supply of general-election ammunition for the other side. One has to assume someone in President Obama's reelection campaign is gleefully recording each and every attack proffered by the likes of Gingrich, Santorum and Paul. Moreover, greater ideological polarity renders consensus politics far less feasible and contributes to the partisan deadlock in which we find ourselves in Washington, D.C. One's final verdict in many cases depends upon the relative importance one assigns to each of these goals. Understanding how recent developments have shifted the contours of politics, however, at least renders us capable of recognizing why the political landscape looks so different today than it did in 1950.
Friday, February 3, 2012
Muse of the Week: Anachronism
This afternoon my students will be selecting a sticker from an envelope that will determine whether they become a 1912 election advocate of Woodrow Wilson, Democrat, or Theodore Roosevelt, Progressive (but a longtime Republican). Accordingly, the Wilson stickers are blue and the Roosevelt stickers are red... but I became curious about when this red state/blue state dichotomy emerged and found an interesting article in the NYT archive that tells the story. Turns out the pattern emerged as recently as the 2000 election; prior to this date, red and blue were used more or less interchangeably.
I was already aware the pattern didn't extend as far back as 1912, so I am intentionally engaging in a bit of historical anachronism for the sake of student clarity and the opportunity to introduce the term "anachronism" in class. Anachronism is, of course, the misplacement of ideas, beliefs or artifacts in a historical context where they do not belong. That extra wearing a wristwatch in a movie set in Roman times? Definitely an anachronism. The capri pants Beezus wears to school in my daughter's recent edition of Beverly Cleary's 1952 book Henry and Beezus? Also an anachronism (and one that drives me slightly batty).
More problematically, we tend to apply our ideas, principles and presuppositions to earlier times in an attempt to make sense of our past. We would never have done x, y or z because we would "know better." We project our beliefs about a given time period upon the period itself, failing to recognize that inherited tradition can vary drastically from past reality. Does this mean we cannot judge the serious deficiencies in our past, or use the lessons of the past to inform our present and future decisions? No, I'd argue it does not. If we understand the context of time and place, we can draw informed conclusions and avoid presentist positions. This is one of the central enterprises of historical scholarship. Careful understanding is necessary, however, if we are to avoid the pitfalls of anachronistic thinking. So, I'll pass out my stickers and prompt our discussion of the term... and I hope the lesson takes hold.
Wednesday, February 1, 2012
Webfoot Wednesday: Capitol
"In the souls of its citizens will be found the likeness of the state, which if they be unjust and tyrannical then will it reflect their vices, but if they be lovers of righteousness, confident in their liberties, so will it be clean in justice, bold in freedom."
I'm not sure where this quotation comes from originally, but I came across it the other day in a book by the late Oregon governor and U.S. senator Mark Hatfield, who reported it was chiseled into the balcony just below the doors that lead to the governor's office in Salem. Words worth bearing in mind... we get what we ask for.