One of Several Different Ways
On May 31, 2010, Israeli naval commandos rappelled from helicopters onto a series of boats in an enemy flotilla that was attempting to run the blockade off Gaza. Provoked, Jerusalem had no choice but to respond and interdict the flotilla. Met with hostile resistance as they boarded the boats, the Israeli troops responded in kind, and neutralized the terrorist threat.

Or: On May 31, 2010, a band of Jewish thugs murdered several innocent protesters who were on a mission of mercy to the blighted Gaza Strip. Faced with an attempt to persuade the world of the injustices and cruelty being perpetrated on the innocent people of Palestine, Israel proved that it could not tolerate even peaceful protest, and violated its own principles of free speech by slaughtering those attempting to exercise their rights.
But, how about we phrase it this way: On May 31, 2010, a bunch of people were killed and injured on boats in the Mediterranean. Two parties, clearly at odds with each other, both overreacted and some people died because of it.
Nobody wins.
Sadly, time will heal little, and temporal distance from the Gaza flotilla incident will do even less to clarify what happened and why. Who is correct in their interpretation of history?
***
Today, there is no single agreed-upon history from which to gauge correct accounts of political events. Facts are debatable. Ignorance and willful denial can coexist in a single narrative. Conspiracy theories and epistemic alternate realities (or, to use the recent turn of phrase, a certain “epistemic closure”) run rampant and unchecked. Cultural differences in conceptualizing time even play a part. And all of this assumes there is an active desire and search for truth; many news consumers now cope with a world in which shoving their collective past down the “memory hole” is de rigueur.

Contentious historical figures make natural sites of disagreement. Take, for example, the case of the late United States President Ronald Reagan. Reviled by the American left for policies foreign and domestic, yet the subject of a conservative proposal to put him on the dime in place of Franklin Roosevelt for his perceived role in ending the Cold War, Reagan is about as polarizing a figure as one can find in our time. Yet while people disagree over whether his actions were a net positive, to deny the profound historical legacy of Ronald Reagan—he who “gave rise to a new generation of conservatives, reshaped the Republican Party, challenged Democrats to redefine themselves and altered the political dynamic in the nation's capital” (CNN)—would be folly.
Reagan’s actual political maneuvers are at the core of modern Republican veneration for the “Gipper.” Yet that same party channels his political thrust in an attempt to enact policies that he long ago already succeeded in making law. As Nate Silver points out:
What's remarkable is how we still hear the same core arguments about the role and functions of government—and how the policy-specific debates over matters like offshore drilling persist as well. And yet here we are, 30 years later, and the tax burden is at its lowest since 1950, the regulatory state has been cowed if not captured by the industries it is supposed to oversee, and America stands as the world's lone remaining superpower.
Selective memory is a powerful tool. So is interpreting the outcome of historical events. Reagan was a powerful force for deregulation, but whether such measures opened up competition and made for a freer marketplace, or led to the Deepwater Horizon spill and accelerated global warming, remains hotly contested.

***
But as far as historical figures and policies go, Reaganite deregulation is relatively benign. Let’s briefly change our focus from parsing civilized partisanship to examining my generation’s memory failures about the realest manifestation of political evil ever to befoul the earth.

In India, Adolf Hitler merchandise is selling at a rather alarming pace. Because naturally, his “patriotism” and “discipline” are qualities worth emulating. Unfortunately, despite our best efforts, the 65 years since Hitler’s death have dulled his memory rather drastically, not least for Generation Y. Says 19-year-old Prayag Thakkar to the BBC: “I have idolized Hitler ever since I have had a sense of history. I admire his leadership qualities and his discipline.”
How is it that the historical record cannot emphasize certain more prominent attributes of Hitler’s life over these? Whether due to a shoddily written textbook or just plain misinterpretation, the fact that anyone can see the case of Adolf Hitler in a way that doesn’t just acknowledge the genocidal dictator’s minute positive qualities, but accentuates them, is rather terrifying. Sure, Mussolini made the trains run on time (though actually, he didn’t), but that doesn’t mean Italians want to ogle his granddaughter in Playboy and then elect her to Il Parlamento. (Bad example?)
***
So we can blame partisanship, bad pedagogy or sheer ignorance for lapses in historical truth-telling. But in my cohort, I see a more ubiquitous risk in the unthinking, unquestioning acceptance of news sources as gospel. If you put too much faith in what MSNBC, or the Huffington Post, or Fox News reports, then you’re locked into a particular agenda. We’re almost all guilty of revisiting the same sources that confirm our preexisting notions, rather than challenging them. That doesn’t just apply to news. Even the historians you read determine not only your opinions, but your very knowledge. Are you a Victor Davis Hanson type, or more of a Howard Zinn guy?

Look at how our generation—particularly the trailing edge—approaches academic integrity and plagiarism. Stanley Fish recently made the case for considering plagiarism a minor sin (“not a big moral deal”), recognizing that sole authorship is an ideal best confined to the narrow halls of the academy. Some teachers have begun to regret past punishments of cheaters, and now weigh whether plagiarism accusations might better serve as “teaching moments.”
The New York Times seems to pin the blame on our millennial culture of mashups and remixes, claiming that it is “a disconnect that is growing in the Internet age as concepts of intellectual property, copyright and originality are under assault in the unbridled exchange of online information.” Though the article seems to fall into that classic Grey Lady trap of “kidz r stupid,” one student at least offers a creative defense for copying passages directly from Wikipedia without citation: that they “did not need to be credited since they counted, essentially, as common knowledge.” If nothing else, maybe our intellectual laziness still allows for some creativity.
But this does speak to some serious concerns about how we approach history. The first is the inherent danger of “crowd-sourcing” everything. Wikipedia is a constant victim of its editors’ biases—some of the most ridiculous examples of virulent “edit wars” are preserved in a list of the “lamest.” If you unquestioningly accept Wikipedia as fact, then your understanding of a given subject depends entirely on the exact moment you viewed a page.
For example, between March 26 and April 3, 2009, the article on poor Pope Benedict XVI saw over 30 edits disagreeing over nothing more than his title: Bishop of Rome, or Supreme Pontiff? Eventually, resolution came as the Vicar of Christ. Other ridiculous subjects include hummus as a Zionist plot, the ethnicity of porn star Raven Riley and whether “yogurt” or “yoghurt” is the correct spelling (in case you were wondering, Microsoft Word considers both acceptable).
If we can’t agree on facts this trivial, then what hope do we have of resolving actual issues that inform our understanding of the world?
A distinction should be drawn between things we don’t know or think we know but don’t, and those that we willfully (yet subconsciously) reject as fact. In the former category would be the interpretation of history and even current events—whether the New Deal ended the Depression, or whether the Troubled Asset Relief Program really did stave off an even worse economic crisis than the one that hit—and common misconceptions, like Napoleon Bonaparte being particularly short (he was 5’ 6”) or Thomas Crapper inventing the flush toilet (it was 16th-century author Sir John Harington, all of 240 years before Crapper was born). Thankfully, Wikipedia also provides us with a list of common misconceptions to redirect us away from our poor misguided notions.
***
Take the Vietnam War, for example. Americans my age are familiar with a general collective conception of this war, as portrayed countless times in films such as Apocalypse Now, Platoon and Full Metal Jacket. Of course, these movies were based on source texts such as Michael Herr’s Dispatches (Herr co-wrote both Apocalypse Now and Full Metal Jacket). Herr was a correspondent in Vietnam during the war, and his psychedelic portrayal of a nightmare world of jungle combat seems to ring true—but it should not be the authoritative version it has become.

Herr portrays a world gone mad, a world in which values mean nothing and the only meaningful measurement is McNamara’s body count. One of his strongest themes is the powerlessness of the grunts in the field, each of whom wants nothing more than for his tour to end so he can get home. As Herr characterizes the helplessness, “then and there, everyone was just trying to get through it, existential crunch, no atheists in foxholes like you wouldn’t believe.” Herr goes on to describe the nameless, omnipresent fear:
You could make all the ritual moves… The Inscrutable Immutable was still out there, and you kept on or not at its pitiless discretion. All you could say that wasn’t fundamentally lame was something like ‘he who bites it this day is safe from the next,’ and that was exactly what nobody wanted to hear.
Herr’s war is one of indeterminable purpose, where the ultimate goal of the man on the ground is simply to continue living. No wonder Oliver Stone is back in vogue.

The constant drug use and acid rock of Herr’s Vietnam is in many ways a reflection of then-contemporary American society. Timothy Leary’s exhortation to “turn on, tune in, drop out” and its subsequent misinterpretation as “get stoned and stop contributing to society” seemed to be the two countervailing attitudes of the day. Even more than the soldiers’ drug and alcohol use, Herr constantly references that of the press corps: “What bad, dope-smoking cats we all were. We were probably less stoned than the drinkers in our presence, and our livers were holding up.” Somehow this personal experience of the war transmogrified into the entire war, even though most Vietnam narratives have nothing to say about drugs. Samuel Hynes surmises that “it was a figure of speech for the war, as seen by a skeptical reporter, and not a statistical reality.” Yet the stoned, trippy vision of the Vietnam War persists to this day. The despair of war is nothing new to fighting men—but in a conflict where victory was indefinable and the motives unclear, it would be difficult indeed to understand why one was going to war.
Perhaps it’s the nature of either literature or the American educational system, but you don’t really see novels that portray any war as a sane, orderly, manageable conflict. Even World War II can be portrayed with just as much lunacy and absurdity as the Vietnam War. The world of Joseph Heller’s Catch-22 is a great example of this timeless trope—an intersection of rampant capitalism, senseless violence and the “chickenshit” Catch-22 itself.
One of the doctors in the book explains to Captain John Yossarian: “As far as we’re concerned, one dying boy is as good as another, or just as bad. To a scientist, all dying boys are equal.” Yossarian himself is not reluctant to point out the absurdity of his situation. As he explains to Clevinger, “You are talking about winning the war, and I am talking about winning the war and keeping alive… It doesn’t make a damned bit of difference who wins the war to someone who’s dead.”
Few would contend that World War II was fought for no discernible purpose. In the eyes of Yossarian (and therefore, Heller, at least to some degree), his war was being fought for no purpose other than to end his life. This “meaning” is a profoundly personal interpretation, however. It does not reflect general consensus on the purpose of the war, but only one man’s perspective. The same is true of Vietnam and Michael Herr—the only difference being that Herr’s version is the general consensus.
Why did this catch on, though? Historical records, and indeed the personal memoirs of the men who fought in Vietnam, are virtually all diametrically opposed to Herr’s vision. Hynes offers several telling statistics. Chief among these is the fact that most men joined out of patriotic conviction. They chose not to evade the draft, and like their fathers who had served with distinction in the Second World War, they shipped off to Vietnam with the convictions of men who loved their country. Even after the brutal disillusionment that began around 1968, Vietnam veterans were not forever sullied by their service. Hynes cites the figures: “71 percent of the Vietnam veterans polled were glad they went to Vietnam, 74 percent claimed to have enjoyed their war, and 66 percent would be willing to serve again.” Clearly, the truth that Herr purports to offer is at odds with the recollections of the men who were there. It is certainly possible that there are dark truths and memories that Vietnam veterans would prefer not to dredge up, but the sheer volume of contradictory memoirs would seem to put Herr into a corner.
So how did America adopt the “Myth of the Bad War”? Americans are not altogether stupid. One would like to assume that as a people, we can distinguish fact from fiction. Perhaps, though, the problem is that the “War Gone Wrong” does not translate onto the screen as well as the “Nightmare War.” It is visual accounts that Americans draw on to form their collective memory. Richard Donner’s The Goonies has an elucidating moment when the protagonists stumble across the bottom of a wishing well; that is to say, they end up where the coins do. As they gradually realize that the coins are not in fact pirate doubloons, one character begins to read off the names of the presidents depicted: “‘Abe Lincoln… George Washington… Martin Sheen.’ ‘Martin Sheen? That’s President Kennedy, you idiot!’ ‘Yeah, well he played him once…’” Such a substitution of fiction for reality would actually seem to be the norm with the Vietnam War, and indeed, much of contemporary American historical memory.
***
The undue influence of fiction on our interpretations of the past cuts two ways. When we are curious about a given subject, we are driven to seek out more information on it; what we find instead is misinformation, rumor and urban legend. It is all the more unfortunate that our modern information overload is likely to mislead those in pursuit of the truth, making a passion for history and fact something of a liability in the digital age. The other means by which we twist history, though, is a passive one: the accidental learning that stems from consuming other media with a historical setting.

World War II is an ever-popular setting for some of the best-selling video games of the past few years. The black-and-white nature of Nazis (well, maybe not so black-and-white) makes them the perfect villain, and the United States’ role in winning the war is as popular a mythology as can be imagined. Yet only a few video games pay more than lip service to the Eastern Front, the colossal undertaking that consumed three-quarters of all fighting forces in Europe and cost the lives of over twenty-five million Soviet citizens. Still, it is a good thing that at least some do—another teachable moment. But what the players of these games might learn is drastically at odds with the truth, because their only exposure becomes, by default, their truth.
In the original Call of Duty, the developers insisted on the pervasive myth of the all-powerful commissar, bent on eliminating kind-hearted peasant soldier-boys. It is true that with Stalin’s Order No. 227, calling for “panic-mongers and cowards [to] be destroyed on the spot,” executions of deserters and “cowards” increased. This, however, was a response to an actual threat; desperate times called for desperate measures. Soldiers interpreted the order differently from the 70-years-removed audience of today; one recalled that it was “not the letter, but the spirit and content of the order [that] made possible the moral, psychological, and spiritual breakthrough in the hearts and minds of those to whom it was read.” The myth of the omnipotent, omnipresent commissar was just that: a myth. In October 1942, the system of dual command was abolished, and with it the commissars, who became morale and welfare officers. Call of Duty nonetheless imagines the commissar as the sole arbiter of Soviet military justice; not once does an NCO or officer carry out the will of the Party.
It’s a minor quibble, sure, but it’s representative of much larger truths concealed by what happens in a popular historical video game. No one is claiming that the PlayStation should replace the professor, but at the same time, there’s no reason not to strive for historical accuracy wherever possible. Influential media needs to recognize its own influence and act accordingly.
Selective memory is nothing new. Novel is only the casualness with which we treat our own approach to facts. The Soviet Union had some experience with twisting history in an official capacity. In his (shamefully) out-of-print The Commissar Vanishes, David King puts together a fantastic collection of Soviet-era photographs featuring some “creative” retouching. Targeted cropping, primitive erasing and crude scribbling-out are all used to eliminate even the memory of those whom the regime designated undesirable. But in the twenty-first century, an expunging as literal as that is a relic of the past. Our problem with history is not that it will be lost, but rather that there will be no coherent narrative, no single history to draw upon. Think less the 1984 of oppressive redaction, and more the overload of Brave New World.
Maybe there’s an explanation for our casual treatment, if not outright dismissal, of history. We are living in a “bridge era.” As much as the defining characteristic of the interwar years was the fact that they were in-between wars, our time has seen little in the way of world-changing history. Like the 1970s, the first few post-Cold War decades are perhaps best forgotten. Much as we may intend to write history ourselves, it probably will not be kind to us. Joseph Fouche predicts a depressing future for this generation:
When students of 2050 think of our time (making the massive assumption that they will bother to think about our time), their thoughts would be more along the line of why didn’t Grandpa push Future Hitler under a bus when Grandpa had the chance. Our time is merely a connector between one vaguely memorable event (the death of Elvis) to another (the death of Michael Jackson). Events of the following decades will cast larger shadows than these small times with their small wars, small trivia, and small men. We’ll be the people in the gaps and accordingly will fall through them.
Our time and our actions will have an impact on the future, but I doubt the future will bother to send a thank-you note.

One more question, then: do we dismiss our memory because we live in unmemorable times, or are our times so unmemorable because we have no clear picture of the past? It doesn’t matter what the answer is; regardless of how you choose to interpret this particular slice of time, it doesn’t look good. Not when compared with the past, not when compared with the possible present, and certainly not in light of posterity. History and the past are more important than ever now, if only because they represent a far more appealing alternative to the world of today.
