
    Introduction

    The historian is of his own age, and is bound to it by the conditions of human existence.
    —E. H. Carr, What is History? (1962)

    Everyone who teaches has had moments when students do, say, write, or create something that causes us to think about teaching in new ways. Sometimes it is only with hindsight that we realize just how profound the effect was. Other times, what happens is so obvious that even if we try we cannot ignore the impact it has on us. One such moment in my career as a history teacher came several years ago in my Western Civilization course. Despite all the thinking I had been doing on how digital media were transforming student learning about the past, that day I realized I had missed a very significant change in the way my students thought about learning, about the production of historical knowledge, and about the nature of historical evidence.

    On that particular day we were winding up our discussion of the Second World War, and my goal was to spend some quality time on the war crimes tribunals at Nuremberg and Tokyo, both to demonstrate how the victorious powers had decided to handle the resolution of the war differently than they had in 1918, and to introduce my students to the ideas of human rights implicit in the indictments for crimes against humanity. I had already given them several primary sources—copies of the indictments at Nuremberg, the Universal Declaration of Human Rights—and I came to class armed with links to newsreel footage of the Nuremberg prosecutions available on YouTube. The students’ first task was to discuss the primary sources among themselves. Then we watched the video clips as a precursor to a general class discussion of the questions I had given them. In one of the clips a van pulled up in front of the courtroom and the voice-of-God narrator Ed Herlihy described the scene in prose at once triumphal and apocalyptic.[1] When the clip ended, one of my students objected to the background music, saying that it reminded him too much of some of the Nazi propaganda film clips we’d watched the previous week, largely excerpts from Leni Riefenstahl’s Triumph of the Will. Several of the students nodded agreement with him, and so we paused for a few minutes to discuss propaganda in general, how it might be similar or different across cultures, and how the makers of newsreels might be working with a limited number of possible clips on short notice. We also spent a few more minutes discussing how music changes the feel of a documentary and how documentary films—whether newsreels or otherwise—are constructed versions of reality. I was pleased with the discussion because it engaged a number of the students in the room and helped to set up some other points I planned to make toward the end of the semester about media and historical knowledge. In short, I left class that day feeling like it had been a good day.

    The following class session was not at all what I expected. My plan for the day was to work on our analysis of the beginning of the Cold War and the first stages of European integration. Instead, I was knocked off course even before class began. One of my students came up to me while I was arranging my laptop and told me he had “fixed” the Nuremberg video we watched during the previous class session. Fixed? When I asked what he meant by “fixed” he handed me his thumb drive and told me to start with the first file in the folder marked Nuremberg. So, once everyone had settled themselves, I told the class what was going on and launched the first file I found. It was the same Universal Newsreels video we had watched in the prior class, but my student had stripped out much of the music track and substituted new background music. As soon as we heard the ominous bass notes from the movie Jaws we all chuckled at his joke. Then he told me to open the second file. This time he had replaced the triumphalist music of the original with passages from Mozart’s Requiem. As he then explained, Mozart’s music was much more appropriate to the seriousness of the situation being shown in the film and so, “From now on, Professor Kelly, you should use my version.” Not surprisingly, I responded that as much as I might prefer his remix, it wasn’t the original source. He shrugged his shoulders and said, “Yeah, but mine’s better.” When I saw that perhaps half the class was on his side, I gave up on the Cold War and European integration and spent the rest of class in a vain attempt to win the class back over to my side of the historian’s fence. The vast majority of the students agreed with me that original sources were original sources and that, in general, they were to be preferred to mashed-up or remixed sources. But even after a very animated discussion of historical evidence, a significant number—perhaps as many as half—still felt that his version was better and that I should probably use it from now on.

    For more than a decade I have been making the not-especially-original argument that digital technology—particularly, but not limited to, the Internet—is transforming the ways in which students are learning about the past.[2] But the more I have thought about what went on in that Western Civilization class several years ago, the more I have come to realize that something much bigger and more consequential has already happened. Moreover, I am convinced that the future of history teaching depends on our ability and willingness to accommodate ourselves to the rapidly accelerating, technology-driven cycle of change that is transforming the teaching, learning, research, and production of historical knowledge. For more than a century, historians have been able to shrug off demands for changes in how we teach our subject, and most of us have remained stubbornly ignorant of the history of teaching and learning in our discipline. Unfortunately, no matter what we might like to believe, from the end of the Second World War until the late 1990s there really was almost no significant innovation in the methods of history teaching. Teaching history through primary sources rather than through textbooks? That “innovation” dates from the last two decades of the nineteenth century.[3] How about “problem-based learning”? Alas for us, that “innovation”—all the rage at the moment—first appeared in history classrooms in the first decade of the twentieth century.[4] To be sure, we have been very innovative when it comes to the topics in history we study and teach about, but when it comes to teaching methods in history, until recently there hasn’t been much new under the sun. As the example of my student’s Nuremberg remix indicates, we should be very worried that we are losing the rising generation of students because our approach to the past seems increasingly out of sync with their heavily intermediated lives.

    Let’s be clear—my student’s remix of that newsreel was not just a playful approach to the past. It was also concrete evidence of a way of thinking about the nature of evidence and about how evidence can and should be used to make sense of past events. As I first wrote these words in the spring of 2010, the novel Axolotl Roadkill by seventeen-year-old German author Helene Hegemann sat in the number-two position on the hardcover fiction best-seller list of the magazine Der Spiegel. Much to the outrage of critics (most of whom were significantly older than seventeen), Hegemann freely admitted lifting substantial portions of her book from the work of other authors without any attribution. Hegemann called this remix of other authors’ work legitimate because, as she said in a formal statement via her publisher, “There’s no such thing as originality anyway, just authenticity.”[5] Following her line of argument, the remixed version of that Nuremberg newsreel was a more authentic source, at least in my student’s eyes, which helps to explain why I had such a difficult time convincing the class that I should not use it when teaching about Nuremberg.

    My student was making history out of factual evidence in ways that a number of prominent historians over the years have advocated.[6] To be sure, he was altering a primary source to make a point about the past, but it is worth considering two things: To what degree was his alteration of the source substantially different from, say, a historian’s decision to crop an image so it will fit neatly into the point he or she is trying to make in class? And was his decision not much closer to photographer Roger Fenton’s staging of photographs taken in the Valley of the Shadow of Death during the Crimean War in late April 1855, or to Alexander Gardner’s similarly staged photographs from the American Civil War?[7] History abounds with fakery and forgeries like Fenton’s and Gardner’s, and one of the tasks of the historian is to uncover such alterations of the historical record where possible. But history also abounds with a more subtle problem—facts played up or played down by storytellers, chroniclers, journalists, and historians to make a point they want to make. I submit that my student altered that source to make a historical argument—something we lament the absence of in so much of our students’ work—and while I wish he could have made the argument without altering a source, I also recognize that his act of history making lies somewhere between the deliberate forgeries of Fenton and Gardner and the severe injunctions of Leopold von Ranke demanding that history be told as it actually was. One of the main purposes of this book is to explore the gray areas that acts like my student’s open up, in hopes of helping us think about what history may become in the digital age.

    What then is a historian to do in the face of students who may be more interested in authenticity than originality? First and foremost we have to set aside our squeamishness, if only so we can examine those feelings for what they are. I will admit to having had to force myself to do just that over the past several years. After all, I am a firm believer that history is built upon a foundation of evidence—evidence drawn from primary sources in as close to their original state as can be accessed. Any remixing of those sources makes me more than a little squeamish: it makes me downright uncomfortable, just as I imagine many art critics in Vienna felt when Gustav Klimt unveiled his Medicine mural in the Assembly Hall of the University of Vienna more than 100 years ago. Klimt’s work was so far outside their understanding of what constituted art or beauty that most of those critics had difficulty finding a way to describe the work and simply rejected it out of hand, with many decrying it as an obscenity. While we do not, or at least should not, expect our students to establish new ways of making sense of the past that are as groundbreaking as Klimt’s work, it is incumbent on us to give them enough free rein to experiment, and to accept the results of those experiments as worthy of consideration as history. In fact, one of the main arguments of this book is that by giving students the freedom to experiment, to play with the past in new and creative ways, whether using digital media or not, we not only open ourselves up to the possibility that they will do worthy and interesting historical work, but we also create the conditions for significant learning gains. When students work on topics they are interested in, in ways that make sense to them, their engagement rises, not only with the assignment itself but also with the fundamental historical assumptions the assignment raises.

    I am not arguing that students should be free to do whatever they want, however they want—quite the contrary, in fact. I am, however, arguing that by structuring learning opportunities that address fundamental historical problems and give students enough free rein to take real ownership of their work, we open ourselves (and them) up to the possibility that much more can happen in our courses than the development of the most basic skills of historical analysis. At the same time, I argue that we do not have a great deal of time when it comes to making the transition to new ways of teaching and learning that are grounded in the potentialities of digital media. Thomas Kuhn introduced us to the idea that when existing and accepted paradigms no longer suffice to answer pressing scientific questions, first a crisis and then a revolution occurs, leading to new ways of thinking about old problems.[8] Historians are more fortunate than physicists, because we are experiencing no such obvious crisis. In fact, as a discipline, we seem fairly well pleased with ourselves when it comes to the state of historical research and analysis, and many of us remain generally dismissive of the value of new media technologies for the teaching and learning of our discipline.[9] But we ignore the revolution going on all around us at our peril.

    While Helene Hegemann’s notions of originality and authenticity might seem easy to dismiss as a passing fad of the young, it is not so easy to dismiss the work of award-winning Canadian environmental and digital historian William Turkel on “interactive ambient and tangible devices for knowledge mobilization.” Turkel argues that “As academic researchers we have tended to emphasize opportunities for dissemination that require our audience to be passive, focused and isolated from one another and from their surroundings. We need to supplement that model by building some of our research findings into communicative devices that are transparently easy to use, provide ambient feedback, and are closely coupled with the surrounding environment.”[10] Turkel, the historian’s ambassador to the “maker” movement, further advocates the use of new digital devices to fabricate objects from the past in real time as a way to give students access to the three-dimensional look and feel of historical objects.[11] In other words, in Turkel’s view, historical knowledge and analysis can become tactile, not as a replacement for other forms of the representation of knowledge, but as another way to give students of history access to insights about the past. For example, historians and art historians have written many books and articles about the graffiti decorating buildings and other structures around the world over the centuries. Students of the past can view those images on the page or the screen and can read the historian’s analysis of the images; the cultures within which they were produced; and the biographies of the artists, if the artists are known. At a conference in 2010, Turkel and I used a digital camera, off-the-shelf image-manipulation software, and a device called a Craft ROBO to reproduce a graffiti stencil I photographed on a street corner in Vienna, Austria, in 2008.[12] With the stencil we made and a can of spray paint, we could have gone around town tagging buildings with that Austrian stencil (though we did not). We would not have been re-creating the historical object I photographed in 2008, but we would have been reenacting, in an authentic way, the process by which that Austrian stencil was used by whoever tagged the building I photographed two years earlier, thereby at least opening up the possibility that we might have gained some new or different insights into what it was like to be a public artist in the Austrian capital. Of late, cognitive psychologists have called into question the empirical basis for claims that students have different “learning styles,” but those same studies do point to strong evidence for the learning gains that accrue when students encounter evidence, problems, and analysis from multiple perspectives.[13] Had Turkel and I gone about tagging local buildings with the stencil we created, the tactile nature of that experience would certainly have fallen into the category of a different perspective on the past.

    You would be well within your rights to think that Turkel’s tactile approaches to the past are a long way from writing a book or a scholarly article. He and those working with him represent just one variant of serious historical investigation that bears almost no resemblance to the work we have done for more than a century. “Interactive ambient and tangible devices for knowledge mobilization” have almost nothing to do with the forms of historical scholarship we have grown comfortable with—or even with primary sources as we know them. Turkel is not alone. My colleague Dan Cohen, director of the Center for History and New Media, recently launched a new kind of historical journal. Digital Humanities Now uses an algorithm to scrape content from the Internet (blogs, websites, social media), and editors then decide which items to feature on the journal’s home page. Content gathered by the algorithm includes blog posts, updates to historical wikis, new content from selected Twitter feeds, and other forms of rapidly changing information about the digital humanities. Because the content on the home page changes daily (and more often, in the case of the river of unfiltered content also summarized on the site), readers get a real-time view of what is happening in the digital humanities.[14] The fact that serious historians like Turkel and Cohen—among others—are doing this sort of work is a harbinger of the sort of change we can expect in our discipline. If new media are changing our discipline, then how can the teaching and learning of our discipline not change as well? In his essay “Historical Thinking and Other Unnatural Acts,” Sam Wineburg argues that “the essence of achieving mature historical thought rests precisely on our ability to navigate the jagged landscape of history, to traverse the terrain that lies between the poles of familiarity with and distance from the past.” I submit that somewhere between Leopold von Ranke and Helene Hegemann lies a similarly jagged landscape of history, and scholars like Turkel and his colleagues in the Lab for Humanistic Fabrication, Cohen, and my colleagues at the Center for History and New Media will be the ones to help us traverse that landscape. My hope is that this book will help readers negotiate those parts of that landscape that have to do with teaching and learning.
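    For readers curious about the mechanics, the short Python sketch below illustrates the aggregate-then-curate pattern that powers a publication like Digital Humanities Now: a program gathers candidate items from many feeds, and human editors make the final selection. It is only a minimal illustration under assumed inputs; the feed URLs and keyword filter are hypothetical placeholders, not the journal’s actual sources or algorithm.

```python
# A minimal sketch of an aggregate-then-curate workflow of the kind
# described above. All sources and keywords here are hypothetical
# placeholders, not Digital Humanities Now's real configuration.
# Requires the third-party "feedparser" library.
import feedparser

FEEDS = [
    "https://example.org/dh-blog/feed",      # hypothetical blog feed
    "https://example.net/history-wiki/rss",  # hypothetical wiki feed
]
KEYWORDS = {"digital humanities", "archives", "text mining"}

def gather_candidates(feed_urls, keywords):
    """Collect entries whose title or summary mentions a keyword."""
    candidates = []
    for url in feed_urls:
        for entry in feedparser.parse(url).entries:
            text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
            if any(kw in text for kw in keywords):
                candidates.append((entry.get("title", "(untitled)"), entry.get("link", "")))
    return candidates

if __name__ == "__main__":
    # The algorithm only proposes; editors choose what to feature.
    for title, link in gather_candidates(FEEDS, KEYWORDS):
        print(f"{title} -> {link}")
```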

    The task I have set for myself is a bit daunting, especially given how entrenched notions of how history ought to be taught are among those who teach it. One reason these notions are so powerful is that for more than 100 years historians have been teaching their courses in much the same way.[15] The typical high school or college history class is dominated by lectures aimed at imparting a mix of facts and analysis to students who are expected to dutifully listen, take notes, study that information, and then demonstrate their mastery of the material either in essays (if the class is small enough) or in exams. History is not alone as a discipline that relies upon lectures as the primary mode of instruction. We are also not alone in ignoring the fact—demonstrated again and again in studies of student cognition—that lecturing to/at students is among the worst possible ways to teach them anything.[16] Even in lecture courses carefully designed to maximize student recall of factual information, most students retain only about 20 percent of what was taught to them in lectures.[17] Moreover, after twenty minutes of being lectured to, most students report that their minds have wandered at least once from the subject at hand (and this finding comes from before the days when students brought laptops, cell phones, and iPods to class). Even in those classes where time is set aside for discussion on a regular basis, researchers who study such things find that the majority of questions asked by instructors across the disciplines focus on the recall of factual information. Recalling factual information on an exam is not, by any definition, the kind of real learning that leads to higher order thinking about complex ideas, nor is it in any way a sign of what we like to call “historical thinking.” Writing about the past is one way students acquire and demonstrate the higher order thinking we are hoping to teach. The analytical skills students demonstrate in writing one five-page paper after another are not to be scoffed at, and they are, moreover, skills that employers value. However, analytical writing is only one of the many ways students can advance both their knowledge of the past and their analytical skills.

    One reason historians seem to feel it is so necessary to present students with so much factual information is that we know in our hearts that students cannot be expected to engage in sophisticated analysis of historical events unless they know what those historical events actually were. Because most college history curricula have dispensed with prerequisites for most courses, we can hardly assume that students arrive in class on day one knowing anything about the subject of our courses, and so we feel honor bound to start somewhere near the beginning of our subject—not all the way back to hominids wandering out of Olduvai Gorge—but back a good way nevertheless, so that our students will have some sense of what led up to the events that will be the focus of the rest of the semester. But, because those prior events are less central to the main subject of the course, we often knock them all off in a couple of lectures. Imagine trying to take differential equations without first having taken calculus, but having a nice professor who spends the first week reviewing algebra, geometry, and then calculus before diving into the heavy lifting of the rest of the semester—that is what that first week of rapid review of the prior century (or three) must seem like to many of our students. Once we have told them what happened before the course began, then we make sure to tell them what happened during the time frame of the course itself. The time constraints of the ten-week quarter or the fourteen-week semester mean that even in the smallest class of students, efficiency seems to dictate a certain amount of lecturing—or, as we often put it, “covering” the main events. But as Lendol Calder so cogently pointed out several years ago, “cover” can also mean to obscure or hide from view.[18] Thus, if we want to uncover what is really important in our courses, it seems clear that we need to give up on lecturing as the primary mode of historical instruction. How might that be possible in classes with 50, 100, 200, or even 500 students?[19] As we will see, digital technology offers us a way forward that makes it possible for our students to uncover important insights, no matter how many other students there are in our courses. It is worth noting that students themselves are aware that listening to lectures and taking notes are not the best ways to learn. Is it any wonder, then, that at a moment when they can suddenly access more information about any topic than they can possibly use or make sense of, more and more students have lost patience with us and our teaching methods and have either shut down—choosing the path of least resistance to the grade they want—or have begun to make sense of the past in ways that seem as foreign to us as the remix of the Nuremberg newsreel did to me?

    It is likely that even if you agree with some of my arguments, you may be thinking, “Ah, but his critique doesn’t apply to me.” After all, you may lecture no more than a few times in an entire semester, and your classes may be built around a series of learning exercises that emphasize active learning, community-engaged learning, problem-based learning, or other teaching methods demonstrated to engender the kinds of historical thinking almost all of us say we strive for with our students. If that is the case, you are in a very small minority. Study after study turns up the same result: between 75 and 90 percent of college instructors in courses not designated as seminars rely upon lecturing as the primary mode of instruction. While most historians I know claim that they make the analysis of primary source materials a central feature of their courses, a relatively recent analysis of college history syllabi by Dan Cohen indicates that in introductory American history surveys, a substantial fraction of college faculty assign no book other than the textbook, and that only a small number assign the primary source reader tied to the textbook.[20] Amazingly, at a time (2005) when millions of primary sources in American history were already online from reputable organizations such as the Library of Congress and the National Archives, only 6 percent of the 792 syllabi Cohen included in his study offered students links to online primary sources. A more recent study (2010) by Robert Townsend of the American Historical Association (AHA) indicates that in the five years since Cohen’s article appeared, still fewer than half of the more than 4,000 teaching historians responding to an AHA survey regularly use online sources in their classes.[21] Anyone who has taught history in the past decade knows that the first, and often the only, place students of any age look for primary sources is online. When students look online almost exclusively, and fewer than half of their professors point them to online resources, we see another reason why students and their instructors are proceeding into the past on rapidly diverging tracks. No wonder students are teaching themselves what to do with those sources.

    In 2000, it was possible for a scholar like Sarah Horton to argue that “although moving your course materials onto the Web may not shake the foundations of Learning [sic], it is the first step to devising a Web teaching method.”[22] Perhaps in 2000 it was also possible to write an entire book on “web teaching” that only “touches” (her word) on the effectiveness of using digital media to teach. That is not the case any longer. But in 2000, “the Web” was mostly about the availability of images and text. The world of the World Wide Web has changed radically in the past decade—and not only because we now refer to it all simply as the Internet. When Horton was writing about how to teach with Internet resources, the resources she was talking about were websites created either by what we now call “legacy institutions”—that is, museums, libraries, and archives that pushed lots of content onto the Internet for users to view—or by teachers who likewise pushed content online or created teaching exercises from that content for students to use. The most interactive websites in 2000 were those that offered users access to discussion forums or, in rare cases, chat rooms where various topics could be discussed. But only a tiny fraction of Internet users had ever created content for the web beyond contributing to a discussion forum. In 2000, creating web content still required a fair amount of technical skill, and the term “social network” had a completely different meaning than it does today.

    By contrast, the young people arriving on our campuses this fall have been creating content online for as long as they can remember. According to a Pew Research Center study published in February 2010, 75 percent of Americans between the ages of 18 and 29—the “Millennials”—have created a personal profile on a social networking site such as Facebook, 62 percent have accessed the Internet away from home via a wireless connection, and one in five has posted video of themselves online on a site like YouTube. When the 18–29 cohort is broken down into subgroups of 18–24 and 25–29 year olds, the percentage of those using social media rises to 81 percent.[23] College students are even more aggressive adopters of Internet sites where the user creates the content rather than simply consuming it; they use the Internet in active, not passive, ways. In the fall of 2005, 85 percent of freshmen at the University of North Carolina-Chapel Hill had a Facebook account at the beginning of the semester, and by the end of their first semester 94 percent had such an account.[24] It is worth noting that in 2005 the use of such social media by students was still relatively new. It seems safe to assume, therefore, that by 2010, when the number of Facebook users worldwide had surpassed 700 million, the percentage of incoming freshmen who already had a profile on one or more social networking sites was substantially greater than the 85 percent found at UNC five years earlier. By contrast, the Pew Research Center report found that in the next generational cohort—the so-called Gen X, now 30–45 years old—only 50 percent had an online profile on a social networking site, and only 6 percent had posted video of themselves online.

    As these data make abundantly clear, not only is the Internet of 2012 radically different from the Internet of 2000, but more importantly, students’ use of digital media is substantially different. They still consume a great deal of online content, but just as important, if not more so, they are aggressive creators of online content as well. As a recent report from the Massachusetts Institute of Technology (MIT) on young people and technology argues, “The growing availability of digital media-production tools, combined with sites where young people can post and discuss media works, has created a new media ecology that supports everyday media creation and sharing for kids engaged in creative production.”[25] Thus, the second central argument of this book is that any use of digital media for teaching and learning that does not take into account this shift from consumer to creator is problematic from the start. Throughout this book, I suggest various ways we can capitalize on this creative impulse of our students to make the past more exciting and more relevant to them, not only in the classes they are taking, but also in the lives they have planned for themselves. By structuring our teaching and their learning about the past around the ways digital technology now promotes active engagement with, rather than passive acquisition and reading of, historical content, we will be creating learning opportunities for our students that have a much higher likelihood of producing the learning gains we hope for when we teach. Instead of asking them to sit, listen, and record what we say—a teaching strategy that cognitive science has demonstrated quite conclusively to be unproductive—we can now ask our students to do what we do: make history out of the raw material of the past.

    The goal of this book is to challenge historians, but also others teaching in the humanities and social sciences, to think carefully about the ways digital media are changing teaching and learning in our fields in the face of the changes described earlier. At its most challenging, this book considers how the remix culture developing around and through new media is making it possible for our students (and us) to produce either new knowledge about the past, or old knowledge presented in new ways. Even though we may not be able to anticipate the results of our students’ work in the digital age, it remains incumbent upon us to guide them through the past, and through the ways digital technology might be used to understand and represent the past. After all, the values of the professional historian do not change just because the medium changes. To help with that task, chapter 1 provides an overview of several decades’ worth of research on how students learn about the past, which sets the stage for a discussion and analysis of how students search for, and find, historical content. Subsequent chapters consider how students might actually analyze historical sources that now rain down on them, not by the dozens or hundreds, but by the millions or hundreds of millions—a problem of abundance that will only grow with each passing year. Once they have analyzed the historical data they acquire, our students have to do something with that data, and so the last portion of this book considers ways in which students can create, and slowly but surely already are creating, new forms of historical knowledge. It is always risky for historians to write about the future—after all, we still know far too little about the past—but my hope is that, challenged to think hard about the future of teaching and learning in the digital age, each reader will find at least one new way to think about both the past and the future of our discipline. Moreover, I hope to convince the reader that my two central arguments—that we should use digital media to create active learning opportunities in which our students create content online, and that we should be open to the surprising results our students may come up with when they create that content—are worth taking seriously.