Over the weekend I continued my Anxious Bench series on the challenges of writing biographies by reflecting on the problem of historical evidence. While the biographer whose book I’m currently reading seems to have enough evidence to narrate his subject’s entire life on a weekly (sometimes daily or even hourly) basis, I know that he is actually deploying a considerable amount of historical imagination to connect what are still fragmentary sources. And the vast majority of people historians and biographers might study leave behind hardly any evidence at all:
Indeed, the most distinctive feature of history as an academic discipline is the relative paucity of the sources available. All we’ve got to go on are whatever artifacts survive the passing of time, and most of those sources erode. Beyond supporting preservation and archival efforts (including oral history projects), there’s not much historians can do about this. We can’t retire to our laboratories and test theories by running experiment after experiment. There’s no petri dish or computer simulation in which we can reliably recreate historical change over time. And unlike our friends in the social sciences, we can’t design a survey to administer to the dead.
But what if the problem wasn’t scarcity, but abundance? “Now,” writes digital historian Ian Milligan in The Chronicle of Higher Education, “instead of too little information, we have too much.”
To illustrate, he contrasts the records of London’s Old Bailey (billed as the “largest body of records documenting the lives of non-elite people ever published”) with the evidence left behind by GeoCities, the pioneering web hosting service that was closed in 2009. The Old Bailey has preserved nearly 200,000 transcripts spanning almost 240 years; in just 15 years, 7 million GeoCities users created 186 million digital documents.
On the surface, this seems like it would be an enormous boon to historians and other scholars. (Indeed, Milligan goes so far as to state that it “would be intellectually dishonest to tackle” topics from the period starting in 1996, when web archiving began, “without turning to the web.”) He rightly acknowledges that the digital record, like the analog one, “is not entirely representative of society; considerable barriers to web access and publishing still exist along lines of race, ethnicity, class, and gender. Still, events, feelings, and ideas are now being recorded that would never before have been, and they are being left behind by the sorts of people who used to be largely absent from the historical record.” Seen in this light, technological change has given us the possibility of “a potentially more democratic form of history.”
But Milligan points to several major challenges complicating this task. First, the technical challenge of sifting through data that defies unassisted human abilities: “…historians cannot uncritically depend on ranked keyword search results. They need to help develop, craft, and make sense of these new and emerging digital tools.”
I’d also add that there are holes in the digital-archival record. As my co-writer Fletcher Warren explained, the 21st-century chapter of our digital history of Bethel at War was relatively thin because “the War on Terror occurred during the emergence of what I call the ‘digital void’ — that era in which most documentation is ‘born-digital,’ or created on a computer and perhaps never transitioning to hard copy. Such documents usually ‘die-digital’ as well, and are thus irrevocably lost to future historians.”
But even assuming that the technical issues are overcome, ethical ones remain. As much as in an earlier, analog age, historians retain enormous power to make public meaning of the private lives of people who didn’t expect such scrutiny. (Milligan suggests that oral historians, “who have long operated outside of formal archives and instead in the living rooms and workplaces of their subjects, may offer a useful model for scholars using web archives.”)
Read the full article (if you have a Chronicle subscription) here.