The Digital Humanities have attracted much attention of late, including an increased concentration of DH-specific panels at the MLA convention, a focus in the MLA’s publication "Profession" (2011b), and a series of blog posts by Stanley Fish (2012a, 2012b). While this coverage has done much to expose scholars to the variety of work being done in the digital humanities, the general sense remains that the field has more in common with the social sciences and computer science than with the humanities, a sense that is apparent in popular articles like Fish’s blog posts (2012a, 2012b) and Kathryn Schulz’s New York Times article “What Is Distant Reading?” (2011), and can even be felt to some degree when Johanna Drucker separates the digital humanities from speculative computing (2009, 4-5). Data mining, database construction, and the development of visualization tools require deep technological engagement, and appear entirely alien when compared to standard humanistic methods of inquiry.
For Stephen Ramsay, this disconnect is a misperception, a matter of perspective rather than something inherent to computational techniques, and he aims to rectify it in his book Reading Machines: Toward an Algorithmic Criticism. Ramsay proposes that we “create tools—practical, instrumental, verifiable mechanism—that enable critical engagement, interpretation, conversation, and contemplation” (x). Algorithmic criticism is always performed with an eye to humanistic inquiry, with all technological approaches tempered to that end.
From the outset, it is important to note that Reading Machines is a highly condensed book, one that covers a fair bit of ground in its 85 pages. Given this brevity, Ramsay frequently has to summarize or even skip over the mathematical and technological foundations on which algorithmic criticism is built. This will not prove a problem for a reader with a background in mathematics, statistics, physics, or computing, but it might give pause to a reader without any exposure to such work. At the same time, the reader willing to do some minor research while working through Reading Machines will learn much about the basic techniques of a number of algorithmic approaches, as Ramsay uses simple-yet-effective examples of statistical analysis, Lisp-based programs, and text-analysis tools like HyperPo and TAPoR throughout the book. While these examples are clear enough to make sense to a reader new to the digital humanities, they will prove much more useful to the reader willing to experiment with them on her own time.
Despite the somewhat intimidating density of the book, Reading Machines has much to offer readers unfamiliar with the digital humanities. Ramsay methodically develops his arguments to reveal the intimate links between traditional literary criticism and algorithmic criticism. In the chapters “Potential Literature” and “Potential Readings,” Ramsay situates algorithmic transformations within the greater context of literary tradition and humanistic inquiry. The use of procedural, algorithmic, and otherwise automatic or mechanistic techniques is by no means foreign to literature, as Ramsay demonstrates in “Potential Literature.” The writers of the French authors’ group Oulipo, Ramsay notes, regularly invoke rigorous restrictions when creating literary works, a practice that leads to an acute awareness of the functioning of language. In Walter Abish’s novel Alphabetical Africa, for example, Abish restricts the words he uses by their initial letter (chapter 1 uses only words beginning with “a,” chapter 2 words beginning with “a” or “b,” and so on), so that it is almost impossible to read the later chapters “without an awareness of the first letters of the words—of the rich liberation of writing” (2011, 29). The automation and constraints employed by authors do not lead to a narrowing of expression, Ramsay declares, but rather a broadening, as authors find new and inventive ways to use language.
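Abish’s constraint is itself algorithmic, and can be expressed in a few lines of code. The sketch below is purely illustrative; the function names and the naive whitespace tokenization are my own assumptions, not anything drawn from Ramsay or Abish:

```python
def allowed_letters(chapter):
    """Letters permitted to begin words in a given chapter of
    Alphabetical Africa: chapter 1 -> {'a'}, chapter 2 -> {'a', 'b'}, etc."""
    return {chr(ord("a") + i) for i in range(chapter)}

def follows_constraint(text, chapter):
    """Check whether every word in `text` begins with a permitted letter.
    Naive whitespace splitting; tokens not starting with a letter are skipped."""
    permitted = allowed_letters(chapter)
    return all(
        word[0] in permitted
        for word in text.lower().split()
        if word and word[0].isalpha()
    )

# The novel's opening sentence satisfies the chapter 1 constraint:
print(follows_constraint("Ages ago, Alex, Allen and Alva arrived at Antibes", 1))
```

Running the constraint in reverse, as it were, is what Ramsay means by an acute awareness of language: once the rule is explicit, the reader cannot help but see it operating.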
By contrast, “Potential Readings” focuses on the role such experimentation plays in the interpretation of literature. Ramsay draws on examples of literature intended to be read in reconfigured ways, such as the I Ching (an example drawn from Espen J. Aarseth’s seminal Cybertext). He also discusses novel ways of reading traditional texts, such as the technique of reading a poem backwards to better understand how its parts function together. These techniques provide the context for Ramsay’s algorithmic criticism, helping to elucidate the linkage between pre-computational reading practices and those of the digital humanists.
Ramsay notes that many of the principles of the digital humanities are older than we might think, reminding us that the Jesuit priest Roberto Busa was the first digital humanist: Busa’s computer-generated concordance of the works of Thomas Aquinas, begun in the 1940s, is in fact the foundational work of the field (1). Ramsay also reminds us that Busa’s work itself draws from a long tradition, the first concordance to the Vulgate having been developed by Dominican friars in the thirteenth century (2). Each of these works provides a new, “deformed” way of reading an existing text, a type of reading strategy that Ramsay argues is at the heart of the digital humanities and algorithmic criticism (xi).
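A concordance is, at bottom, a simple deformation to produce by machine. The keyword-in-context (KWIC) sketch below is a minimal illustration of the general idea, not a reconstruction of Busa’s actual methodology; the function name and the three-word context window are my own choices:

```python
import re

def kwic(text, keyword, width=3):
    """Produce a keyword-in-context concordance: every occurrence of
    `keyword`, flanked by up to `width` words of context on each side."""
    words = re.findall(r"[a-z]+", text.lower())
    target = keyword.lower()
    lines = []
    for i, word in enumerate(words):
        if word == target:
            left = " ".join(words[max(0, i - width):i])
            right = " ".join(words[i + 1:i + 1 + width])
            lines.append(f"{left} [{word}] {right}".strip())
    return lines

sample = "In the beginning was the Word, and the Word was with God."
for line in kwic(sample, "word"):
    print(line)
# beginning was the [word] and the word
# word and the [word] was with god
```

The output is a "deformed" text in precisely Ramsay's sense: the original is rearranged around a single term, producing a reading no linear pass through the text would yield.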
The deformations of algorithmic criticism will no doubt leave a number of scholars worried that any criticism based on computational procedures will invariably invite apophenia, resulting in arguments that may seem sensible out of context but that have more in common with conspiracy theories than with rigorous scholarship. But just as a concordance doesn’t prove anything in and of itself, algorithmic criticism does not stop with the production of data. As Ramsay notes, “criticism is concerned not with determining the facts of the text, but with the implications of the text in its potentialized form” (67). Once we have produced the data, it is critical that we then analyze it, much as we would any other text. If we accept, then, that computational procedures don’t “determin[e] the facts of the text” and that we must return to the text in our analysis, we can avoid the errors we would make were we to rely on the deformed texts alone.
Ramsay demonstrates the flexibility of his brand of algorithmic criticism by walking the reader through a number of algorithmic deformations of texts. His initial analysis, an examination of the word frequencies of the speakers in Virginia Woolf’s The Waves, demonstrates well the way that relatively simple algorithmic approaches can significantly influence our understanding of a text (2011, 25). Ramsay asks whether we are to understand Woolf’s characters as different parts of a single consciousness, as the novel’s structure might indicate, or as distinct individuals. He investigates this question by generating lists of the words most distinctive of each of the six characters, lists that show not only that the characters do indeed have unique vocabularies, but also that these words speak to the characters’ personalities (13). Louis, for example, seems uniquely preoccupied with his Australian heritage, while Bernard, an aspiring novelist, uses the words “thinks,” “letter,” and “curiosity” far more frequently than the others. By examining the raw data of the word lists within the context of The Waves and modernist notions of subjectivity, we are able to learn much about the text using techniques that are beyond the reach of traditional scholarship, while at the same time grounding our criticism firmly in the text and other contextualizing sources so as to avoid being misled by quirks of the data.
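The intuition behind such distinctive-word lists can be sketched briefly. Ramsay’s analysis uses a tf-idf-style weighting; the version below substitutes a cruder relative-frequency ratio, and the toy `speeches` data is entirely my invention, standing in for the full monologues of The Waves:

```python
from collections import Counter

def distinctive_words(speeches, character, top=5):
    """Rank the words a character uses most disproportionately, comparing
    that character's word rates against all other speakers combined.
    A crude stand-in for the tf-idf-style scoring Ramsay describes."""
    own = Counter(speeches[character].lower().split())
    others = Counter()
    for name, text in speeches.items():
        if name != character:
            others.update(text.lower().split())
    own_total = sum(own.values())
    others_total = sum(others.values()) or 1
    def score(word):
        # ratio of this character's rate to everyone else's (add-one smoothed)
        return (own[word] / own_total) / ((others[word] + 1) / others_total)
    return sorted(own, key=score, reverse=True)[:top]

# Toy data only; a real analysis would use the complete text of each monologue.
speeches = {
    "Louis": "australia australia brisbane the bank the ledger",
    "Bernard": "the letter the curiosity thinks phrases the letter",
}
print(distinctive_words(speeches, "Louis", top=3))
```

Note that common words shared across speakers (here, “the”) score low, while words concentrated in one character’s speech rise to the top, which is exactly why such lists surface vocabularies rather than mere frequencies.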
The approach to the digital humanities Ramsay demonstrates here stands in stark contrast to the “distant reading” approach heralded by Franco Moretti and his colleagues. Ramsay bases his work entirely within the humanist tradition, whereas Moretti argues that there is much “to be learned from the natural and the social sciences” (Moretti 2005, 2), an inclination that is borne out in his work. Central to Ramsay’s argument is the assertion that algorithmic criticism differs from scientific criticism in that it does not seek a single solution—literature is not a “problem” to be solved—but rather encourages us to use computers to discover new ways of reading that open up new multiplicities. Algorithmic criticism is not there to provide authoritative understandings of texts; instead, “it must endeavor to assist the critic in the unfolding of interpretive possibilities” (10). This is not to denigrate the “distant reading” approach adopted by Moretti and his fellow scholars, but rather to affirm that computational methods are multivalent, and that the type of multi-textual (sometimes almost pantextual) data mining on which Moretti bases his work is neither the only nor even the natural approach.
Ultimately, Ramsay argues that we as humanist scholars can use computers to develop new perspectives on texts, ways to push and prod and nudge ourselves towards fresh understandings. Algorithmic criticism will not overthrow existing critical traditions, nor will it render them obsolete. It is but one more way to explore literature, one that opens up new horizons and complements existing critical approaches. This understanding demonstrates a kinship between Ramsay’s work and that of the now-defunct SpecLab, where DH work was undertaken by scholars such as Jerome McGann, Johanna Drucker, and Bethany Nowviskie, work chronicled in part in Drucker’s SpecLab (2009). It is clear that Ramsay is not so much arguing for the adoption of the techniques of the digital humanities as he is waiting for the day when these techniques are as natural to literary criticism as those of deconstruction or queer criticism now are: “algorithmic criticism looks forward not to the widespread acknowledgement of its utility but to the day when ‘algorithmic criticism’ seems as odd a term as ‘library-based criticism.’ For by then we will have understood computer-based criticism to be what it has always been: human-based criticism with computers” (81).
Aarseth, Espen J. Cybertext: Perspectives on Ergodic Literature. Baltimore: Johns Hopkins University Press, 1997.
Abish, Walter. Alphabetical Africa. New York: New Directions, 1974.
Drucker, Johanna. SpecLab: Digital Aesthetics and Projects in Speculative Computing. Chicago: University of Chicago Press, 2009.
Fish, Stanley. "The Digital Humanities and the Transcending of Mortality." New York Times, Jan. 9, 2012.
—. "Mind your P’s and B’s: The Digital Humanities and Interpretation." New York Times, Jan. 9, 2012.
Modern Language Association of America. PMLA 126, no. 5 (2011).
—. Profession 2011 (2011).
Moretti, Franco. Graphs, Maps, Trees: Abstract Models for a Literary History. New York: Verso, 2005.
Ramsay, Stephen. Reading Machines: Toward an Algorithmic Criticism. Urbana: University of Illinois Press, 2011.
Schulz, Kathryn. "What Is Distant Reading?" New York Times, Jun. 24, 2011.