In 2015, every measure of significant change, from research to academic programming and hiring, indicates that Digital Humanities (DH) has moved from nascent to significant on the higher education landscape of many countries. Countries like Canada are embracing DH, judging by the proliferation of undergraduate courses and programs, tenure-stream searches, and the continuing growth of centres and institutes. In some ways, the ascendance of DH describes a linear and expanding trajectory from Father Roberto Busa's pioneering work on text in Italy during the 1950s to today's widespread collaborations, including numerous transatlantic initiatives. Just as digital technologies have become increasingly important in other fields on campus, it could be said that digitally-enabled work in the humanities started modestly and then rose in prominence and importance as more and more scholars recognized the value of computers for studying human expression and behaviour. With much talk of generational change on campus, especially in North America, we might take for granted that time is fully on the side of those who push for academic program renewal, new institutional structures and DH-compatible policies and practices.

But is the future of DH bright? Among the concerned voices in recent years is, perhaps surprisingly, that of Edward Ayers, one of the pioneering leaders in the rise of DH since the later 1990s. By 2013, Ayers had become deeply concerned that DH was being "domesticated" to bolster old approaches that otherwise had passed their best-before date. The result, in his view, was that "the foundation of academic life—the scholarship on which everything else is built—remains surprisingly unaltered. The articles and books that scholars produce today bear little mark of the digital age in which they are created. Researchers routinely use electronic tools in their professional lives but not to transform the substance or form of their scholarship." Moreover, "the very ubiquity of the web" was leading to "a lowered sense of excitement, possibility, and urgency" (Ayers 2013, 28).

As a contribution to current assessments of the present and future of DH, the following discussion highlights aspects of DH's history from a Canadian-based perspective in order to contextualize pressing issues today and to offer suggestions for steps forward. The goal of this paper is three-fold: to stimulate further historical research on the emergence and development of DH; to provoke further critical analysis of current activities and initiatives; and to help cultivate creative thinking about how we can work together to ensure the ability of digitally-enabled scholarship to enhance knowledge and understanding of human expression and action. With examples primarily from what began as "History and Computing" as well as other text-based Humanities fields, this discussion does not seek to capture the diversity of developments across the Humanities. Rather, it attempts to build on pioneering efforts by Susan Hockey, John Bonnett, Kevin Kee, Ian Milligan and others to examine the multi-faceted activities that, in hindsight, we are beginning to connect within a nascent narrative of DH's history (Hockey 2004; Bonnett and Kee 2009; Milligan 2013). While this paper offers only select contributions in the pursuit of this goal, the hope is that greater attention to the surprising features of the past will better emphasize their enduring and changing importance for DH's uncertain future.

To begin with, well-known historical evidence reminds us that major societal figures in the mid-twentieth century did not think that computers would become part of everyday life anywhere, let alone on campuses. The now-famous estimate in 1942 by the President of IBM, Thomas Watson, that "there is a world market for maybe five computers" was similar to the observation in 1977 by Ken Olsen, founder of Digital Equipment Corporation, that "there is no reason anyone would want a computer in their home" (Strohmeyer 2008). Such comments certainly add to the robust history of failed predictions, but their importance during this formative period should be taken seriously, since those like Watson and Olsen were obviously not dinosaurs or Luddites. The fact that key leaders of the emerging computer industry saw a limited and specialized role for their products during these decades reminds us that academics who did not support the introduction of the new technologies could articulate their opposition in the context of larger skepticism about their widespread value.

Indeed, the historical record emphasizes how unlikely it seemed at the time that what we now call DH would gain a toehold in the disciplines that study human culture. One major obstacle was the words-numbers dichotomy that characterized campuses by the mid-twentieth century. When computers became a topic of public discussion, humanists were already identified with words and scientists with numbers. While much of the actual technology was brand new, these larger, distinctive associations were already well-established. Alfred W. Crosby has traced the origins of quantification in western society to the thirteenth century while other historians have documented how mass schooling accelerated the "spread of numeracy" during the nineteenth and twentieth centuries (Cohen 1999; Crosby 1997). Certainly, the recent fiftieth anniversary attention to C.P. Snow's 1959 description of "two cultures" emphasized how the popular currency of this expression does not do justice to the multiple dimensions of his argument (Ortolano 2008). Nonetheless, no one has contested Snow's mid-century view that professors and students in the humanities were characteristically not comfortable with mathematics then, and implicitly still are not.

The association of numeracy with computers was similarly well-established by the mid-twentieth century. The first widespread use of Hollerith cards, in the 1890 United States census, facilitated the counting of responses to enumeration questions, and this technology was soon introduced in countries like Canada. Subsequent applications involved calculations to support military efforts such as decisions about missile trajectories and both creating and decoding encrypted messages. In addition, some public fascination with the computer's reliance on a binary number system fuelled the perception that the great value of this rapidly developing technology was its increasing speed in manipulating unusually large numbers. In turn, the value of those who could contribute to such work similarly rose in stature. Not surprisingly, elementary and high school educators began emphasizing more explicitly the key importance of studying arithmetic while also bringing televisions into classrooms where children watched as computer-enabled space ships launched in the post-Sputnik era (Isaacson 2014).

Meanwhile on campus, leaders in disciplines such as History recognized some advantages but many emphasized dangers in embracing computers and their numerical power. In 1962, Carl Bridenbaugh used the occasion of his Presidential Address to the American Historical Association to offer a wide-ranging assessment of the state and future of the discipline. Bridenbaugh perceived that

like nearly any activity in the Western world, historical scholarship has undergone a technological revolution, and we now possess, and probably will add to and improve, remarkable techniques for handling our raw materials, advantages of which previous historians never dreamed. Among other ways, bigness has struck us by proliferating sources and editing, thereby deluging us with an overwhelming mass of data for the study of the last one and a half centuries of history. The new age has built up a stock pile of sources and forced us to resort ever more frequently to statistics. (Bridenbaugh 1963, 321-322)

While admitting some advantages, Bridenbaugh focused on the negative consequences of this emerging "quantitative history." He emphasized that the "realization that historical facts are unique in character, space, and time restrains the historian from trying to fit them into a rigid theory or fixed pattern—and here he can render emergency yeoman service to his unhistorical colleagues in other disciplines." In his view, a historian must not ever "worship at the shrine of that Bitch-goddess, QUANTIFICATION, (sic) History offers radically different values and methods" (Bridenbaugh 1963).

While it may be tempting to dismiss Carl Bridenbaugh's address as an ill-tempered harangue by an aging academic, the thrust of his remarks became characteristic of continuing controversy through the 1970s that, in turn, saw the decline of both quantitatively-focused academic courses and research projects. Along the way, expressions like "cliometrics" came and went, just as curricula introduced and then often dropped courses designed to enhance computational skills for humanities students. Notably, the claim that the study of words is incompatible with the use of numbers continues today. Peter Baskerville has recently reminded us of Michael Rothberg's frank admission in 2010 that "when I first read the phrase 'quantitative cultural history,' I was tempted to reach for my revolver…." (Baskerville 2015; Rothberg 2010, 341). Similarly, the remark by Anthony Grafton that Patricia Cohen has made so well-known is more welcoming to DH but still assumes that counting can be separated from interpretation: "The digital humanities do fantastic things. I'm a believer in quantification. But I don't believe quantification can do everything. So much of humanistic scholarship is about interpretation" (Cohen 2010).

While far more research is required on the subsequent relationships between and among numeracy, the humanities and digital technologies, preliminary evidence certainly suggests that this aspect of the continuing growth of fields like History and Computing was surprising in the context of both campuses and the larger society. If little value were attributed to counting in humanities research or education, why did anyone perceive computers as more than clerical aids? This question points to the need for a better understanding of how new ways of thinking, rather than only new technologies, explain why a small number of researchers began activities like counting millions of words or tabulating who moved and who stayed in various settings as ways to gain insight into cultural and social expression. To what extent did these researchers have to break the mould of the dominant established scholars? How did their own backgrounds prepare them to pursue new ways of thinking in their research? In turn, how did their digitally-enabled pursuits affect their own careers as well as the larger orientation of their fields? In other words, how did their surprising activities contribute to remaking scholarly cultures that would increasingly support the ascendance of what we now call Digital Humanities?

To address such questions, a point of departure is the increasing rejection of the intellectual claim that words and numbers are distinct in ways that justify the separation of scholarly and scientific cultures. This claim emerged from the later twentieth-century rediscovery of thinkers such as Charles Sanders Peirce along with the fresh and accessible writing of Umberto Eco and others that brought a renewed focus on communications, especially in terms of meaning-making. Without any reference to digital humanities or even computers, such work emphasized why an appreciation of words and numbers should be shared by all disciplines. In describing an intellectual bridge across campus, Paul Perron and his colleagues explain that "a digit in numerical representation…has the exact same structural features in representational terms that, say, a noun in language has – i.e. both are signs with specific forms, functions, and meanings." For this reason, they conclude that "the difference between a digit and a noun is thus not to be located in structural patterns, but in the different functions of the representational systems to which they pertain…. Indeed, semiotics helps unravel the structural reasons why poetry and mathematics make their meanings, as different as they might appear to be, in comparable ways" (Perron et al. 2000, 12-13).

Such changing thinking was, of course, enabled, accelerated and then influenced in iterative ways by increasingly powerful digital technologies that went far beyond their initial focus on counting. These developing digital technologies underpinned an expanded definition of computers as "collaborators" for humanities scholarship. In his recent survey of the technological history of the digital revolution, Walter Isaacson notes with concern that "many people who celebrate the arts and the humanities, who applaud vigorously the tributes to their importance in our schools, will proclaim without shame (and sometimes even joke) that they don't understand math or physics" (Isaacson 2014, 487). In contrast, his survey emphasizes the extent to which "those who helped lead the technological revolution were people…who could combine science and the humanities." In Isaacson's account, this characteristic describes "innovators" from the nineteenth century's Ada Lovelace, whose parents "instilled in her a love for what she called 'poetical science,'" all the way to recent leaders such as Steve Jobs, who launched the iPad 2 by declaring that "technology alone is not enough." He explained to the audience: "it's technology married with the liberal arts, married with the humanities, that yields us the result that makes our heart sing" (Isaacson 2014, 487).

In this spirit, David M. Berry's description in 2011 of "the computational turn" in DH is helpful in understanding why the digital now transcends the qualitative-quantitative dichotomy; indeed, this expression offers a promising way to move beyond the legacy of the "quantitative" debate. While it is not necessary to imagine as he does that all disciplines "become focused on the computationality of the entities in their work," Berry's concluding suggestion is compelling: "that we take a philosophical approach to the subject of computer code, paying attention to the wider aspects of code and software, and connecting them to the materiality of this growing digital world" (Berry 2011, 17).

At the same time, the history of DH encourages full engagement with the substantive critiques that highlight how the larger societal and academic context threatens as well as supports continued ascendance. In addition to Ayers, for example, Adeline Koh has recently reminded us that the "insistent focus on computing and methodology in the humanities without incisive, introspective examination of their social implications is devaluing the humanities" (Koh 2015, emphasis in original). Moreover, declining enrolment and research funding are consistent with a larger devaluing of the humanities, often driven by unwarranted claims about "return on investment" from the perspective of a corporatized campus. The extent to which and how DH can successfully and appropriately contribute to the re-enchantment of the humanities in this challenging context deserves robust discussion as well as specific initiatives to demonstrate promising approaches.

One significant insight for such critical discussion is the notable extent to which the early use of computers to study human thought and behaviour appears to have involved research teams far more than individual scholars. Recent studies have challenged the established image of "solitary" humanists while also revealing the importance of distinct scholarly traditions such as those involving author attribution. In her ethnographic-style study of the peer review process, Michèle Lamont highlights the considerable diversity in "how professors think" even within the humanities and social sciences, where some disciplines embrace recognition of collective effort while others privilege evidence of individual achievement (Lamont 2010). This study complements the research of Jerry A. Jacobs, who looks behind appearances to explore how researchers actually undertake scholarly activity. He finds that familiar stereotypes are often unjustified, especially in terms of disciplinary silos and solitary scholars. Rather, Jacobs identifies robust traditions of "cross-disciplinary collaborations" (Jacobs 2014, ix). Harvey Graff advanced this historical re-thinking in his systematic study of how different policies, practices and cultures (including those of external agencies) supported both boundaries and bridges on campus during the twentieth century. His sophisticated comparative and chronological research exposes the complexity of intellectual and institutional life as well as considerable change and continuity (Graff 2015).

Such studies implicitly contextualize the fact that DH's current ascendance is being led by those outside as often as within the Humanities. This characteristic is evident in recent job postings. During 2015, for example, Delft University of Technology's Faculty of Technology, Policy and Management announced ten doctoral positions for their "responsible digital future" programme that "combines insights from the engineering sciences with insights from the humanities and the social sciences" (Delft University of Technology 2015). At the same time, the newly-created Digital Humanities Institute at the École Polytechnique Fédérale de Lausanne (EPFL), with a computer scientist as Director, is leading a strategic campus-wide initiative to attract three professors for an expected cluster of ten "who will develop and drive an interdisciplinary research program at the intersection of the humanities and computer science/engineering." Both TU Delft and EPFL have been consistently ranked among the world's top universities in engineering, technology and computer science but without any significant presence heretofore in the humanities and social sciences. Such examples join initiatives such as the DH cluster hiring by the College of Social Sciences and Humanities (CSSH) at Northeastern University and the four positions in visual narrative within the Chancellor's Faculty Excellence Initiative at North Carolina State University. In other words, DH continues to ascend today thanks to diverse institutional decisions that reflect past success in moving beyond twentieth-century dichotomies of arts and sciences to pursue digitally-enabled studies of human culture and society.

At the same time, one concern for humanists is the possibility that DH leadership will be taken over by those from academic traditions elsewhere. While the multiple DH cluster hires for 2015 were being launched, an op-ed in the New York Times claimed that the humanities risk being sidelined by "biologists, economists and physicists" who had begun analyzing text within a new "mathematical theory of culture." In One republic of learning: Digitizing the Humanities, Armand Marie Leroi, a professor of evolutionary developmental biology at Imperial College, London, described how a "code-capable graduate student" could now trump a "traditional, analog scholar" in examining a "claim about the origin, fate or significance of some word, image, trope or theme." He wondered whether or not all humanists, "weakened by their own interminable, internecine Theory Wars", would "gratefully accept the peace imposed by science" (Leroi 2015).

The possibility of humanists happily aping scientists seems unlikely for many reasons, perhaps most importantly because doing so would fly in the face of the positive change exemplified by DH in contributing to the larger questioning of the dominant empirical assumptions of the natural sciences. But recent decades suggest that it is a mistake for those of us self-identified within the Digital Humanities to largely ignore or simply criticize the growing examples of text analysis by researchers in other fields. Rather, we should welcome and embrace campus-wide interest in text analysis as an indication of its perceived fundamental value for enhancing cultural knowledge and understanding. Rather than dismissing or ridiculing such interest even when misguided, we should engage with others both to enrich epistemological thinking across all fields and to build DH in keeping with its origins and development thus far. The fact that, for example, Thomas Piketty chose to analyse nineteenth-century text in his global best-seller on economic inequality offers a timely opportunity for DH to enhance public debate (Piketty 2014). While certainly naïve in his research on gothic novels, Piketty was testifying (however unknowingly) to DH's success in highlighting the value of using such text to address diverse, urgent societal questions about the past (Underwood, Long and So 2014). It is also highly relevant that Delia Dumitrica and Sally Wyatt have recently emphasized the importance for those in DH to critically explore the extent to which "the conditions of inequality" are "embedded within technologies and their social use" (Dumitrica and Wyatt 2015).

In Geek sublime: The beauty of code, the code of beauty, novelist and computer programmer Vikram Chandra recalls Paul Graham's well-known claim that "what hackers and painters have in common is that they are both makers. Along with composers, architects and writers, what hackers and painters are trying to do is make good things." Similarly, Chandra emphasizes how both writers and programmers "struggle with language," and he explores the commonalities the two activities share in both process and results. Such engagement is far more promising than maintaining out-dated dichotomies, especially as a way to think about the changing character of scholarship in what now is called DH. To say that "the first wave of digital humanities was quantitative, mobilizing the search and retrieval powers of the database" while the "second wave is qualitative, interpretive, experiential, emotive, generative in character" is accurate in some ways, but this description does not do justice to the complexity of past or present digitally-enabled scholarship (Chandra 2014, emphasis in original).

More generally, the success of literary scholars in drawing upon diverse ways of knowing to re-interpret even the most studied texts emphasizes the value of embracing both disciplinarity and interdisciplinarity as part of the larger scholarly rejection of either-or approaches and embrace of both-and ones. In this spirit, Susan Brown and Michael E. Sinatra describe DH as a "commons" that brings together all researchers interested in both using and thinking about digital technologies (Brown and Sinatra 2014). The importance of defining this commons in open ways is illustrated by the participation of multiple research granting agencies in the international funding opportunity Digging into Data. In the Canadian contribution, this support broke new ground in bringing together, as separate sponsors, the Social Sciences and Humanities Research Council, the Natural Sciences and Engineering Research Council, and the Canada Foundation for Innovation, which funds research infrastructure. DH is also inspiring engagement with other research fields to develop new epistemologies along with new vocabularies and metaphors. One recent example was the symposium "Shared horizons: Data, biomedicine, and the Digital Humanities" organized in 2013 by the National Library of Medicine and the National Endowment for the Humanities in the United States.

In keeping with the importance of both-and rather than either-or mindsets for DH, it would be misleading to see DH as primarily led by large collaborative projects. While this association was strong during the mainframe era, today's technologies support a wide spectrum of research activity from mobile to HPC. In fact, even explicitly interdisciplinary DH now includes significant individual as well as team-produced scholarship. One of the most impressive examples of a single-authored major contribution is Ian Lancashire's Forgetful muses: Reading the author in the text, which draws upon concepts and methods from psychology to neurobiology to closely read authors such as Chaucer and Agatha Christie. The resulting insights into the process of writing and the roles of language and (un)consciousness illustrate the importance of avoiding a new normative model for DH (Lancashire 2010).

Recent research also implies the need for greater attention to the fact that the early DH initiatives often involved collaborators from beyond campus. It is instructive that IBM's Watson responded favourably to Father Busa's request to explore how computers could help deal with the eleven million words written by St Thomas Aquinas and other authors (Winter 1999). The extent to which Watson was surprised by this request (given his earlier assessment of the limited use for computing power) is not clear, but both his eventual support and Busa's decision to contact him are noteworthy given today's common assumptions about historic campus-community separations, especially regarding private-sector partnerships. In ways that reflect IBM's initial collaboration with Father Busa in the 1950s, Dean Irvine suggests "taking a longer historical view of the digital humanities and its formative partnerships with commercial enterprise." In particular, he sees both the value of, and need for, "the ongoing development of innovative business models as a means of achieving the sustainability for digital infrastructure" (Irvine 2015, 2). For her part, Bethany Nowviskie has challenged the notion of a "dark side" of literary studies' "digital turn" resulting from the willingness of some DH scholars to engage with those focused beyond the campus. Nowviskie has asked "how isolated from internal conversations driving the development of the American security state should we humanists remain? How irrelevant from them must we keep our viewpoints and hard-won humanities knowledge, in order to stand clean before our peers?" (Nowviskie 2015).

Such complex questions deserve our attention since the evidence shows that, today, DH includes significant individual and collaborative, disciplinary and interdisciplinary, campus-based and community-connected (including private sector) initiatives. The value of such work depends on how it is undertaken, as current critiques of DH such as those of Ayers and Koh usefully emphasize. Successfully bridging twentieth-century boundaries either on campus or beyond depends on the specific ways in which such connections are imagined and implemented. For this reason, DH's epistemological diversity extends to the heart of current discussions about undergraduate and graduate programs. Are curricula changing appropriately to keep pace with the ascendance of DH? The good news is that the multiple recent conferences on undergraduate education generally encourage approaches that support (explicitly and implicitly) key features of DH such as collaborative projects, co-curricular activities, and "active learning" pedagogy. Similarly, recent initiatives to re-imagine doctoral programs in light of the increasing importance of discipline-based interdisciplinarity, engaged scholarship and the need for better preparation for multiple possible futures on and off campus reflect DH characteristics (White Paper 2013).

While the continuing undergraduate and graduate calls for program and curricula reform inspire optimism, they also testify to the slow pace of action. One result is the exponential growth of the University of Victoria's Digital Humanities Summer Institute (DHSI), which expanded from several dozen attendees in 2001 to almost one thousand in 2016. The genius of DHSI is multi-faceted, including its organization and content, both of which have shaped and cultivated the developing scholarly culture and substantive range of DH. Its growth, however, also reflects the extent to which the kind of enrichment offered at DHSI is not available elsewhere, domestically or internationally. Fortunately, Ray Siemens and the DHSI team have generously supported the expansion of their initiative, as illustrated by Dalhousie University's two-week DHSI@DAL organized by Krista Kesselring and Julia Wright in May 2015. For the foreseeable future, such initiatives will be key for professors seeking to enhance their DH skills as well as for discussions about new approaches to learning and research. The growing popularity of summer institutes and special training opportunities suggests that the building of a DH option within the curricula available to students around the world must continue to be a top priority.

Past decades attest to the importance for DH of focusing on renewing educational programs in order to better realize the potential of digital technologies. In his contemporary analysis of nascent work in Computing and History as it had developed by the end of the 1960s, Robert P. Swierenga remarked presciently on the need for scholars to create appropriate digitally-enabled methods. "Borrowing from other disciplines is not the solution" since historians deal with different kinds of evidence that call for different statistics and computer programs, he emphasized. Swierenga cited Robert Zemsky's 1969 call for historians to "invent a methodology – including computer programs – of our own, a methodology designed to cope with the peculiar kinds of evidence with which we deal". Swierenga defined this need in 1970 as "the vital task of the next generation" (Swierenga 1970, 21).

The history of the following forty-five years includes impressive examples of work on this "vital task" including new approaches to textual analysis, visualization and simulation based on archival evidence. Similarly, it is encouraging that, following the lead of the Modern Languages Association, the American Historical Association has developed guidelines for the evaluation of digital scholarship (American Historical Association 2015). At the same time, it is also clear that much more needs to be done. The most pressing need is to help students acquire the multiple competencies required for "inventing" digitally-enabled approaches that do justice to the "peculiar" character of research on human expression. The continuing need for such initiatives contrasts with the optimism of the DH pioneers. Swierenga anticipated that "it may soon be impossible to identify computerized studies 'on their face'" since they would become the norm. He observed that "several obvious computer-aided projects have been published recently that contain no mention of the fact, just as they do not refer to other mechanical aids such as photocopiers, typewriters, desk calculators and the like." His conclusion was that "this is the way it should be" (Swierenga 1970, 20). Today, emphasizing digital approaches seems more important than ever, especially in undergraduate and graduate programs. It seems unlikely that the adjective will be dropped anytime soon from DH.

One reason that DH is gaining rather than losing currency within and beyond the Humanities is the widespread agreement today that computers have the potential to be far more transformative than mechanical aids like photocopiers. But to what extent is DH actually increasing innovation as opposed to enabling domestication by print culture mindsets? Will the future of DH maintain paradigm-shifting or normalizing trajectories? The complexity of the myriad DH issues today reminds us of Bruno Latour's description of how, centuries earlier, pen and paper affected human thinking in obvious and subtle ways that went far beyond expectations at the time of introduction (Latour 1986).

In his recent article, Michael Roy, the Dean of Library and Information Services and CIO at Middlebury College, asks if the current state of DH represents a "temporary moment in our disciplinary history, one that will serve as scaffolding for what the humanities will become in the 21st century?" (Roy 2014) His answer suggests that it all depends on our success in "thinking through a set of interconnected and complex issues, many of which are playing out well beyond the confines of individual institutions" (Roy 2014, 20). Viewed from a Canadian perspective, the ascendance of DH has certainly been impressive in thinking through such issues related to established policies and practices both formally within institutions and informally within scholarly cultures. At the same time, the evidence examined thus far also emphasizes how the metaphysics and epistemology of DH ran counter in many cases to major features of North American campuses as they were developing during the twentieth century. The continuing strength of these major features is similarly evident if we judge by the ongoing debate at the numerous conferences focused on higher education in Canada and internationally including those concerned about the digital reinforcement of inequality. It is in this context that the current DH moment represents both past developments and future uncertainty. The good news is that by coming to grips with the complexity of DH's surprising ascendance since the mid-twentieth century, we can enhance the short and long term prospects for digital technologies to benefit both scholarship and the larger society.

Acknowledgements / Remerciements

The author is grateful for the insightful and helpful comments of Susan Brown, Padmini Ray Murray and Jon Saklofske on an earlier version.

Works cited / Liste de références

American Historical Association. 2015. "Guidelines for the evaluation of digital scholarship in history." Accessed August 2, 2016.

Ayers, Edward L. 2013. "Does digital scholarship have a future?" EDUCAUSE Review 48.4. Accessed August 2, 2016.

Baskerville, Peter. 2015. "Big History, Big Data: So What?" McMaster University, Department of History's Annual Graduate Colloquium. Hamilton, March 26.

Berry, David. 2011. "The computational turn: Thinking about the Digital Humanities." Culture Machine 12: 1-22.

───, ed. 2012. Understanding Digital Humanities. Basingstoke: Palgrave Macmillan.

Bonnett, John and Kevin Kee. 2009. "Transitions: A prologue and preview of digital humanities research in Canada." Digital Studies 1.2.

Bridenbaugh, Carl. 1963. "The great mutation." American Historical Review 68.2: 315-331.

Brown, Susan and Michael E. Sinatra. 2014. Letter to Dugan O'Neil, Chief Science Officer, Compute Canada, September 8.

Chandra, Vikram. 2014. Geek Sublime: The beauty of code, the code of beauty. Minneapolis, Minn: Greywolf Press.

Cohen, Patricia Cline. 1999. A calculating people: The spread of numeracy in Early America. New York, NY: Routledge.

Cohen, Patricia. 2010. "Digital keys for unlocking the Humanities' riches." New York Times. November 16.

Crosby, Alfred W. 1997. The measure of reality: Quantification and Western Society, 1250-1600. Cambridge: Cambridge University Press.

Delft University of Technology. 2015. "PhD program in engineering social technologies for a responsible digital future." January 2015. Accessed August 2, 2016.

Dumitrica, Delia and Sally Wyatt. 2015. "Digital technologies and social transformations: What role for critical theory?" Canadian Journal of Communication 40:589–596.

Graff, Harvey. 2015. Undisciplining knowledge: Interdisciplinarity in the Twentieth Century. Baltimore, Maryland: Johns Hopkins University Press.

Hockey, Susan. 2004. "The history of Humanities Computing." In A companion to Digital Humanities, edited by Susan Schreibman, Ray Siemens, and John Unsworth. Oxford: Blackwell.

Irvine, Dean. 2015. "From angel to agile: The business of the Digital Humanities." Scholarly and Research Communication 6.2.

Isaacson, Walter. 2014. The Innovators: How a group of inventors, hackers, geniuses and geeks created the digital revolution. New York: Simon and Schuster.

Jacobs, Jerry. 2014. In Defense of Disciplines: Interdisciplinarity and Specialization in the Research University. Chicago: University of Chicago Press.

Koh, Adeline. 2015. "A letter to the Humanities: DH will not save you." Hybrid Pedagogy. April 19. Accessed August 2, 2016.

Lamont, Michèle. 2010. How professors think: Inside the curious world of academic judgment. Cambridge, MA: Harvard University Press.

Lancashire, Ian. 2010. Forgetful muses: Reading the author in the text. Toronto: University of Toronto Press.

Latour, Bruno. 1986. "Visualisation and cognition: Drawing things together." In Knowledge and Society: Studies in the Sociology of Culture Past and Present, edited by Henrika Kuklick and Elizabeth Long, 1-40. London: Jai Press.

Leroi, Armand Marie. 2015. "One republic of learning: Digitizing the Humanities." New York Times. February 13.

Milligan, Ian. 2013. "Illusionary order: Online databases, optical character recognition, and Canadian history, 1997-2010." Canadian Historical Review 94.4: 540-569.

Nowviskie, Bethany. 2015. "open and shut." Bethany Nowviskie. February 14. Accessed August 2, 2016.

Ortolano, Guy. 2008. "The literature and the science of ‘two cultures' historiography." Studies in History and Philosophy of Science 39:143–150.

Perron, Paul, Leonard G. Sbrocchi, Paul Colilli, and Marcel Danesi, eds. 2000. Semiotics as a bridge between the Humanities and the Sciences. Toronto: LEGAS.

Piketty, Thomas. 2014. Capital in the Twenty-First Century. Cambridge, MA: Harvard University Press.

Rothberg, Michael. 2010. "Quantifying culture? A response to Eric Slauter." American Literary History 22.2: 341-346.

Roy, Michael. 2014. "Either-Or? Both-And? Difficult distinctions within the Digital Humanities." EDUCAUSE Review 49.3:16-20.

Strohmeyer, Robert. 2009. "The seven worst tech predictions of all time." PC World. January 5. Accessed August 2, 2016.

Swierenga, Robert P. 1970. "Clio and computers: A survey of computerized research in history." Computers and the Humanities 5.1: 1-21.

Underwood, Ted, Hoyt Long, and Richard Jean So. 2014. "Cents and sensibility." Slate. December 10.

"White Paper on the Future of the PhD in the Humanities." 2013. Institute for the Public Life of Arts and Ideas, McGill University. December.

Winter, Thomas Nelson. 1999. "Roberto Busa, S.J., and the invention of the machine-generated concordance." The Classical Bulletin 75.1: 3-20.

Zemsky, Robert. 1969. "Numbers and history: The dilemma of measurement." Computers and the Humanities 4.1: 31-40.
