In November 2006 the Canada Foundation for Innovation (CFI) announced its financial support for an initiative to create a national High Performance Computing (HPC) platform and a new organization, Compute Canada, to govern it. The funding application was submitted by the seven regional networks that supported, and continue to support, HPC research in Canada: ACENET, CLUMEQ, SCINET, HPCVL, RQCHP, SHARCNET, and WESTGRID. Combined with matching funds from provincial and other granting agencies, the grant infused one hundred and fifty million dollars into the construction, operation, and management of Canada’s next-generation HPC infrastructure, with four regional networks being the beneficiaries of this round of funding: CLUMEQ, RQCHP, SCINET, and WESTGRID. CFI assumes that further funds will become available in the future to upgrade the infrastructure of the remaining networks and to enhance the capacity of the national network as a whole. CFI contributed sixty million of the proposed one hundred and fifty million through its National Platform Fund (NPF). The NPF was established for two reasons: to support the development of HPC in Canada and to support the development of knowledge management resources for the humanities and social sciences. With respect to HPC, the NPF is designed to serve three aims: to ensure access to internationally competitive HPC resources that will benefit Canadian institutions and researchers, to raise Canada’s international profile in HPC, and to provide a stable funding base for the development of present and future HPC infrastructure. Prior to 2005, CFI received multiple applications from multiple consortia as each organization’s need to upgrade its infrastructure emerged. Members of Canada’s research community recognized the need for a more stable and coordinated source of funding; in 2005 c3.ca expressed that need in its Long Range Plan (LRP) for HPC in Canada (c3.ca 2005).
CFI’s creation of the NPF fund was facilitated, in part, by c3.ca’s LRP.
In its call for proposals for the NPF, CFI made clear its position that all future HPC research must be undertaken through the proposed national platform; no independent HPC initiatives would be funded. The proposed infrastructure must meet the needs of all disciplines within Canada’s research community, from particle physics to history and computing. Accordingly, Compute Canada approached select members of Canada’s human science community to determine their needs (Rockwell, Lancashire and Siemens 2006). Now that Compute Canada’s application has received a favourable decision from CFI, Canada’s social sciences and humanities (SSH) scholars should consider the implications the proposed HPC platform presents for their disciplines. Will Compute Canada’s proposal – with respect to infrastructure, operation, and governance – enable SSH researchers to conduct leading-edge research devoted to computationally challenging problems?
I do not believe that Compute Canada's proposal, as currently conceived, meets that mandate for humanities and social science researchers. I further suggest that Compute Canada could not have met that mandate even if it had enjoyed an extended period to assess SSH scholars’ needs, which it did not. Canadian social science and humanities scholars will soon enjoy access to unprecedented levels of computational power. The unfortunate fact, however, is that most scholars in our respective communities have no idea how to exploit that power to support their research. And even if they did, SSH researchers, unlike their colleagues in the natural and medical sciences, do not possess the computational expertise necessary to use the proposed network or to specify their requirements for it.
If Canada’s human science community is to make meaningful use of the country’s emerging HPC infrastructure, then I suggest there are a number of prerequisite steps it must take:
SSHRC, I argue, can and must take a leading role in encouraging scholars’ migration to High Performance Computing.
My arguments regarding the Compute Canada proposal, including the steps Canada’s SSH community must take to engage in HPC, rest on a specific conception of what constitutes a “leading edge” problem; it is not a conception that will be shared by all social science and humanities scholars. Currently, Canada’s HPC networks are configured to support the processing and mining of large quantities of data, and the creation of highly complex simulations in domains ranging from astrophysics to proteomics. Projects are run on a queued batch basis: researchers submit a job, the HPC network processes it, and the results are returned. There are human science scholars who can and do operate comfortably within such an HPC regime. Economists, for example, are currently deploying micro databases and econometric models in high performance computing networks to assess the probable impact of social policies. In the domain of digital humanities, HPC could potentially be used to support text analysis and text mining.
Doubtless, the projects described above can and should be characterized as leading edge. However, to delimit human science high performance computing to these initiatives, and no others, would in my opinion be a serious failure of imagination. There are other possibilities that Canadian researchers can pursue, and should pursue, given the potential social and economic benefits they would confer. Realization of such a research agenda will require Compute Canada to change the form of access it provides researchers, from queued batch processing to interactive use. The organization will also be required to dedicate parts of Canada’s HPC infrastructure to meet the specific needs of Canada’s human science research community. Instead of simply running a program, I suggest, social science and humanities researchers will eventually undertake research in which human subjects interact with simulations or persistent virtual environments in ways that are useful for research. Two examples will clarify my point: serious games and massive multi-user persistent worlds.
One possibility – of potential interest to the social sciences – would be to devote HPC infrastructure to projects that combine large-scale agent-based simulations with serious games. Among other things, agent-based simulations are currently being deployed to assess the potential epidemiology of pandemics. For instance, Chris Barrett and his team at Los Alamos National Laboratory created a virtual Portland comprised of one million agents to determine the trajectory of a smallpox pandemic. The purpose of such HPC simulations is to determine the potential hotspots for transmission in order to enable policy-makers to contain and mitigate the transmission of smallpox and other contagions (Barrett, Eubank, and Smith 2005).
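The kind of large-scale agent-based simulation described above can be illustrated in miniature. The following Python sketch is illustrative only: it is not the EpiSims model used by Barrett and his team, and the agent counts, random daily mixing at shared "locations," and transmission probabilities are all invented for the example. It tracks susceptible, infected, and recovered agents as a contagion spreads through the places where agents congregate:

```python
import random

def simulate(n_agents=1000, n_locations=50, days=60,
             p_transmit=0.05, p_recover=0.1, seed=42):
    """Toy agent-based epidemic: agents mix daily at shared locations."""
    rng = random.Random(seed)
    # Each agent is Susceptible ('S'), Infected ('I'), or Recovered ('R');
    # seed the outbreak with a handful of infected agents.
    state = ['S'] * n_agents
    for i in rng.sample(range(n_agents), 5):
        state[i] = 'I'
    history = []
    for _ in range(days):
        # Assign each agent to a random location for the day (a crude
        # stand-in for the realistic activity schedules EpiSims uses).
        loc = [rng.randrange(n_locations) for _ in range(n_agents)]
        # Locations containing at least one infected agent are "hotspots".
        infected_locs = {loc[i] for i in range(n_agents) if state[i] == 'I'}
        for i in range(n_agents):
            if state[i] == 'S' and loc[i] in infected_locs:
                if rng.random() < p_transmit:
                    state[i] = 'I'
            elif state[i] == 'I' and rng.random() < p_recover:
                state[i] = 'R'
        history.append((state.count('S'), state.count('I'), state.count('R')))
    return history
```

At this toy scale the simulation runs in a fraction of a second on a personal computer; scaling it to the million-agent virtual Portland described above, with realistic activity schedules in place of random mixing, is precisely what demands HPC resources.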
This is important work. I argue, however, that the value of such simulations could be enhanced if they were combined with serious games. Serious games are games that appropriate the technologies and expressive forms of computer games for purposes ranging from military training to education and public safety. The research purpose of combining serious games and HPC simulations would be to assess the strengths and weaknesses of the operational constructs Canada’s three levels of government deploy to contain and mitigate public safety and environmental threats. In essence, when jurisdictions face an environmental or public safety threat, they deploy agencies, equipment, and individuals to meet that threat. Serious games, combined with training and exercising, would enable policy-makers to determine how well current or proposed operational constructs are functioning. Using simulations, policy-makers could answer questions such as: who is cooperating? Who is not? What policies are working? Which are not? The public benefit that would result from the combination of IT, HPC, and training and exercising cannot be overstated. Emergency responders in the aftermath of 9/11 indicated that preparation for the Y2K incident was invaluable in assisting their response to, and recovery from, the World Trade Center crisis.
A second potential domain of research – of potential interest to the humanities – would centre on the effective deployment of massive multi-user persistent worlds (MMPWs). MMPWs are on-line virtual environments that enable hundreds, if not thousands, of users to form groups and interact with each other, other groups, and the virtual environment in real time. Their initial domain of application has been games for entertainment, but non-game applications ranging from communications and training to distributed collaboration, predictive modeling, and education have also been identified (Gehorsam 2003). The significance of MMPWs is two-fold. Culturally, their advent is as important as the emergence of the codex in ancient Rome. MMPWs, like computer-supported collaborative environments, are novel platforms supporting the generation, analysis, storage, and distribution of information. Their effective implementation will require the creation and testing of novel conventions for narration, expression, and documentation.
Humanities scholars are well placed to contribute to such a practice-based research agenda, in part because they have a deep knowledge of extant conventions ranging from art to literature and film, and in part because they have undertaken this type of research before with the book, in the wake of the invention of the printing press. Scholars in conjunction with printers devised, tested, copied, and codified the conventions that we now take for granted in the book, including the title page, table of contents, index, headers, footnotes and so on (Eisenstein 1979: 52; Grafton 1997; Lehmann-Haupt 1950: 53-54). MMPWs, like film at the start of the 20th century, will require the development of a new “grammar” specific to the medium of virtual environments.
Consider, for example, the problem of creating and expressing animated forms comprised of heterogeneous objects such as text, two- and three-dimensional graphics, time-based (four-dimensional) media, and sound. Such forms will require the development of a “grammar” to govern the interpolation of these objects, and, indeed, in the field of HCI, researchers such as Doug Bowman and Nicholas Polys have established a new research domain – designated "Information Rich Virtual Environments" – to support that very purpose (Polys and Bowman). Humanities scholars, as HCI researchers well know, are well placed to contribute to the implementation of such a research agenda (Blythe and Wright xv-xvi). The present is hardly the first time in which communicants have sought to combine heterogeneous forms of representation to express what they know and believe. As Peter Daly and others have observed, humanities scholars can point to a rich array of practices that were designed to support emblematic literature and its precursors, including Egyptian hieroglyphics, medieval bestiaries and lapidaries, Renaissance loci communes, and commonplace books (Daly 1998: 9-41). As practitioners and scholars begin the process of formulating new conventions and work practices for virtual and other digital environments, it would be a serious mistake to overlook the design heritages of the West and East. Humanities scholars have the deepest knowledge of these practices, and accordingly would be well placed to initiate and lead research programs dedicated to their appropriation and adaptation to digital environments.
Such research will require the support of High Performance Computing because the datasets employed will be extremely large. In domains such as history, for example, scholars will be considering how to structure and interpolate information in virtual environments comprised of thousands, hundreds of thousands, and even millions of agents. The incorporation of the Rome Reborn project’s model of ancient Rome into Google Earth is one precursor of the kinds of historical reconstructions that historians will attempt to create in virtual environments, given sufficient time, resources, personnel, and computational power (Ancient Rome in 3D 2009). Such representations in the future will comprise not only buildings but also the populace that lived and worked in them – autonomous avatars capable of modeling the behaviour of a given populace over the course of a given day.
Processing and rendering buildings and avatars alone in real time will require far more computational power than historians can access through the personal computer; and buildings and avatars are only part of the picture, since the other constituents of urban life would also need to be modelled, including prosaic objects like pottery, complex objects like plants, and external objects like telephone poles. If historians and other humanities scholars intend to appropriate the analytical and expressive potential of MMPWs and other large virtual environments, they will need to learn how to design and construct them. They will need to create and test expressive and attestive practices, tools, and workflows to support narration, expression, and documentation in virtual environments. And to do that, they will require the support of High Performance Computing clusters capable of tera-scale levels of computing.
Secondly, aside from their expressive implications, MMPWs are significant because of their potential economic impact. This form of argument is one that social science and humanities scholars generally prefer to avoid, largely because many feel that the value of research should never be subordinated to economic considerations. Research, particularly in the human sciences, should be judged by other criteria: its contribution to knowledge, and its potential cultural, social, or political impact. I understand the position. In some cases, I even sympathize with it. But with respect to the future deployment of HPC in the human sciences, I suggest that such a position is counter-productive. If SSH researchers move into the domain of HPC along the lines I am describing, they will be asking a great deal from CFI and Compute Canada. In essence, they will be asking for clusters specifically dedicated to SSH projects. Such a request will be necessary because researchers will require access to HPC either on a protracted basis (one, two, or three days) in the case of serious games, or a persistent basis (months, potentially years) in the case of MMPWs. For Compute Canada and CFI to agree to such a request, the SSH community must provide arguments that will resonate with policy-makers. Economic arguments, in my estimation, will provide the most persuasive rationale for CFI and Compute Canada to allocate HPC resources for the purposes we designate.
This point can be elaborated by considering the impact of Canada’s cultural sector generally, and the potential economic impact of MMPWs particularly. In 2004, Statistics Canada reported that Canada’s arts and cultural sectors contributed some forty billion dollars to the country’s gross domestic product (The Canada Council for the Arts 2007). On a more concrete level, the producers of the science fiction series Stargate have estimated that the franchise has generated some five hundred million dollars (US) for the British Columbia economy over the years (McNamara 2006). With respect to the potential economic impact of MMPWs, in 2005 the game World of Warcraft generated some two hundred and fifty million dollars (US) in revenue (Schiesel 2005).
These figures to my mind present five important implications for Canada’s humanities and social science community and SSHRC:
This conclusion raises the question as to what SSHRC should do now, and in the near and far future, with respect to facilitating the integration of HPC into the research agendas of the disciplines it represents. I have no specific policy recommendations. What I can offer is three principles that I believe should govern the formation of future policy.
Principle One: SSHRC and Canada’s social science and humanities research communities should pursue the strategic objective of obtaining dedicated clusters in future iterations of Canada’s HPC infrastructure.
For the reasons specified above, SSH researchers will require the capacity to interact on a sustained basis with the virtual environments and simulations they are using to support their research. They will further require dedicated clusters within Canada’s HPC infrastructure to support that research. However, I do not believe the acquisition of such clusters to be an achievable aim in this round of funding, for two reasons. First, the community of humanities and social science researchers is currently too small to merit the allocation of one or more clusters. Second, Compute Canada’s proposal has already been approved for funding. The proposal was based on a certain understanding of the needs of our various research communities, and our respective communities no longer have the time or opportunity to change that understanding. Our efforts are better dedicated toward preparing for future iterations of Canada’s HPC infrastructure.
Principle Two: SSHRC should develop a deeper working relationship with CFI and Compute Canada in order to realize the aspiration of increased access to HPC.
Rationale: At this stage, there is little reason to expect more from CFI and Compute Canada. If HPC is to be integrated into SSH research and practice, I am suggesting that the human science community and SSHRC on the one hand, and CFI and Compute Canada on the other, must jointly make that integration a long-term strategic goal to be realized in ten to twenty years’ time.
If SSHRC means to encourage the incorporation of HPC into social science and humanities research, it must deepen its relationship with CFI at all levels, and develop a relationship with Compute Canada so that there is an understanding in both organizations of the present and future research needs of our respective communities. SSHRC and SSH scholars, for their part, must present CFI and Compute Canada with concrete research initiatives along the lines I have described. Otherwise, neither organization will have an incentive to adjust the configuration and operation of Canada’s HPC infrastructure.
Principle Three: SSHRC, CFI, and Compute Canada should take individual and collective steps to encourage the next generation of SSH researchers to incorporate HPC into their research.
Rationale: Such encouragement, I believe, will not come from established scholars in the disciplines represented by SSHRC. Most scholars have a surface understanding, if any, of high performance computing, and would not likely know how to exploit it even if they did have a deeper understanding. The potential for social science and humanities HPC rests with today’s graduate students. Realization of that potential will only come if they are exposed to the opportunities that HPC will afford.
Ancient Rome in 3D. “See Ancient Rome in 3D.” Google Earth Website. 2009. Web. 19 May 2009 <http://earth.google.com/rome/>.
Barrett, Chris L., Stephen G. Eubank, and James P. Smith. “If Smallpox Strikes Portland… ‘Episims’ Unleashes Virtual Plagues in Real Cities to See How Social Networks Spread Disease. That Knowledge Might Help Stop Epidemics.” Scientific American 292.3 (2005): 54-61. Web. 19 May 2009 <http://www.scientificamerican.com/article.cfm?id=if-smallpox-strikes-portl>.
Blythe, Mark and Peter Wright. “From Usability to Enjoyment.” Funology. Eds. Mark A. Blythe, Kees Overbeeke, Andrew F. Monk, and Peter C. Wright. New York: Kluwer Academic, 2004: xiii-xix. Print.
c3.ca. Engines of Discovery: The 21st Century Revolution. 2005. Web. 19 May 2009 <www.c3.ca/LRP>.
Canada Council for the Arts, the. “Government Investments in Arts and Culture Yield Very Significant Returns.” Canada Council for the Arts Website. 2007. Web. 19 May 2009 <http://www.canadacouncil.ca/aboutus/Promotion/qn127306575550156250.htm>.
Daly, Peter Maurice. Literature in the Light of the Emblem: Structural Parallels between the Emblem and Literature in the Sixteenth and Seventeenth Centuries. Toronto: U of Toronto P, 1998. Print.
Dawes, Sharon S., Anthony M. Cresswell, and Bruce B. Cahan. “Learning from Crisis: Lessons in Human and Information Infrastructure from the World Trade Center Response.” Social Science Computer Review 22.1 (2004): 52-66. Print.
Eisenstein, Elizabeth L. The Printing Press as an Agent of Change: Communications and Cultural Transformations in Early-Modern Europe. Volumes I and II. Cambridge, UK: Cambridge UP, 1979. Print.
Farenc, Nathalie, et al. “A Paradigm for Controlling Virtual Humans in Urban Environment Simulations.” Applied Artificial Intelligence Journal 14.1 (2000): 69-91. Print.
Gehorsam, Robert. “The Coming Revolution in Massively Multi-user Persistent Worlds.” IEEE Computer 36.4 (2003): 93-95. Print.
Gonçalves, Luiz, Marcelo Kallmann, and Daniel Thalmann. “Defining Behaviors for Autonomous Agents Based on Local Perception and Smart Objects.” Computers and Graphics 26.6 (2002): 887-897. Print.
Grafton, Anthony. The Footnote: A Curious History. Cambridge, MA: Harvard UP, 1997. Print.
Lehmann-Haupt, Hellmut. Peter Schoeffer of Gernsheim and Mainz. Rochester, New York: The Printing House of Leo Hart, 1950. Print.
McNamara, Mary. “Stargate 200: Science-Fiction Series ‘SG-1’ Is Cable’s First to Reach Historic Milestone.” Multichannel News: A Variety Group Publication. 2006. Web. 19 May 2009 <http://www.multichannel.com/article/CA6332083.html>.
Polys, Nicholas F. and Doug A. Bowman. “Design and Display of Enhancing Information in Desktop Information-Rich Virtual Environments: Challenges and Techniques.” Virtual Reality 8 (2004): 41-54. Print.
Rockwell, Geoffrey, Ian Lancashire, and Ray Siemens. “Large-Scale Text Analysis.” Compute Canada – A Proposal to the Canada Foundation for Innovation – National Platforms Fund. 2006. Web. 19 May 2009 <http://www.c3.ca/ce/NPF-FINAL.pdf>.
Schiesel, Seth. “World of Warcraft Keeps Growing, Even as Players Test Its Limits.” New York Times. 2005. Web. 19 May 2009
Thalmann, Daniel, Soraia Raupp Musse, and Marcelo Kallmann. “From Individual Human Agents to Crowds.” Informatik Magazine 1 (2000): 6-11. Print.
 Compute Canada replaced c3.ca, a national organization founded in 1997 to advocate for High Performance Computing in Canada. The new organization functions both as an advocacy group and as a governing body charged with ensuring that the proposed platform functions as a national infrastructure. Compute Canada is comprised of researchers, the seven originating HPC networks, and Canadian universities. See (Compute Canada 2009).
 Atlantic Computational Excellence Network. Members: Dalhousie U., Memorial U., Mount Allison U., St. Francis Xavier U., St. Mary's U., U. of New Brunswick, U. of Prince Edward Island, Acadia U., Cape Breton U.
 Consortium Laval UQAM McGill and Eastern Quebec for High Performance Computing. Members: McGill U., U. Laval, UQAM, and the other branches and institutes of l'Université du Québec: UQAC, UQTR, UQAR, UQO, UQAT, ÉTS, ENAP, and INRS.
 SciNet. Member: University of Toronto.
 High Performance Computing Virtual Laboratory. Members: Carleton U., Loyalist College, Queen's U., Royal Military College, Ryerson U., Seneca College, U. of Ottawa.
 Réseau québécois de calcul de haute performance. Members: Bishop's U., Concordia U., École Polytechnique, U. de Montréal, U. de Sherbrooke.
 Shared Hierarchical Academic Research Computing Network. Members: Brock U., Fanshawe College, U. of Guelph, Lakehead U., Laurentian U., Sir Wilfrid Laurier U., McMaster U., Ontario College of Art and Design, U. Ontario Institute of Technology, Sheridan College, Trent U., U. of Waterloo, U. of Western Ontario, U. of Windsor, York U.
 Western Canada Research Grid. Members: Athabasca U., Brandon U., Simon Fraser U., U. of Alberta, U. of British Columbia, U. of Calgary, U. of Lethbridge, U. of Manitoba, U. of Northern British Columbia, U. of Regina, U. of Saskatchewan, U. of Victoria, U. of Winnipeg.
 I am indebted to Geoffrey Rockwell, who brought this issue to my attention in discussion prior to the composition of this article.
 For one recent example of a social science contribution toward improving the operational effectiveness of policy-makers and first responders see (Dawes, Cresswell and Cahan 2004).
 Researchers such as Daniel Thalmann and Marcelo Kallmann have pioneered much of the research dedicated to creating avatars capable of functioning autonomously in virtual environments. See (Gonçalves, Kallmann and Thalmann 2002) (Thalmann, Musse and Kallmann 2000) and (Farenc et al 2000).
 Tera-scale computing refers to the capacity of current HPC clusters to perform one trillion calculations per second on one trillion bytes of data. HPC computers are now moving toward peta-scale levels of computing: one quadrillion calculations per second on one quadrillion bytes of data.
 It is also worth noting that the Canada Council is the source of this statistic. Its appeal to economic arguments suggests that economic and artistic interests are not always at odds. A similar argument could be made that there is not always a conflict between the research interests of SSH scholars and research initiatives that would serve the economic interests of Canada. See (The Canada Council for the Arts 2007).