The Triple Helix: Cyberinfrastructure, Scholarly Communication, and Trust

Despite the image of the solitary individual at the bench or desk, scientific research — and research in general — is a process of communication as well as investigation and discovery. It is actually intensely social. Thomas Edison is credited by some with having invented the industrial laboratory in the 1870s and 1880s when he hired talented scientists to work on problems of electricity and engineering at his northern New Jersey labs.[1] Today, research teams are the norm, and we study them as sociological objects.[2] We even have a journal, Social Studies of Science (Sage Publications), whose October 2007 issue (vol. 31) includes articles about history, social dynamics within medical research, ethical implications of biomedical research, politics, public health, and assigning credit and authorship. In this special issue of the Journal of Electronic Publishing, Jeremy Birnholtz reports on his research on exactly this point, the tension between formal and informal systems of authorship in the high-energy physics community at CERN (Conseil Européen pour la Recherche Nucléaire, or European Council for Nuclear Research). This community of researchers exemplifies engineering-enabled big science in its use of advanced technology to plumb fundamental questions, its large and distributed teams, and its associated social systems. CERN specifically, and high-energy physics generally, are simultaneously consumers and producers of “cyberinfrastructure.” Famously, Sir Tim Berners-Lee’s work that culminated in the World Wide Web began as a hypertext project to manage information for thousands of researchers and hundreds of computer systems.[3]

The articles that follow examine the role of communications in the computationally intensive research infrastructure known as the “cyberinfrastructure.” Infrastructure can be a slippery concept, in part because infrastructure systems are hierarchical. For example, local roads provide the transportation infrastructure for a community; the national interstate highway system incorporates those roads as feeder systems into a national network of high-speed, limited-access throughways. We can think of cyberinfrastructure as the connective tissue of shared services, capabilities, and resources that mediates between users and the discrete entities that the infrastructure systems knit together. Similarly, a router by itself is a piece of equipment; a router as an element in a system of nodes and links is a component of the Internet infrastructure, and as such, that router helps propel both the notion of cyberinfrastructure and the research that it enables.

Cyberinfrastructure, Communication, and Scholarship

The term “cyberinfrastructure” originated in a report by the National Science Foundation (NSF) where it is defined as the comprehensive infrastructure required to capitalize on advances in information technology, which “integrates hardware for computing, data and networks, digitally-enabled sensors, observatories and experimental facilities, and an interoperable suite of software and middleware services and tools.”[4] The American Council of Learned Societies subsequently adopted the term in its report on a cyberinfrastructure for the humanities, and it has crept into routine discourse in higher education and advanced research. The cyberinfrastructure is composed of three principal layers: the two layers of the network (a physical layer and a logical layer), and the shared facilities, resources, and services that broadly enable research and with which the end user eventually interacts. All three of these dimensions — physical, logical, and social — are intertwined, so that the image might be more rightly characterized as a triple helix than a stratigraphy of horizontal layers. Although the investigator at the desktop ideally should not have to worry about the magic behind the technical curtain, decisions at many points, from the network infrastructure to the user interface, affect what the scholar can do and therefore how the research progresses.

The existence of the networks and the creativity with which they have been used drive infrastructure development and justify its continuation. The network enables use, and users shape the network, explicitly in the way that they build hardware and software, and implicitly in their demands for services. These demands require more than simply bandwidth; they also trigger development of utilities on which other services and applications are built, so that the infrastructure itself grows progressively more complex — just as the national transportation infrastructure is composed of complementary and competing road, water, and air systems of local, regional, and national scope. In cyberspace, the Web browser has evolved in barely an academic generation from a user interface to a platform for Web 2.0 services, becoming a part of the infrastructure and at times almost invisible to the user. In short, success has begotten success.

Expectations of IT support have escalated, straining resources on university campuses. Joel M. Smith and Jared L. Cohon, respectively vice president/chief technology officer and president of Carnegie Mellon University, have noted that in the framework of supporting a university computing environment, “difficult decisions are required to select which expectations to meet and which to disappoint.”[5] Indeed, the distributed nature of the system enabled by high-speed networks means that expensive resources like data repositories and digital libraries can be consolidated and replicated in a few places while remaining locally accessible to investigators. In this issue, William Arms and Ronald Larsen discuss the sources, character, and issues associated with the emergence of “cyber-scholarship,” a new type of research that becomes possible with networked, high-performance computing and large-scale network-accessible collections of data. In another article here, Sayeed Choudhury describes one of the pioneering examples, the Johns Hopkins Virtual Observatory.

One consequence of the network is a proliferation of information objects, from databases that can be networked, to collections of articles and papers, to wikis, online lab notebooks, and so on. Investigators now share interim and final research results in an ecosystem of formal and informal communication objects with different characteristics and different audiences, and we take on the roles of reader, writer, observer, and critic, depending on the context. So far, the technologies have produced an increasingly heterogeneous information environment with more and more varied objects rather than the substitution and displacement of any one of them. Serious and sometimes heated discussions revolve around the implications of the information technologies for the traditional relationships among libraries, scholarly societies, and publishers; for their roles in the higher education and advanced research enterprise; and for traditional modes of scholarly publishing, notably the journal article and the monograph. Karla Hahn, Charles Henry, Peter Suber, and Donald Waters all consider aspects of these altered relationships and their implications for scholarship and public policy in this issue of JEP.

Two themes percolate through these issues and concerns: (1) the ability of the technology to reduce barriers and (2) the resulting increased access, first to the scholarly literature and, more recently, to the underlying research data or source material. Where exchanges have historically been limited, investigators have been able to expand the conversation at an earlier point in the research process and to expose the content of these exchanges to observers, who may not themselves be affiliated with higher education or the research establishment. In the main, there seems to be broad consensus that this has enriched the scientific enterprise,[6] and the expectation is that the humanities will witness a similar florescence. In his article in this issue, Stephen Nichols describes one example of such new technology-enabled research, based on a digital library that contains digital representations of all known copies of the medieval manuscript Roman de la rose, thus surmounting barriers created by geography, ownership, and the very fragility of the original sources and allowing undergraduates as well as senior scholars to confront the manuscript in ways not otherwise possible.

Intrinsic to the networked research environment is the model of collaborative research, which both motivated the network and is a consequence of it. While powerful, collaborative research also poses challenges for individuals and to the culture of research and scholarship in which they function. As Birnholtz points out, the very collaborative nature of the research at CERN is at cross-purposes with the prestige and reward systems of higher education, which are predicated on recognizing individual achievement. CERN has stringent protocols for authorship, but because they are so inclusive the effect has been to reduce the importance of formal publication and to strengthen the significance of informal communication, despite the general value of formal publication in key journals for promotion and tenure review.

The creation of repositories of reusable data of broad value to a discipline is especially vulnerable to the tension between the need for individual recognition and the benefit to the community of the shared resource. Indeed, new mechanisms are emerging both to provide credit and to establish validity. Sayeed Choudhury describes ways that the Johns Hopkins Virtual Observatory accommodates the value stream of higher education by acknowledging individual contributions to the reusable datasets that help build a shared resource. The structure of the Protein Data Bank[7] tracks provenance and assigns credit to those who deposit sequence data. Finally, Waters describes how meteorology investigators in the UK, with support from the Joint Information Systems Committee (JISC), have cooperated “to establish a new kind of electronic publication called a data journal, where practitioners would submit data sets for peer review and dissemination.” Such strategies not only satisfy investigators’ need to establish authorship of ideas and contributions but also offer them ways to evaluate the data for their future research and thus to determine whether to trust it.[8]

The Role of the Public

The tension between collaboration and recognition of individual achievement is one source of dislocation to traditional behaviors and practices. A second is the expanded visibility into the research process, which was hitherto constrained by physical access to offices and labs and by the formal presentation of results at professional conferences or in journals and monographs. But the public is actually quite interested in scholarship and certainly in science, whether measured by the popularity of the Smithsonian Institution’s National Air and Space Museum or by the Pew Internet and American Life Project’s 2006 survey of Internet users’ interest in science.[9] Besides offering more channels of information through websites like Discovery Online or National Geographic, the global diffusion of Internet connectivity has offered an online opportunity for volunteers to engage directly in the research process, much like those volunteer observers who call in neighborhood weather conditions to local television stations. SETI@home is a good example. Launched in May 1999, the experiment uses Internet-connected computers in the Search for Extraterrestrial Intelligence (SETI) to leverage distributed, unused capacity on home computers in lieu of special-purpose supercomputers and thus satisfy Radio SETI’s “insatiable appetite for computing power” to analyze radio telescope data.[10] Participation is not casual. Participants must first register, providing a valid e-mail account and identifying their computer, and then download the client software. Personal identity is not revealed, but information about the computer is exposed.[11] At the same time, there are explicit efforts to explain the science and engineering to the participants, who then become part of a community surrounding this enterprise and have a stake in its success.

This model of leveraging distributed capacity has been used in other contexts, notably protein folding,[12] and suits problems that can be cleanly parsed into discrete tasks. It builds on a longer tradition of structured, volunteer participation in research projects, particularly where the need for detailed observational data is simultaneously broad and deep and well beyond the resources of a given project.[13] These are wonderful examples, and the public discourse can be enriched by providing such opportunities for citizen scientists to contribute to the research enterprise and to engage in lifelong learning. But democratizing access can also introduce risks. The obvious areas of concern are privacy and data confidentiality, as Arms notes.
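The architectural pattern behind such projects can be sketched in miniature: a coordinator parses a large problem into independent work units, independent workers process the units in any order, and the coordinator aggregates the returned results. The sketch below is an illustrative simplification, not SETI@home’s actual architecture; the function names and the trivial “analysis” step are hypothetical, and local threads stand in for volunteers’ Internet-connected computers.

```python
from concurrent.futures import ThreadPoolExecutor

def analyze_unit(unit):
    # Hypothetical stand-in for the real analysis (e.g., scanning one
    # chunk of radio-telescope data); each unit is fully independent.
    return sum(x * x for x in unit)

def run_distributed(data, unit_size=4):
    # Coordinator: parse the large problem into discrete work units.
    units = [data[i:i + unit_size] for i in range(0, len(data), unit_size)]
    # Farm the units out to independent workers (threads here stand in
    # for volunteers' machines); completion order does not matter
    # because the units do not interact.
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = pool.map(analyze_unit, units)
    # Aggregate the partial results into the final answer.
    return sum(results)
```

Because the units never communicate with one another, the same aggregate answer emerges regardless of how many workers participate or in what order units complete — the property that lets projects like SETI@home absorb volunteer capacity opportunistically.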

Biomedical and social science research have formal requirements for protecting individuals’ privacy, but other disciplines also have concerns about data confidentiality that become more important when physical barriers to access no longer tacitly protect the data from broad public view. Certain kinds of anthropological research may have political overtones associated with indigenous rights movements. Archaeological collections have long restricted access to locational information about sites, since looting destroys the scientific value of the resource. Distribution of water, mineral, and other natural resources has commercial implications, and information about these resources has profound economic and political ramifications.[14] Moreover, in the global context of science, cultural mores may affect what is considered proper for viewing. Building collections and providing access to information are refracted through a cultural prism by both creators and users, and successful systems, like all successful communication systems, must take into account both sides of the exchange.[15]

The notion of privacy extends to the researchers as well, a topic less frequently acknowledged. Research results, not just the data, are potentially exposed when they are still speculative. This has the useful effect of allowing more exchanges among scholars, particularly in niche areas. But it also opens the door to misunderstanding, especially when the results are at their most speculative and feedback might be most useful. Freedom to fail is an important dimension of research, and inappropriate visibility potentially constrains that freedom. Sure, some amorphous “they” laughed at Einstein and Galileo, but to complete this well-known quip by the late Carl Sagan, the same “they” also laughed at Bozo the Clown. Few careers outside of the circus have advanced as a result of having been labeled the work of “Bozo the Clown”; few scientific careers have flourished as a result of having won a Golden Fleece award from the late Senator William J. Proxmire (who perused titles of scientific research projects, necessarily out of context, to identify the most absurd-sounding as examples of waste in government spending).[16] Moreover, as Huntington Willard, head of the Institute for Genome Sciences & Policy at Duke University, pointed out in his remarks at the North Carolina Science Blogging Conference earlier this year, no one wants to read the phrase, “contrary to the result of Willard and his colleagues,” even though it is understood that science is an “imperfect process” and that it progresses by one generation finding the mistakes and limitations of the work that came before.[17]

Willard argued passionately for the importance of engaging the public in science, but shared several cautionary experiences. Poignantly, he described dealing with queries from gravely ill people who had read about some of his research team’s work in the New York Times, which cast the work in terms of its long-term implications for gene therapies. Readers had a different notion of “long-term” and sent him desperate messages seeking help, confusing the results of early research with prospects for therapeutic intervention. He also described downright threatening e-mail messages he had received from those who had taken exception to an op-ed piece he had written about evolution. Finally, he described the ridicule inadvertently visited on a student when the highly technical title of her award-winning research project was blared out over the loudspeaker at a football game — without context or explanation — in a well-intentioned but clumsy attempt to congratulate her.[18]

Although anecdotal, these examples remind us that audience matters and that not all audiences are alike. In contexts in which we might expect broad readership of varying levels of sophistication, we might start asking ourselves a few questions about ways to structure informal communications to encourage the exchange of ideas, engage the public, and protect the integrity of the research, including the protection of strange or unpopular ideas, or the freedom to imagine the unthinkable without being held inappropriately accountable. The examples of SETI@home and projects like it suggest that fostering community, shared context, and shared responsibility are key elements. The famous New Yorker cartoon notwithstanding, sometimes we do need to know whether the user is a dog — even on the Internet.

Communication and Trust: Knowing what to expect

Given his experience, Willard speculates that scientists of his generation remain more comfortable with incremental work published within the confines of the research journal because it is predictable and is directed toward a professional readership. Certainly, studies have consistently found that scholars do not wish to forgo publishing in the traditional journal, be it online or in print, even though they also employ other forms of communication.[19] By contrast, Birnholtz’s research on the high-energy physics community shows that although informal modes of communication and recognition outweigh formal authorship in significance, there is also unease and anxiety among junior researchers precisely because the parameters of the informal systems are not explicit. From a cultural perspective, the system of scholarly communication exercises a gatekeeper function, implicitly promising a professional community of readers and authors that share the same context, as well as a widely used but imperfect system for measuring productivity and impact in promotion and tenure reviews. In effect, the journal system promises authors a certain audience, screening out the professional equivalent of displaying a technical research poster to a football game’s audience.

So let’s think for a moment about why journals have been successful. Willard’s observation and the surveys consistent with it point to a base of support among the researchers themselves. I have suggested that the journal system offers reassuring structure, intentionally or not, that provides tacit, broadly accepted rules for communicating results and evaluating those findings for several purposes, including career advancement.

Journals also signify community, and in this regard, the recent organization of the International Journal of the Commons is telling. Following publication in Science of Garrett Hardin’s seminal article “The Tragedy of the Commons” in 1968, writers in biology, economics, political science, the humanities, and a number of other disciplines examined the notion of the commons and published their findings in more than 250 journals. But the “scattered” pattern of publication led to the establishment of the journal this year, among other reasons, to facilitate the “emergence of a research community in which the members are aware of each others’ findings.”[20] Releasing findings across a range of outlets, even when those outlets are peer reviewed and when researchers meet at various conferences, is not thought sufficient.

Finally, journals provide rigorous (although not foolproof) controls intended to minimize professional embarrassment. Researchers value the carefully scrutinized editing and the ability to consolidate findings in a recognized way, in a process that also provides a formal channel for responses through letters to journal editors and through vetted papers formally critiquing the published results. The full sequence, from paper to article to publication to responses, establishes a disciplinary consensus on the formal record as distinct from the less formal and perhaps more speculative working papers, interim reports, and so on that constitute the grey literature. Both are important.

The underlying issue is trust. Broadly conceived, the research enterprise is a social and collaborative endeavor necessarily predicated on trust, a topic that sociologists and students of organization theory have systematically studied.[21] Critical to understanding conditions favorable to building trust relationships, and hence to building collaborations, is the notion of “control,” which can be either explicit, as in policies and procedures, or implicit, as in shared values, informal exchanges, and interpersonal trust.[22] Both forms of control are necessary. As a social system, the cyberinfrastructure is similarly collaborative; it requires widely distributed cooperation among numerous people and interests to keep the systems running and the information flowing, and the more complex it becomes — expanding from the communications network of the Internet itself to include data repositories and services — the more important trust becomes, both in the performance of the network and in the content of the data that now becomes broadly accessible.

If we want people to depend on and use the infrastructure, they must trust it, just as we trust that the electricity will flow when we flip the switch. Our collective outrage when levees fail in New Orleans or a bridge collapses in Minneapolis indicates the extent of our dependency, which is predicated on trust and the expectation that such massive structures will always be there, that they will do what we expect them to do and not do what we expect them not to do.[23] Information infrastructures are more challenging because they lack the physical presence of dams and bridges, but our collective expectations are no less real. Thus the journal, which was a 17th-century invention for the communication of primary research results, is clearly no longer the first or primary port of call in the scientific communication process for working researchers in a given field. But the journal system — vetting, review, publication, distribution, and comment — has become a form of social control. Its value and associated processes are widely understood. When viewed in the framework of organization theory, the scholarly communication system can be seen as conducive to promoting an environment of trust, particularly when the participants in a collaboration may not have extensive prior experience with each other.

As Willard and others point out, science progresses by building on itself, and to do so it needs a stable, archived record that reflects some sense of the agreed-upon knowledge within a community of professionals. What is the version of results that researchers can trust so that they can build on it? Probably the juried one in Science, Nature, or one of the authoritative journals in their fields. What publications sway reviewers, whether for grants or for promotion and tenure? Probably the same ones. Lisa Spiro and Jane Sigal make a similar point in their research on the use of three digital scholarly collections in the humanities: the Walt Whitman Archive,[24] the Dickinson Electronic Archives,[25] and Uncle Tom’s Cabin and American Culture.[26] They find that scholars consult these collections more frequently than they cite them. Among the reasons given for the lack of citations were lack of awareness; confusion about the appropriate citation format, or reluctance to use a format perceived as clumsy; and, significantly, a preference for the “standard print edition, which is thought to have more credibility and be more permanent.”[27]

Whither journals?

In the last decade or so, the notion of the peer-reviewed scholarly journal has become decoupled from the fixity of print on paper, migrating the trust model of peer review to electronic journals. There is no inherent reason why the journal object could not be decoupled conceptually from the publisher just as peer review has been decoupled from print. Indeed, it has been proposed that a peer-review overlay, organized by the professional societies, could be placed on institutional repositories, enabling them to serve the archival function of a virtual journal-like aggregation and displacing the relationship between scholarly societies — which generally connect to the research communities — and publishers, which provide the management, production, and distribution platforms.[28] This is probably an oversimplification, given the management challenges that even a simple journal poses and the credibility invested in journals by researchers and the culture in which the research occurs. Moreover, the institutional repository movement, described in this issue by Kathlin Smith, has yet to gain substantial traction and cannot now promise the persistence that traditional journal publishing models can, although historically preservation in fact fell largely to the academic libraries.

Over time, though, institutional repositories, too, may well become part of the cyberinfrastructure — if they survive and if they meet the needs of their constituencies. The interesting questions are: What functions will these repositories fulfill? And will journals as vehicles for formal communication prove redundant? My guess is that we will continue to have both journals and repositories, as well as objects like the complex website that Nichols and his colleagues are building. The key to the success of these new objects, including ones we have yet even to imagine, will be their ability, like the Roman de la rose digital library or the scientific databases, to embody scholarly values, take advantage of the affordances of the new medium, and win acceptance and widespread use as distinctive forms of scholarship and expression.

For now, the scholarly publication system fulfills a substantial component of a broad requirement for a collective, trusted, persistent record for multiple audiences in a way that distributed papers and information objects of varying stability, polish, quality, and credibility presently do not. This is not to say that any of these forms of communication is unimportant; rapid communication of interim findings is vital, and its importance can be seen in the eagerness of researchers to sign up for RSS feeds and alerting systems. It is to say that these resources serve different functions and address different audiences.

Scholarly publications have been successful because they have met core needs of their respective communities and embody a shared trust model. There is no inherent reason why other forms of communication cannot become components of the cyberinfrastructure and the research environment that infrastructure enables, extending and augmenting formal communications without replacing them. Although they are ubiquitous, foundational, and sometimes physically massive, infrastructure systems evolve and knit together new components quite effectively. It is but a question of time and trust. And of the cyberinfrastructure’s ability to embody the physical, logical, and social helix to meet users’ expectations so effectively that those users forget about the magic behind the curtain, take the systems for granted, and go about their work. And yes, even the dog can participate.

Now Director of Programs at the Council on Library and Information Resources (www.clir.org), Amy Friedlander is the founding editor of D-Lib Magazine (www.dlib.org) and the author of a five-volume history of large-scale, technology-intensive infrastructures in the U.S. for the Corporation for National Research Initiatives (www.cnri.org/series.html). Her interest in infrastructure was piqued by her research into five large U.S. infrastructure systems: railroads, telephone and telegraph, electricity, banking, and radio. Ms. Friedlander continues to write frequently on topics at the intersection of information technology, libraries, higher education, and advanced research.


    1. Edison maintained labs at Menlo Park and then West Orange, New Jersey. This view of Edison’s role in the invention of the research laboratory is described in Thomas Parke Hughes, Networks of Power: Electrification in Western Society, 1880–1920 (Baltimore and London: The Johns Hopkins University Press, 1983), 21–27, and amplified by Robert Friedel and Paul Israel with Bernard S. Finn, Edison’s Electric Light: Biography of an Invention (New Brunswick, NJ: Rutgers University Press, 1986). The role of AT&T and Bell Labs is emphasized by Lillian Hoddeson, “The Emergence of Basic Research in the Bell Telephone System, 1875–1915,” Technology and Culture 21 (1981): 512–44; and Leonard S. Reich, “Industrial Research and the Pursuit of Corporate Self-Security: The Early Years of Bell Labs,” Business History Review 54 (Winter 1980): 504–29.

    2. See, for example, Jonathon Cummings and Sara Kiesler, “Coordination Costs and Project Outcomes in Multi-University Collaborations,” July 15, 2007, http://www.cs.cmu.edu/~kiesler/publications/PDFs/ResearchPolicy7-15-07.pdf.

    3. Living Internet, Web History, “Tim Berners-Lee, Robert Cailliau, and the World Wide Web,” http://www.livinginternet.com/w/wi_lee.htm.

    4. National Science Foundation, Cyberinfrastructure Council, Cyberinfrastructure Vision for 21st Century Discovery (Arlington, VA: National Science Foundation, Cyberinfrastructure Council, March 2007), p. 6, http://purl.access.gpo.gov/GPO/LPS80410.

    5. Joel M. Smith and Jared L. Cohon, “Managing the Digital Ecosystem,” Information Technology and the Research University, Issues in Science and Technology (Fall 2005), http://www.issues.org/22.1/smith.html (accessed Jan. 10, 2008).

    6. This conclusion is actually hard to pin down but is based on my own experience as a participant in and observer of discussions among researchers; it is also suggested by C. Lynch, “The Shape of the Scientific Article in the Developing Cyberinfrastructure,” CTWatch Quarterly 3, no. 3 (August 2007), http://www.ctwatch.org/quarterly/articles/2007/08/the-shape-of-the-scientific-article-in-the-developing-cyberinfrastructure/, in his introductory discussion of the contextual factors contributing to the rethinking of the scientific article.

    7. RCSB Protein Data Bank, http://www.rcsb.org/pdb/home/home.do. An entry in the PDB includes the following information: Title, Authors, Primary Citation, History, Experimental Method, Molecular Description, Asymmetric Unit, Classification, Source, Ligand Chemical Component, and GO Terms. The data file includes provenance information and “Remarks,” which capture any correspondence between reviewers and depositors and are permanently associated with the sequence data. Thus, the data itself is subjected to vetting and review analogous to journal article review, provides credit to the author, and is transparent to future users.

    8. Lynch makes a similar point (ibid.); see especially his section “Scientific Articles and their Relationships to Data.”

    9. John Horrigan, “The Internet as a Resource for News and Information about Science,” Pew Internet and American Life Project, November 20, 2006; http://www.pewinternet.org/pdfs/PIP_Exploratorium_Science.pdf. The project was conducted in cooperation with the Exploratorium, which is located in San Francisco, and with funding from the NSF. Probably not surprisingly, younger, white, well-educated Internet users with broadband access at home turn to Internet sources about science and technology more than other demographic groups. Also not surprisingly, the most important issues to these users were topics related to origins of life, climate change, and stem cells. The National Air and Space Museum is said to be the world’s most visited museum; see National Air and Space Museum Press Kit, “Frequently Asked Questions,” http://www.nasm.si.edu/events/pressroom/presskits/museumkit/qanda_nasm.cfm (accessed Jan. 10, 2008).

    10. “About SETI@home,” http://setiathome.berkeley.edu/sah_about.php (accessed Jan. 10, 2008).

    11. SETI@home, “Read our rules and policies,” http://setiathome.berkeley.edu/info.php (accessed Jan. 10, 2008).

    12. Folding@home, http://folding.stanford.edu/ (accessed Jan. 10, 2008).

    13. Public participation in archaeological investigations is a well-known example of this practice in analog contexts. Libraries, museums, hospitals, and schools also rely to varying degrees on volunteers, mobilizing the excess labor of these frequently well-educated individuals.

    14. See, for example, Lydia Polgreen’s stories in the New York Times about the underground lake in Darfur detected by Boston University scientists using remote sensing technology, reporting not only the presence of the lake but also the 20 years of politics that have contributed to environmental degradation and that many believe stoke ethnic rivalries and the current genocide. See Lydia Polgreen, “A Godsend for Darfur, or a Curse?” Week in Review, New York Times, July 22, 2007, http://www.nytimes.com/2007/07/22/weekinreview/22polgreen.html?_r=1&oref=slogin (accessed Jan. 10, 2008); Lydia Polgreen, “Sudan: Underground Lake Could Ease Darfur’s Crisis,” World Briefing | Africa, New York Times, July 19, 2007, http://query.nytimes.com/gst/fullpage.html?res=940DE5DF1431F93AA25754C0A9619C8B63 (accessed Jan. 10, 2008). For the press release from Boston University, see “Hope for the Sahara,” May 17, 2007, http://www.bu.edu/phpbin/news-cms/news/?dept=4&id=45099&template=4 (accessed Jan. 10, 2008).

    15. See, for example, Andy Guess, “Downloading Cultures,” Inside Higher Ed News, December 3, 2007, http://insidehighered.com/news/2007/12/03/newmedia (accessed Jan. 10, 2008).

    16. Huntington Willard, “Promoting Public Understanding of Science,” 2007 NC Science Blogging Conference, January 20, 2007, http://wiki.blogtogether.org/blogtogether/show/Conference+Program+07 (accessed Jan. 10, 2008). Willard spoke eloquently of the damage done by quoting scientific terms out of context, especially citing the Golden Fleece awards.

    17. Ibid.

    18. Ibid.

    19. In their initial study, King and Harley found that the critical issue was peer review, which could be conceptually separated from journals but in practice was associated with them; see C. Judson King, Diane Harley, et al., “Scholarly Communication: Academic Values and Sustainable Models,” Center for Studies in Higher Education, University of California, Berkeley, July 27, 2006, http://repositories.cdlib.org/cshe/CSHE-16-06/ (accessed Jan. 10, 2008), pp. 5–6. A larger study, based on these early results, is now in progress. A second study, prepared by the University of California Office of Scholarly Communication and the California Digital Library eScholarship Program in cooperation with Greenhouse Associates, Inc., found that faculty “overwhelmingly rely on traditional forms of publishing, such as peer-reviewed journals and monographs”; see “Faculty Attitudes and Behaviors Regarding Scholarly Communication: Survey Findings from the University of California,” August 2007, http://osc.universityofcalifornia.edu/responses/materials/OSC-survey-full-20070828.pdf (accessed Jan. 10, 2008), p. 4. Finally, faculty surveys conducted by the research organization Ithaka found that faculty were more willing to move away from print journals to online journals, but they remained wedded to journals, and increasing numbers of them believed in the importance of their long-term preservation (82 percent in 2006, up from 74 percent in 2003); see Roger Schonfeld and Kevin M. Guthrie, “The Changing Information Services Needs of Faculty,” Educause Review (July/August 2007), p. 9.

    20. Frank van Laerhoven and Elinor Ostrom, “Traditions and Trends in the Study of the Commons,” International Journal of the Commons 1, no. 1 (October 2007): 8–9.

    21. On cooperation and human societies, see Noah P. Mark, “Cultural Transmission, Disproportionate Prior Exposure, and the Evolution of Cooperation,” American Sociological Review 67 (June 2002): 323. Note that Mark argues that the experience of trust among individuals generates trust over time so that there arises a cultural transmission of the notion of trust. For a broad discussion of the notion of trust in economic relationships, see Francis Fukuyama, Trust: The Social Virtues and the Creation of Prosperity (New York: Simon & Schuster, 1996). Studies of collaboration in higher education resonate with the findings from the management literatures, particularly the role of trust; see Robert B. Stein and Paula M. Short, “Collaboration in Delivering Higher Education Programs: Barriers and Challenges,” The Review of Higher Education 24 (2001): 417–35; and Adrianna Kezar, “Redesigning for Collaboration in Learning Initiatives: An Examination of Four Highly Collaborative Campuses,” The Journal of Higher Education 77 (2006): 804–38. Michael Day of UKOLN has made a similar point about collaboration and cyberinfrastructure, based on a separate line of research; see Michael Day, “Toward Distributed Infrastructures for Digital Preservation: The Roles of Collaboration and Trust,” http://www.ukoln.ac.uk/preservation/publications/2007/idcc-2007/day/draft.pdf (accessed Jan. 10, 2008).

    22. Margaret H. Christ, Karen L. Sedatole, and Kristy L. Towry, “All Control is Not Equal: The Effect of Formal Control Type on Trust and Cooperation” (July 2006), AAA Management Accounting Section (MAS) 2006 Meeting Paper, available at SSRN: http://ssrn.com/abstract=776884 (accessed Jan. 10, 2008), 1.

    23. I am indebted to Stephen J. Lukasik, Director of the U.S. Defense Advanced Research Projects Agency (DARPA) from 1971 to 1975, for this definition of trust and its relationship to infrastructure. He writes in an e-mail message dated December 5, 2007: “The construction [of this definition of trust] comes from the definition of command and control in military usage. Command means that forces do what you tell them to do and control means forces do not do what you have not told them to do. In the case of trust, the use of ‘expectation’ is intentionally fuzzy. Trust is not a matter of do or do not do. It has to do with the nature of a two-sided relationship where each has expectations, but these are not absolute or even invariant. I trust you not to intentionally wreck my car when you borrow it, but I do not have any expectation that if you go to buy me a CD you will get what I most want.” Lukasik’s explanation resonates with research that suggests that individuals are more likely to trust others if they believe that trust will be reciprocated; see, for example, Laurie P. Milton and James D. Westphal, “Identity Confirmation Networks and Cooperation in Work Groups,” Academy of Management Journal 48 (2005): 199, 205; Christ, Sedatole, and Towry, “All Control is Not Equal,” 22–23.

    24. http://www.whitmanarchive.org

    25. http://www.emilydickinson.org

    26. http://www.iath.virginia.edu/utc/

    27. Lisa Spiro and Jane Segal, “Scholars’ Usage of Digital Archives in American Literature,” November 30, 2007, unpublished manuscript cited by permission; see Spiro’s blog, Digital Scholarship in the Humanities, http://digitalscholarship.wordpress.com/

    28. Chris Armbruster, “Society Publishing, the Internet and Open Access: Shifting Mission-Orientation from Content Holding to Certification and Navigation Services?” (July 2007), available at SSRN: http://ssrn.com/abstract=997819 (accessed Jan. 10, 2008).