Dear Readers: Have pity on me. I was supposed to go to the Fourth International Peer Review Congress in Barcelona, Spain, a wee bit earlier this year. I had worked long and hard to gather my thoughts on the progress of research on screen design, had submitted them, and lo!, had my ideas accepted for presentation at this prestigious gathering. I was on a roll, so I figured, why not go for it! After presenting my thoughts there, I planned my first ever "Mommy Vacation": driving with an old high-school chum down the late-summer coast of Spain. No work, no deadlines, no kids, no laundry, no commute, no phones. It was to be ten days of glorious bumming. Nothing but sun and surf.

I was scheduled to leave Washington, D.C. for the conference on September 12.

Needless to say, I didn't go. However, my presentation did, albeit in a truncated form. Ah, the glory of the Internet, safer for travel at the moment than my local airbus or thruway. Not only did my presentation go, but the thoughts and perspectives I sent have triggered new relationships and exchanges of opinion, the very stuff of collaboration and consensus-building. Below is a version of what I presented there, offered for your contemplation; I hope that you too will respond to me (p.benson@conservation.org) and let me know what you think of these musings.

Moving between Print and Screen: The Necessity of Collaborative Information Design

A tremendous amount of work has been done in a wide range of disciplines, including engineering, psychology, sociology, and design, on the readability and usability of information in hardcopy and electronic forms (for comprehensive reviews, see Dillon 1994; Haas 1989; Schriver 1997; Sellen & Harper, in press). Collaborative approaches to the design of information displays have led to highly functional electronic displays that help users quickly find, understand, and use the task-specific information they need in problem-solving contexts. Similar multidisciplinary approaches are necessary to improve the development of electronic and print forms of scientific information. With improved technology, legibility of computer-displayed text is no longer a critical factor in the usability of electronic information (see Jorna & Snyder 1991; Dillon 1994; O'Hara & Sellen 1997). Rather, the usability of documents now rests squarely on how well information is designed to meet readers' tasks and goals in reading. Studies of professionals at work underscore three points that are critical to the design of scientific information: paper isn't going away, reading and writing are inextricably intertwined, and readers sample and navigate text according to specific purposes and tasks.

Paper isn't going away

Numerous ethnographic studies show that, despite a dramatic increase in the use of electronic information, readers in professional settings continue to rely heavily on paper documents to accomplish goals at work (see Sticht 1977; O'Hara 1996; Schriver 1997; Adler et al. 1998; Brown et al. 2000; Sellen & Harper, in press). On the one hand, scientific information displayed in electronic form can offer readers advantages that paper cannot, including the ability to

  • display information in multiple media forms (e.g., text, video, audio),
  • search full texts quickly and accurately,
  • rapidly link to other electronic information,
  • dynamically modify or update content, and
  • store and access very large amounts of information.

On the other hand, paper gives readers advantages that electronic information cannot match at this point in the development of display technology. Paper allows readers to

  • quickly navigate through and around multiple documents,
  • read across more than one document at once,
  • annotate to support comprehension while reading, and
  • easily interweave reading and writing.

For example, readers in the workplace turn to paper documents when they want to annotate texts, mark texts to anchor the location of information, reference more than one document at a time, or write separate documents. Until electronic displays can offer readers the advantages of paper, readers will rely on both paper and electronic forms to accomplish job-related tasks. Decisions about what information to display on screen must be coordinated with what is made available on paper. The design and format of the two versions then need to be developed in the context of the "reading real estate" that is available.

Reading and writing are inextricably intertwined

Studies of readers working in professional settings underscore that readers write while reading, and writers read while writing (Sticht 1985; Bazerman 1988; Haas 1989, 1992; McGinley 1992; Wright 1994; Wright 2000; O'Hara et al. 2000; Sellen & Harper, in press). Among other writing activities, readers annotate texts to highlight structure and main points, record questions, and mark ideas that require reflection or further investigation, all of which affect their comprehension and recall of the text. Writers in professional settings draw from and reference multiple sources of information while composing, particularly in the sciences, where writers must anchor their arguments in specific places in the texts they have read in order to gain acceptance for new ideas in their discourse communities (Bazerman 1988; Rymer 1988; Myers 1990; Swales 1990; Spivey 1992; Halliday & Martin 1993; Hendry 1995; Paul & Charney 1995; Berkenkotter & Huckin 1995; Penrose & Katz 1997).

Designs for electronic displays must accommodate the limits of the technology by recognizing that readers need to write, and that readers will often choose to switch to paper when writing helps them accomplish the task at hand.

Readers sample and navigate text according to specific purposes and tasks

Readers in professional settings spend a great deal of their time with texts searching for information and engaging in activities other than word-by-word, sentence-by-sentence, linear reading (Dole et al. 1991; O'Hara 1996; O'Hara et al. 1999; Harmon & Gross 1996; O'Hara & Sellen 1997). These task-driven readers spend much of their time navigating through text, strategically sampling information, and deciding on next steps in their search. Studies of scientists, in particular, show that they sample text in distinctive patterns as they read, searching first for author names and affiliations, then skimming abstracts, introductions, reference lists, and conclusions. Readers of scientific reports often navigate and sample information in this way to determine whether a text is worth further detailed inspection (Bazerman 1988; Dillon et al. 1989; Charney 1994; Wright & Lickorish 1994; Burrough-Boenisch 1998; Hackos & Redish 1998; Hartley 1999).

User and Task Analysis

As publishers increasingly make information searchable, readable, and writeable in electronic forms, they need to consider how to structure and design that information so that it most readily supports their readers' needs. Critically, these needs include the ability to work with information on paper when necessary, to integrate writing and reading tasks easily, and to find the information required to accomplish tasks quickly and easily.

Prior to the rapid expansion and use of the Internet, problems related to the design of information on computer screens were the domain of software developers alone. Today, however, practically anyone can create a Web page and make it available for use. Yet in professional settings, many of those making decisions about Web-page design base those decisions on their own perceptions of the information itself, or on their beliefs about how and why users want the information they have to offer. The findings of decades of research in the computer industry, namely that user and task analyses must be done to inform and guide the design of electronically displayed information, often go unheeded (for overviews see Drury 1983; Hackos 1995; Drury et al. 1987; Schriver 1997; Kostelnick & Roberts 1998; Shneiderman 1998; Dumas & Redish 1999; Lynch & Horton 1999; Nielsen 2000).

Recommendation: Know thy users

To improve the usability of the electronic forms of scientific information, scientific publishers need to:

  • Gather accurate information about the tasks that readers want to accomplish when reading electronic displays by observing and interviewing users in actual work settings. This direct data collection is a critical step in developing information designs that support how users really work with electronic information in the workplace, rather than how developers think they work. Although this endeavor may take an initial investment of time and training, task analyses can be done without hiring high-priced consultants. Kirwan & Ainsworth (1992), Harmon & Gross (1996), Wixon & Ramey (1996), Hackos & Redish (1998), and Dumas & Redish (1999) are examples of straightforward instructional texts that can guide readers through task analysis and usability testing.
  • Pay attention to when readers want to use paper as you gather data about the tasks that electronic information supports (or deters), noting both when readers want to mark the documents they are reading and when they use paper documents to support writing tasks.
  • Test, test, test. After initial documents and Web pages are designed, test the designs with a range of users in the workplace, working on a range of tasks. Be willing to change features that do not work well: usually more than one iteration of an information design is needed.


Philippa J. Benson is currently the Managing Editor of Publications at the Center for Applied Biodiversity Science at Conservation International in Washington, D.C. She received her Ph.D. in Rhetoric from Carnegie Mellon University in 1994. She has worked as a science editor for the National Institutes of Health, Alcoa Corporation, the American Institutes for Research in the Behavioral Sciences, and many other organizations. She has also taught scientific communication, and writing in particular, to practicing scientists and to graduate students for well over a decade. She comes to JEP eager to throw light on the still vital art of rhetoric, that is, the ability to find the available means of persuasion in all things. She can be reached at p.benson@conservation.org.


References

Adler, A., Gujar, A., Harrison, B., O'Hara, K. & Sellen, A. 1998. A diary study of work-related reading: Design implications for digital reading devices. In Proceedings of Conference on Human Factors in Computing Systems, April 18-23, Los Angeles, CA, pp.241-248.

Bazerman, C. 1988. Shaping Written Knowledge: the Genre and Activity of the Experimental Article in Science. Madison, WI: University of Wisconsin Press.

Berkenkotter, C. & Huckin, T. 1995. Genre Knowledge in Disciplinary Communication: Cognition/Culture/Power. Hillsdale, NJ: Erlbaum.

Brown, B., Sellen, A. & O'Hara, K. 2000. A diary study of information capture in working life. In Proceedings of Conference on Human Factors in Computing Systems. April 1-6, The Hague, Netherlands. pp.438-445.

Burrough-Boenisch, J. 1998. Survey of EASE Conference delegates sheds light on IMRAD reading strategies. European Science Editing 24, pp.3-5.

Charney, D. 1994. A study of rhetorical reading: How evolutionists read 'The Spandrels of San Marco.' In J. Seltzer (Ed.), Understanding Scientific Prose, pp.203-231. Madison, WI: University of Wisconsin Press.

Dillon, A. 1992. Reading from paper versus screens: A critical review of the empirical literature. Ergonomics 35, pp.1297-1326. [doi: 10.1080/00140139208967394]

Dillon, A. 1999. So what do we know? An overview of the empirical literature on reading from screens. In Designing Usable Electronic Text: Ergonomic Aspects of Human Information Usage. pp.28-58. Bristol, PA: Taylor & Francis.

Dillon, A., Richardson, J. & McKnight, C. 1989. Human factors of journal usage and design of electronic texts. Interacting with Computers 1, pp. 183-189. [doi: 10.1016/0953-5438(89)90025-8]

Dole, J.A., Duffy, G.G., Roehler, L. & Pearson, P. 1991. Moving from the old to the new: Research on reading comprehension instruction. Review of Educational Research 61(2), pp.239-264.

Drury, C.G. 1983. Task analysis methods in industry. Applied Ergonomics 14, pp.19-28. [doi: 10.1016/0003-6870(83)90215-6]

Drury, C., Paramore, B., Van Cott, H., Grey, S. & Corlett, E. 1987. Task Analysis. In G. Salvendy (Ed.), Handbook of Human Factors, pp. 370-401. New York: Wiley.

Dumas, J. & Redish, J. 1999. A Practical Guide to Usability Testing. Exeter, UK: Intellect.

Haas, C. 1989. Does the medium make a difference? Two studies of writing with pen and paper and with computers. Human-Computer Interaction 4, pp.149-169. [doi: 10.1207/s15327051hci0402_3]

Haas, C. 1992. Writing technology: Studies on the materiality of literacy. Mahwah, NJ: Erlbaum.

Hackos, J. 1995. Finding out what users need and giving it to them: A case-study at Federal Express. Technical Communication 42, pp.322-327.

Hackos, J. & Redish, J. 1998. User and Task Analysis for Interface Design. New York: Wiley.

Halliday, M. & Martin, J. 1993. Writing Science: Literacy and Discursive Power. Pittsburgh: University of Pittsburgh Press.

Hartley, J. 1999. From structured abstracts to structured articles: A Modest Proposal. Journal of Technical Writing and Communication 29, pp.255-270. [doi: 10.2190/3RWW-A579-HC8W-6866]

Hendry, D.G. 1995. Breakdowns in writing intentions when simultaneously deploying SGML-marked texts in hard copy and electronic copy. Behaviour & Information Technology 14, pp.80-92. [doi: 10.1080/01449299508914628]

Jorna, G.C. & Snyder, H.L. 1991. Image quality determines differences in reading performance and perceived image quality with CRT and hard-copy displays. Human Factors 33, pp.459-469.

Kirwan B. & Ainsworth, L. (Eds.). 1992. A Guide to Task Analysis. London: Taylor & Francis.

Kostelnick, C. & Roberts, D. 1998. Designing Visual Language: Strategies for Professional Communicators. Needham Heights, MA: Allyn & Bacon.

Lynch, P.J. & Horton, S. 1999. Web Style Guide: Basic Design Principles for Creating Web Sites. New Haven, CT: Yale University Press.

McGinley, W. 1992. The role of reading and writing while composing from multiple sources. Reading Research Quarterly 27, pp. 227-248. [doi: 10.2307/747793]

Myers, G. 1990. Writing Biology: Texts in the Social Construction of Scientific Knowledge. Madison: University of Wisconsin Press.

Nielsen, J. 2000. Designing Web Usability: The Practice of Simplicity. Indianapolis, IN: New Riders.

O'Hara, K. 1996. Towards a Typology of Reading Goals. XRCE (Xerox) Technical Report EPC-1996-107.

O'Hara, K. & Sellen, A. 1997. A comparison of reading paper and on-line documents. Proceedings of Conference on Human Factors in Computing Systems, March 22-27, Atlanta, GA, pp.335-342.

O'Hara, K., Sellen, A. & Bentley, R. 1999. Supporting Memory for Spatial Location while Reading from Small Displays. XRCE (Xerox) Technical Report EPC-1999-108.

O'Hara, K., Taylor, A., Newman, W. & Sellen, A. 2000. Understanding the Materiality of Writing while Reading from Multiple Sources. Paper submitted to ACM SIGCHI, 2001.

Paul, D. & Charney, D. 1995. Introducing chaos (theory) into science and engineering reports: Effects of rhetorical strategies on scientific readers. Written Communication 12, pp. 396-438. [doi: 10.1177/0741088395012004002]

Penrose, A. & Katz, S. 1997. Writing in the Sciences: Exploring Conventions of Scientific Discourse. New York: St. Martin's Press.

Rymer, J. 1988. The scientific composing process: How eminent scientists write articles. In D. Jolliffe (Ed.), Writing in Academic Disciplines, pp.211-250. Norwood, NJ: Ablex.

Shneiderman, B. 1998. Designing the User Interface: Strategies for Effective Human-Computer Interaction. (3rd Ed.) Reading, MA: Addison-Wesley.

Schriver, K. 1997. Dynamics in Document Design. New York: Wiley.

Sellen, A. & Harper, R. (in press). The Myth of the Paperless Office. Cambridge, MA: MIT Press.

Spivey, N. 1992. Writing from sources. In J.R. Hayes, R.E. Young, M. Matchett, M. Caffrey, C. Cochran & T. Hajduk (Eds.), Reading Empirical Research Studies. pp. 467-512. Hillsdale, NJ: Erlbaum.

Sticht, T.G. 1977. Comprehending reading at work. In M.A. Just and P.A. Carpenter (Eds.), Cognitive Processes in Comprehension. pp.221-246. Hillsdale, NJ: Erlbaum.

Sticht, T.G. 1985. Understanding readers and their uses of text. In T.M. Duffy & R.H. Waller (Eds.), Designing Usable Texts. pp.315-340. New York: Academic Press.

Swales, J. 1990. Genre Analysis: English in Academic and Research Settings. Cambridge: Cambridge University Press.

Wixon, D. & Ramey, J. 1996. Field Methods Casebook for Software Design. New York: Wiley.

Wright, P. 1994. Quality or usability? Quality writing provokes quality reading. In M. Steehouder, C. Jansen, P. van der Poort & P. Verheijen (Eds.), Quality of Technical Documentation. Amsterdam: Editions Rodopi. pp.1-38.

Wright, P. 2000. Document-based decision-making. In J. Rouet, J. Levonen & A. Biardeau (Eds.), Integrating Text and Graphics in Computer-supported Learning Environments: Cognitive and Instructional Issues. pp.31-43. New York: Plenum.

Wright, P. & Lickorish, A. 1994. Menus and memory load: Navigation strategies in interactive search tasks. International Journal of Human-Computer Studies 40, pp. 965-1008. [doi: 10.1006/ijhc.1994.1045]


Links from this article:

Fourth International Peer Review Congress in Barcelona, Spain: http://www.ama-assn.org/public/peer/peerhome.htm