Editor's Gloss: Seeking Quality Online
My daughter owns a button with the slogan "The Web is the world's biggest library, only all the books are on the floor."
As publishers, writers, and archivists of the documents in this colossal library, we are fighting to make sure that our work is not only worth reading but also recognized as worth reading. And not incidentally, we have to make sure that it can be found (but that will be the subject of another JEP issue).
The quality issue has been much discussed in the biomedical community, with the National Institutes of Health's recent announcement of a program to put research papers in the field on line for free access.[*] Both papers that have been published in peer-reviewed journals and ones that have been "screened" by appropriate groups are eligible for the archive.
The aim is to make available research findings that otherwise might not be published, such as "negative" results—the finding that a drug combination has no effect on a particular problem, for instance. However, many were concerned that minimally screened papers might include those that should not see the light of day. Even with its imperfections, peer review, they argued, was the best way to guarantee against such an occurrence.
As of this writing, the NIH is still exploring ways to expand the universe of published papers without compromising quality. It's tough, and the problem is exacerbated by the fact that the NIH has such respect: People assume that information on the NIH site is both important and correct.
Would that we all had those problems! Here we are struggling to find good information and to make it available, and to be recognized as publishers, writers, and archivists of note, and we're surrounded by sites that look as good as ours but don't have the same quality control. How is the poor Web surfer to tell the difference?
The answer is simple: Experience. As people learn to surf the Web well, they will also learn whom they can trust, and which sites are likely to be most useful. Unfortunately that takes time, and meanwhile, many of us sit here on the Web like wallflowers, hoping that someone will ask us to dance.
Two of our authors have addressed the issue of online quality, from two very different points of view.
Julie Hooke is taking her fourth class (called a "subject" in Australia, where a "course" is the entire set of classes that make up the requirements for a degree) toward a master's degree in information management and systems at Monash University, which is about five hundred miles from her home in Adelaide. Without the Internet she could not do it. It's not just the assignments that come by e-mail, or even the listserv that substitutes for class discussion; it's the research and the readings. Her local library rarely has the journals and papers she needs; if she can't get them on line, she can't get them at all. But on line there is no reference librarian to steer a student away from spurious material and toward solid research results. She details her experiences in Distance Education: The Perils of the Virtual Student in Cyberspace.
Michael Nentwich describes the European Research Papers Archive, a series of high-quality online papers in the field of European integration research, on the occasion of its first birthday. He writes about the archive's recently established policy on accepting new series, and discusses the wider issue of "quality filters." Several of his scenarios are interesting in the way they balance quality and comprehensive coverage. Read about them in The European Research Papers Archive: Quality Filters in Electronic Publishing.
Of course it won't make any difference how good our publications are if no one can read them. James Lichtenberg brings us up to date on the publishing industry and the Year 2000 problem. (Note that I did not refer to it as Y2K. That's what got us into trouble in the first place!) Jim isn't afraid to use a TLA (three-letter acronym), though. He thinks the publishing industry is doing fine. Find out more in Y2K: Compliance or Chaos? Publishing Confronts the Millennium.
We also have articles on even stickier issues: copyright and privacy. Thomas G. Field, Jr., is back, this time with his professional assessment of Copyright in E-Mail. And your editor weighs in with some strong opinions about a publisher's responsibility for ensuring that readers are comfortable on line in Privacy in the Electronic Environment.
In From Print to Cyberspace, Birdie MacLennan details how the University of Vermont's libraries decided to make e-journals available, reminding us that librarians can be a publisher's greatest ally in establishing confidence.
Finally, Contributing Editor Thom Lieb has his say on the sectarian fighting in The Format Wars: HTML, PDF, and TXT. He probes the weaknesses and admires the strengths of each.
If you have not yet visited Potpourri, click there now. We've got kudos, questions, and comments (kudos, kwestions, and komments?), and we invite you to add more.
Judith Axler Turner may be reached by e-mail at firstname.lastname@example.org. Her day job includes consulting to the NIH on the PubMed Central project.
Link from this article:
* National Institutes of Health's announced program to put research papers in the field on line, http://www.nih.gov/welcome/director/ebiomed/ebi.htm