
The scholarly and scientific articles we know today have been the major conveyors of knowledge for close to 300 years. We receive those articles bundled in journals, published by an industry that grew out of the monastery tradition of manuscript reproduction. Indexing and abstracting services have grown up around the journal-publishing industry to help us gain access to those journals. Other industries, such as binderies — and libraries — assure permanency of access.

Scholarly and scientific articles have many unique properties. The knowledge presented is unique and rarely replicated. Each article is a gem unmatched; it cannot be substituted for another. That uniqueness makes each article, each issue, and each journal a separate treasure.

But those unique entities conglomerated, or maybe agglomerated, creating horizontal and vertical monopolies. Until recently, a single article could not be purchased by itself. Although articles have long been accepted as a basic unit of information, they have not been accepted as a unit of trade; they have been made part of a bundle of broad-based knowledge to be stored and preserved together in journal format. If the trade in knowledge were judged to be important to the economic health of this nation, the Justice Department would ignore Microsoft and investigate, instead, publications with names like The Journal of Eclipse Measurement or the John Donne Review for their monopoly on information.

Moreover, no one has precise data on who reads articles. We know who buys journal subscriptions, and some studies have shown who takes them off the shelves in libraries, but we don't measure usage at the article level. Even citation studies, which give us some indication of use and value, have their critics.

Price is, of course, no measure of quality or value. The costs of production, from editing to printing, can be determined, but the price of articles or journals varies more widely than those costs. Studies that account for the numbers of characters, illustrations, and equations show no correlation between those variables and prices. Pricing seems to be what the market will bear, and the market in scientific, technical, and medical articles accepts a much higher price than the market for articles in the humanities.

So how do we measure the value of an article? By measuring the value (that is to say, the usage) of the journal.


The authors who regularly contribute to a specific journal create the value of the entire bundle of articles in a journal. In some ways, then, weak articles (and authors) can gain prestige by being bundled with high-quality articles by well-known authors.

In the digital world of the Internet, it is possible to measure usage of individual articles. And because that breaking up of the monopoly of the journal is possible, perhaps it is time to unbundle the article from the journal. But what would such a free market in information look like?

  • The price of each article could be determined by the reputation of the author, the subject, timeliness, the length, and various value-added features such as sound, color, motion, and active links to other information sources. Vanilla articles with none of those advanced features could be sold for less. As an article ages, its price would drop (much like films, which are offered first run at the theater for a high price, and later at lower prices via premium cable channels and video rental stores, and eventually free on broadcast television). The latest research, therefore, would sport a premium price tag, worth the cost to researchers, but as the content ages the price would drop and the market might be primarily undergraduates. Eventually the information would be practically free and hosted on archival public servers.

  • In the current information market, scientific, technical, and medical journals already sell at a premium price. In the new information market, articles could be priced by discipline, depending on the market value of each discipline, perhaps paralleling the faculty-salary discrepancies in disciplines. Articles about education and English would probably sell at rock-bottom prices. Of course, articles in those disciplines might also have few added-value features; articles in the higher-priced categories would more likely have multimedia enhancements and active links to other relevant information.

  • Some articles could be enriched with added datasets or reviews and commentary by the author. Indeed, the publishing industry could be like the automotive industry, adding convenience features to its articles at an additional price.

  • Articles with more reviewers would cost more than those with fewer reviewers.

  • Translations would cost more than articles in the original language to account for the value added.

  • Pricing could be on the fly, instead of once a year as we have it now. Publishers would no longer need to plan a year ahead for changes in the exchange rates. As the dollar fluctuates, so could the price of articles. The price of an article by a Nobel Prize winner would rise with the announcement of the award. Established scholars' articles would sell for more than junior faculty members' articles. The award of tenure could immediately raise the price of a faculty member's articles, since he or she would now have an additional stamp of approval. Tenure could even be pegged to the price of an author's articles. Below a certain price, no tenure; above, tenure and even a promotion!

  • Publishers could hold fire sales on articles to raise needed capital, and libraries could buy article futures as if they were pork bellies, in the hope of getting a better deal on future research and scholarship. Instead of spending year-end money on big-ticket items, libraries could simply buy 10,000 or 20,000 articles to be chosen later, and stockpile them for the future.
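The pricing scheme sketched in the list above is, of course, speculative, but its arithmetic is simple enough to model. The following sketch is purely illustrative: the function name, the parameters (a base price, a discipline multiplier, an author-reputation factor), and the film-style half-life decay are all assumptions of mine, not anything the proposal specifies.

```python
def article_price(base_price, age_years, discipline_multiplier,
                  author_reputation, half_life_years=2.0):
    """Hypothetical per-article price under the proposal's logic:
    discipline and author reputation raise the price, and the price
    decays as the article ages (like a film's release windows)."""
    decay = 0.5 ** (age_years / half_life_years)  # halves every half-life
    return round(base_price * discipline_multiplier * author_reputation * decay, 2)

# A brand-new medical article by an established scholar commands a premium,
# while a decade-old humanities article approaches "practically free".
new_stm = article_price(10.0, age_years=0, discipline_multiplier=3.0,
                        author_reputation=1.5)
old_humanities = article_price(10.0, age_years=10, discipline_multiplier=0.5,
                               author_reputation=1.0)
```

Under these invented numbers, the new scientific article prices at 45.00 while the old humanities article falls to 0.16 — the kind of spread between premium current research and an aging, low-demand piece that the bullets above describe.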

Under my system, publishers and libraries would use the computer's ability to count hits to provide concrete information on the usage of specific articles. That would give them a much better idea of authors and subjects in demand. Publishers could make better choices about which articles to accept for publication. Libraries could avoid stockpiling and preserving hosts of articles that are of no value to the scholarly community. Precision could be brought to a system that is collapsing of its own weight as more and more is published, as more and more specialized journals are created at higher and higher prices.

Swift's original Modest Proposal was born of frustration with the economic and social order of his day; my proposal emanates from a similar frustration. If information truly is a commodity in the information age, then let's begin treating it as such.



Paul M. Gherman serves as the University Librarian at Vanderbilt University. Before moving to Vanderbilt, Gherman directed the libraries at Kenyon College and Virginia Tech. He was instrumental in establishing the Blacksburg Electronic Village Project, an early experiment in community networking. He holds degrees from Wayne State University and The University of Michigan.

After writing this article (but before it was published), Gherman and Vanderbilt received an invitation from the University of Michigan to join them and Reed Elsevier in a study of buying articles "by the drink" to see what that might mean. His Modest Proposal may not be too far from the real future.