Author: John Bonnett
Title: Jay David Bolter and Richard Grusin's Remediation: Understanding New Media
Publication info: Ann Arbor, MI: MPublishing, University of Michigan Library
May 2002

This work is protected by copyright and may be linked to without seeking permission. Permission must be received for subsequent distribution in print or electronically. Please contact for more information.


vol. 5, no. 1, May 2002
Article Type: Book Review

Remediation: Understanding New Media

by John Bonnett


Jay David Bolter and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: The MIT Press, 1999. 295 pp.

Rumors have begun to emerge recently that post-modernism is dead. Scholarly preoccupation has shifted from French literary theory to globalization, from concerns about epistemology to concerns about politics. Michael Hardt and Antonio Negri's Empire, judging by media and academic attention of late, is now the grand theory of choice, a statement that a supranational organism is emerging — an empire — that will supplant the international system and imperialism. International politics in the future will be beyond the power of any person, corporation, or country to control in isolation, because power will be distributed throughout the system. In the future, we will be able to resist concentrations of power, since we will all have a say. [1]

Scholars, it would seem, have begun their search anew for the heavenly city. [2] But in the midst of this tumult there still remains a constituency of scholars in history and the humanities who believe that the writings of Derrida, Foucault, and others remain powerful instruments for interpreting communication practices in the past and present. A debate on the issue emerged two years ago in History and Theory, and communication scholars such as Jay David Bolter and Richard Grusin also hold to the view that post-Modern theory remains useful. For historians interested in generating new communication conventions for the computer, or in assessing the impact of communication technologies on historic development, a natural question arises: Can these theoretical traditions be usefully applied to our domains of research? [3]

If the work of Bolter and Grusin is representative of these theoretical traditions in action, then I fear the answer is no. Two properties characterize this work: analytical muddle and empirical poverty. While the authors are to be commended for their bold attempt to synthesize post-Modern, psychoanalytical, and feminist theory to explain the cognitive, technical, and social implications of new media, they do not succeed. In their attempt to incorporate everything, they explain nothing. In their attempts to explain the relationship between the self and communication technology as well as the historic impetus behind the evolution of visual communication technologies, the authors rely on post-Modern frameworks that are problematic at best, and empirically suspect at worst.

The book's central argument is straightforward. Bolter and Grusin argue that communication media historically have been subject to a process called remediation. The two do not claim it to be a normative process. It is not a universal law of media. But they do claim remediation to be a process that has pertained in the West since the time of the Renaissance. The process comprises a dialectic between two opposing cultural logics: the logic of transparency and the logic of hypermediacy.

The two authors argue that the history of media has been characterized by an ever-elusive attempt in the West to generate communication technologies that erase their presence from the viewer, to create a virtual reality that is indistinguishable from the reality it purports to represent. Such, for example, was the aim of painters during the Renaissance. They adopted conventions such as linear perspective and light shading to govern their representations, and used oil paint in order to erase their brush strokes. In addition to painting, the authors characterize other media such as photography, film, 3D digital animation, and virtual reality as historic attempts to take the logic of transparency to its ultimate conclusion, a utopian state in which the self becomes "one with the objects of mediation," free of the interference of media that heretofore have governed and delimited perception. [4]

In Bolter and Grusin's estimation, utopia can never be attained. The effort to attain transparency is akin to Don Quixote tilting at windmills. One may admire the tenacity with which the illusion is pursued, but in the end the effort is futile. The two argue that Europeans and North Americans historically have viewed their culture's dominant medium of visual representation as being transparent. Over time, the culture's construction of the medium is inevitably subverted, as new communication technologies come to the fore. Users are confronted with the problem of multiple representation, and challenged to consider why "one medium might offer a more appropriate representation than another." [5]

The process of comparison unmasks the shortcomings of the older technology in fulfilling the logic of transparency, and in so doing remedies it by revealing heretofore-invisible constraints, or forms of mediation. Each new technology comments on the shortcomings of its predecessors. Photography offers a more accurate representation than painting. Film offers dynamic representations instead of photography's static portraits, while virtual reality allows the viewer a freedom of movement that film does not.

As a result of remediation, the two authors argue that users historically have embraced an aesthetic that exploits the tension between transparency and opacity, the logic of hypermediacy. Instead of exploring how a medium can purportedly be made to erase itself and offer an unmediated view of the referent, the object of the artist is to explore the emotional and aesthetic effects obtained when the object of attention is partially obscured. Monet's paintings are one example of this process. The modernist style of painting — where the goal is not representation, but exploration of artistic process and the media of paint and canvas — is the logical culmination of the logic of hypermediacy.

Why is this dialectic important? The two authors suggest the two logics in their present manifestation within digital media complement each other, and in so doing are helping to bring about a contemporary re-definition of the self. The analogy rests on the different approaches to space offered by the two logics. The logic of transparency embraces a homogeneous space; there is an Archimedean point, one perspective from which all can be observed. The logic of hypermediacy employs a heterogeneous space, employing multiple points of view and media. In the former, there is one point of view, say that of a house offered by an oil painting. In the latter, there are multiple points of view, as is the case with a house set in virtual reality. Users are exposed to multiple windows set on a computer desktop, each one offering different representations of the house, one in code with numbers specifying spatial coordinates, the other showing 3D objects. In media governed by the logic of transparency, the task of the user is situated viewing. In media governed by the logic of hypermediacy, the task of the user is constructing relationships or networks — between media, and between representations embedded in media — and assessing their significance.

These two ethics, Bolter and Grusin argue — situated viewing and networking — have become important constituents in contemporary definitions of the self. Both ethics suggest heightened opportunities for freedom and fulfillment. The first ethic, when combined with the medium of virtual reality, leads to the concept of the virtual self, a construct in which identity becomes a function of one's position in space. Individual traits are never fixed. Virtual reality provides an environment in which characteristics are easily assumed and easily shed. It also provides an environment in which users learn empathy through the assumption of a different point of view in space, an act the two authors contend enables users to experience the cognitive viewpoint of "the other." The second ethic offers freedom of association: "The hypermediated self is a network of affiliations, which are constantly shifting[. . . .] This networked self is constantly making and breaking connections, declaring allegiances and interests and then renouncing them[ . . . .]" [6]

While the concept of remediation is interesting, and supported by chapters linking various technologies such as television, the World Wide Web, and Digital Art to aesthetic strategies employed in other technologies, the authors run into problems in their attempts to explain the larger significance of remediation. Is it a historical process, one that emerges from time to time? Is it a statement regarding the sociological, cultural and cognitive consequences of communication technologies?

The short answer is that it is hard to say. The authors resort to so many theoretical traditions in their account, from Derrida and Foucault to Lacan, Latour, Hayles, and William James, that two problems emerge. First, their concept of media mushrooms to include just about everything. And second, their concept of the subject emerges as contradictory. In so doing, the authors make it difficult for readers to assess the historical significance of communication technologies, or to assess whether the human subject is the agent or object of historical change.

Much of the problem rests on the two authors' concept of media, which they identify with Bruno Latour's concept of the hybrid:

For Latour, the phenomena of contemporary technoscience consist of intersections or 'hybrids' of the human subject, language, and the external world of things[. . . . ] The events of our mediated culture are constituted by combinations of subject, media, and objects, which do not exist in their segregated forms. Thus, there is nothing prior to or outside the act of remediation. [7]

Media are conceived as networks and human subjects are seen as constituents of such networks. Very well, then. Is there anything within these networks that we should observe as significant catalysts of historical change? Again, in their account, the authors offer nothing for historians to grab on to, not even a historicist reading in which the social constituents of a media complex are deemed significant in one context, and technical factors in another. In Remediation, nothing is privileged, in part because the authors wish to avoid the pitfalls of technological determinism:

We propose to treat social forces and technical forms as two aspects of the same phenomenon: to explore digital technologies themselves as hybrids of technical, material, social, and economic factors[. . . .] Because our digital technologies of representation are simultaneously material artifacts and social constructions, there is no need to insist on any aspect as cause or effect. [8]

The technical, material, social, and economic facets of media are so tightly bound "that it is unproductive to try to tease them apart." [9] The authors wish to avoid Marshall McLuhan's doctrine of media determinism, not for empirical reasons that justify the concept of human agency, but due to the implications of the doctrine, which the authors do not like:

Nothing good can come of technological determinism, because the claim that technology causes social change is regarded as a justification for the excesses of technologically driven capitalism in the late twentieth century. [10]

There is too much of this specious thinking in a variety of literatures. I offer the somewhat dated observation that an argument should be accepted on its empirical merits, not on whether it provides aid or comfort to a political or economic doctrine we hold dear. In any case, to avoid the conundrum of determinism the authors offer their study as an application of Foucauldian genealogy, Michel Foucault's proposition that historical inquiry should be the search "for historical affiliations or resonance and not for origins."

Foucault (1977) characterized genealogy as 'an examination of descent,' which 'permits the discovery, under the unique aspect of a trait or a concept, of the myriad events through which — thanks to which, against which — they were formed.' [11]

Immediacy, hypermediacy, and remediation are offered as genealogical traits, enabling the authors to construct a genealogy "defined by the formal relations within and among media as well as by relations of cultural power and prestige." [12]

What we are provided with, then, is a theoretical basis for classifying forms of media, and a description of their changing characteristics over time. We have an acknowledgement of the fact of evolution, but no theory of natural selection to explain evolution. Nor is there an account of a medium's interaction with its environment, because the social and cultural environments have themselves been subsumed into the authors' construct of media. In this construct, everything is connected to everything else. Everything becomes at once cause and effect. The authors are free to make such an argument, of course. But bereft as it is of an account of historical change, it is not likely to appeal to historians.

A second major problem with the authors' theory of media is that the construct leaves little apparent room for the autonomy of the subject. Even if Bolter and Grusin had refused to privilege the social, cultural, or technical constituents of media, but left the subject to arrange them into frameworks of his own devising, they would have left us with a more plausible theory of media. Their insistence, however, that "there is nothing prior to or outside the act of remediation" precludes such an elevation. Subjects are embedded in Latour's hybrids. They cannot perceive phenomena extraneous to their construct, and therefore cannot change its constitution.

To their credit, the authors seem to realize this danger. They deny they are strict social constructionists, and labor mightily to ensure their framework leaves sufficient room for human agency. [13] They do not succeed. Through their appropriation of William James' concept of the empirical self, the two assert that the individual has autonomy. James argued that the self was comprised of multiple constituents, including the spiritual self, which he defined as the active element of consciousness. [14] Furthermore, Bolter and Grusin apparently acknowledge occasions when phenomena are prior to mediation, when constituents are loosed from one paradigm, and incorporated into another. [15]

The problem is that the two authors, through their continued reference to Latour and Jacques Derrida's concept of mimesis, undercut the coherence of their constructs of the self and of media of communication. There is little room here for human volition. Derrida argued that when individuals voice recognition regarding the properties of a representation, the relationship they perceive is not one between a model, or signifier, and the object it purports to represent. Rather, the relationship is one of similarity between representations already in circulation, between an old signifier and a new. [16] Subjects remain embedded in discourse, or in the case of Bolter and Grusin, in a social-technical-cultural complex that is a medium, a medium that is always prior to the subject. The consequence of this formulation, as Derrida explained long ago, is the elimination of the subject as an independent agent:

This implies that the subject in its identity with itself (or eventually in its consciousness of its identity with itself, is self-consciousness) is inscribed in language, is a 'function' of language, becomes a speaking subject only by making its speech conform - even in so-called creation, or in so-called transgression - to the system of the rules of language as a system of differences, or at very least by conforming to the general law of difference[. . . .] [17]

By appealing to multiple theoretical traditions, then, the authors have created more problems for themselves than they have solved. Either the subject enjoys the autonomy to conceive the world anew, or he does not. Either phenomena can exist prior to or outside a construct associated with a media hybrid, or they cannot.

Aside from problems in argumentation, the authors' constructs of the self, media, and the historic evolution of media are suspect on empirical grounds. The premise that the self has priority over prevailing constructs was strongly argued by Noam Chomsky in a famous 1959 critique of B.F. Skinner's behavioral linguistics. Chomsky pointed to widespread evidence that children are fully capable of forming sophisticated grammatical constructs, despite the "poverty of stimulus" in their surrounding linguistic environment, behavior that should not occur if the self is dependent on its environment for its means of expression. [18]

Nor is there reason to accept the proposition that mediations are always prior. While Thomas Kuhn's The Structure of Scientific Revolutions is not without its problems, Kuhn does provide repeated evidence that scientists in their experimental practice have encountered anomalies, phenomena that cannot be accounted for by prevailing constructs, and accordingly, phenomena that exist outside or prior to prevailing mediations. Scientists are required to construct new theories to account for their findings. [19]

Finally, there are empirical grounds to contest Bolter and Grusin's contention that a historical dialectic between transparency and hypermediacy has governed the evolution of visual communication technologies in the West. The impetus for innovation, as they see it, has been governed by the search for communication technologies capable of providing an unmediated relationship between viewer and represented objects, and capable of meeting the emotional, if not erotic, need of viewers to attain unity with the represented objects. [20]

Historians such as James Beniger, however, would assert that the imperative has been driven less by the desire to unite with information than by the desire to control it. New communication technologies — even the visual technologies cited by the authors as examples of the desire for transparency — have been conceived by Beniger, Benjamin Woolley, Harold Innis, and other media theorists as instruments of control, designed to facilitate the identification of patterns embedded in data. [21]

Nor is it a given that unmediated representations fulfill the emotional, erotic needs the two authors claim. Roland Barthes in his S/Z pointed to a similar ideal in his characterization of hypertext:

In this ideal text, the networks are many and interact without any one of them being able to surpass the rest; this text is a galaxy of signifiers, not a structure of signifieds; it has no beginning; it is reversible; we gain access to it by several entrances, none of which can be authoritatively declared to be the main one; the codes it mobilizes extend as far as the eye can reach, they are indeterminable[. . .]; the systems of meaning can take over this absolutely plural text, but their number is never closed, based as it is on the infinity of language. [22]

Yet Janet Murray, in experiments relating to the use of multimedia in the teaching of English literature, found that students, in experiencing environments that were without structure or mediation, often voiced a sense of alienation and frustration:

What students (and other readers presumably) object to here, when they do object, is that they don't know where they are going or on what basis they are meant to choose. This blind navigating tends to mitigate against the democratic aspirations of the postmodern hypertext group. Instead of feeling empowered by experiencing an undetermined text, students often feel tyrannized by a pre-determined set of often frustrating and incoherent paths. [23]

The key to generating emotional fulfillment was providing an environment in which the right balance of structure and freedom was attained, not immersing subjects in an environment where mediation is absent.

As a contribution toward assessing the impact of communication technologies on societies past and present, therefore, Remediation does not provide much for historians to use. Neither does the work — as a case study — do much to recommend the utility of post-Modern literary theory. Jay David Bolter and Richard Grusin do not provide a clear construct of cultural change, nor do they provide an assessment of the relative importance of media, communication technologies, and human subjects as constituents of cultural change. Many of the post-Modern constructs to which they appeal are empirically suspect. Readers will find interesting descriptions of common aesthetic strategies employed in different media. But as a contribution to theory, Bolter and Grusin, despite their daring in attempting to synthesize multiple theoretical traditions, have produced a work that fails to satisfy.

1. See Emily Eakin, "What Is the Next Big Idea? The Buzz Is Growing," New York Times, July 7, 2001; Mark Kingwell, "You have now heard of Empire; Empire is over," The National Post, July 11, 2001.

2. Carl Lotus Becker, The Heavenly City of the Eighteenth Century Philosophers, New Haven: Yale University Press, 1932.

3. See Perez Zagorin, "History, the Referent, and Narrative: Reflections on Postmodernism Now," in History and Theory 38(1999): 1-24; Keith Jenkins, "A Postmodern Reply to Perez Zagorin," in History and Theory 39 (2000): 181-200; Perez Zagorin, "Rejoinder to a Postmodernist," in History and Theory 39 (2000): 201-209.

4. Jay David Bolter and Richard Grusin, Remediation: Understanding New Media (Cambridge, MA: The MIT Press, 1999): 236.

5. Bolter and Grusin, Remediation, 44.

6. Bolter and Grusin, Remediation, 232.

7. Bolter and Grusin, Remediation, 57-58.

8. Bolter and Grusin, Remediation, 77-78.

9. Bolter and Grusin, Remediation, 77.

10. Bolter and Grusin, Remediation, 76.

11. Bolter and Grusin, Remediation, 146.

12. Bolter and Grusin, Remediation, 21n.

13. Bolter and Grusin, Remediation, 72.

14. Bolter and Grusin, Remediation, 233.

15. Bolter and Grusin, Remediation, 77-78.

16. Bolter and Grusin, Remediation, 53.

17. Jacques Derrida, "Differance," in A Derrida Reader: Between the Blinds. Ed. Peggy Kamuf. (New York: Columbia University Press, 1991): 67. This section originally appeared in Derrida's Margins of Philosophy. Trans. Alan Bass (Chicago: University of Chicago Press, 1982).

18. See Noam Chomsky, "A Review of B.F. Skinner's Verbal Behavior," at [February 18, 2002]. Originally published in print in Chomsky, "A Review of B.F. Skinner's Verbal Behavior," in Language 35(1): 26-58. 1959.

19. See Thomas Kuhn, The Structure of Scientific Revolutions, Chicago: University of Chicago Press, 1962.

20. Bolter and Grusin, Remediation, 236, 236 n3.

21. See James Beniger, The Control Revolution, Cambridge, MA: Harvard University Press, 1986; Benjamin Woolley, Virtual Worlds: A Journey in Hype and Hyperreality. (Cambridge, MA: Blackwell): 251; Harold Innis, "On the Economic Significance of Cultural Factors," in Political Economy in the Modern State (Toronto: The Ryerson Press, 1946): 91; Harold Innis, "The Problem of Space," in The Bias of Communication (Toronto: University of Toronto Press, 1991, c. 1951): 126, 128; Harold Innis, "Chapter 6: Printing in the Sixteenth Century", in "History of Communication", Harold Innis Papers, University of Toronto Archives, B72-003, Box 17, pp. 35-36, 38.

22. Roland Barthes. S/Z. Trans. Richard Miller. (New York: Hill and Wang, 1974): 5-6. Cited in Janet Murray. "The Pedagogy of Cyberfiction: Teaching a Course on Reading and Writing Interactive Narrative." in Contextual Media: multimedia and interpretation. Eds. Edward Barrett and Marie Redmond. (Cambridge, MA: Massachusetts Institute of Technology, 1995.): 144-145.

23. Janet Murray. "The Pedagogy of Cyberfiction: Teaching a Course on Reading and Writing Interactive Narrative." in Contextual Media: multimedia and interpretation. Eds. Edward Barrett and Marie Redmond. (Cambridge, MA: Massachusetts Institute of Technology, 1995.): 146.

By John Bonnett, Institute for Information Technology, National Research Council