Author: Jeff Pruchnic
Title: How to Surf in the Cybernetic Age
Publication info: Ann Arbor, MI: MPublishing, University of Michigan Library
Spring 2004
Rights/Permissions:

This work is protected by copyright and may be linked to without seeking permission. Permission must be received for subsequent distribution in print or electronically. Please contact mpub-help@umich.edu for more information.

Source: How to Surf in the Cybernetic Age
Jeff Pruchnic


vol. 4, no. 1, Spring 2004
Article Type: Essay
URL: http://hdl.handle.net/2027/spo.pid9999.0004.104

How to Surf in the Cybernetic Age

Jeff Pruchnic

Abstract

"How to Surf in the Cybernetic Age" considers the forms of mediation invoked over a half-century ago by the Turing Test in conjunction with three relatively recent revolutions in human-machine interfacing: EverQuest, a Massively Multiplayer Online Role-Playing Game (MMORPG); emotional avatars, voice recognition softwares that detect and approximate emotional responses; and the emerging technologies of selectively-mediated reality devices commonly associated with Steve Mann. The same sets of practices introduced by the Turing Test—interaction, mediation, and response—are also applied on another level, that of how we approach these technologies and their endless iterations from ethical, political, and/or social standpoints.

If the seventeenth and early eighteenth centuries are the age of clocks, and the later eighteenth and nineteenth centuries constitute the age of steam engines, the present time is the age of communication and control.
Norbert Wiener, Cybernetics: or Control and Communication in the Animal and the Machine (39)
Surfing has taken over from all the old sports.
Gilles Deleuze, "Postscript on Control Societies" (181)

Turing the Cybernetic Age

These puzzles where one is asked to separate rigid bodies are in a way like the "puzzle" of trying to undo a tangle, or more generally of trying to turn one knot into another without cutting the string.
Alan Turing, "Solvable and Unsolvable Problems" (11)

In 1950 Alan Turing first formally proposed the now famous test that bears his name. Though computer science research had yet to produce a machine that could rival human intelligence, human expectation — if not technology — had reached science fiction proportions. Bracketing the question "can machines think?" — a query he immediately indicts as "absurd," possessing a "dangerous attitude," entailing an infinite regress of definitional obligations (433), and "too meaningless to deserve discussion" (442) — Turing translates the thrust behind it from an abstract concern into multiple practices of human-machine interaction, mediation, and response. In place of this epistemological concern with the nature of "thinking," Turing suggests a retrofitting of the popular "imitation game." The game is normally played with a man, a woman, and an "interrogator" of either gender. The interrogator asks the other two a series of questions (their responses are mediated through a third party or, preferably, transcribed or typed) and attempts to determine, on the basis of this information alone, the gender of each participant. Turing hypothesizes that substituting a "thinking machine" for one of the respondents would create a more productive environment in which to engage the significance behind the original question of "can machines think?"
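The mechanics of the test are easy to lose in paraphrase, so a minimal sketch may help. The toy program below is my own illustration, not a reconstruction of any historical implementation: an interrogator exchanges typed text with two hidden respondents, one of them a canned stand-in for the "thinking machine," and must guess which is which from the transcript alone. The sample replies echo answers Turing imagines in his essay.

```python
import random

def machine_respondent(question: str) -> str:
    # Hypothetical stand-in for the "thinking machine": canned, typed replies
    # (the sample answers echo ones Turing imagines in his essay).
    if "poem" in question.lower() or "sonnet" in question.lower():
        return "Count me out on this one. I never could write poetry."
    if "hair" in question.lower():
        return "My hair is shingled, and the longest strands are about nine inches long."
    return "I would rather not say."

def human_respondent(question: str) -> str:
    # Stand-in for the flesh-and-blood contestant, likewise reduced to typed text.
    return "That is a harder question than it sounds."

def imitation_game(questions):
    """Mediate both contestants through text alone and let the interrogator guess."""
    contestants = {"A": machine_respondent, "B": human_respondent}
    if random.random() < 0.5:                 # conceal which label hides the machine
        contestants = {"A": human_respondent, "B": machine_respondent}
    for question in questions:                # all contact is typed question and answer
        print(f"Q: {question}")
        for label, respond in contestants.items():
            print(f"  {label}: {respond(question)}")
    guess = input("Which respondent is the machine, A or B? ").strip().upper()
    actual = "A" if contestants["A"] is machine_respondent else "B"
    print("Right identification." if guess == actual else "Fooled.")

if __name__ == "__main__":
    imitation_game(["Please write me a sonnet on the subject of the Forth Bridge.",
                    "How long is your hair?"])
```

Nothing in the sketch asks whether the machine "thinks"; the only output is the interrogator's guess, which is exactly the displacement Turing's version of the game performs.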

Jin Wicked's The Universal Turing Machine. Wicked's art bears witness both to the indiscernibility between the practices of human bodies and the practices of machines expressed in Turing's work and to the violence done to his own body as a result of circuiting with the cybernetic system of government control and communication (Turing received estrogen injections to "reduce his libido" in lieu of a prison sentence after authorities discovered his sexual relationship with a young man in 1952, was denied government security clearance due to his homosexuality, and eventually committed suicide by ingesting a cyanide-laced apple in 1954).

Turing's essay was the virtual rehearsal for an experiment in the clinical sense of the word, and entailed all that goes with such a process, such as reproducibility and modification. As such, the original Turing Test comprises only the first iteration of a process that relentlessly has been duplicated and modified. In addition to this physical repetition, the Turing Test often is reiterated textually as a point of departure for narratives of computer science, artificial intelligence, posthumanism, etc. I repeat it here because I believe it can serve (perhaps only if stretched a little) not only as an historical precedent but as a contemporary lens for understanding current technologies of A.I., virtual embodiment, and mediated reality.

Turing's game constitutes a series of algorithmically complex rhetorical practices of interaction and persuasion that hinge more on human response to machinery than any philosophical inquiry into the possibility of a "thinking" machine:

The original question, "can machines think?" I believe to be too meaningless to deserve discussion. Nevertheless I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted. (442)

The original query immediately entails questions of agency and cognition proper to philosophical reflection, an inquiry Turing pejoratively describes as one "best answered by a Gallup poll" (433). His rendering of it into a game depends no less on human response and no less on the quantification appropriate to a poll — "I believe that in about fifty years time...an average interrogator will not have more than 70 per cent chance of making the right identification after five minutes of questioning" (442) — but these effects emerge out of the material practices between human and machine when they are put into circuit with one another and assay not identity between the silicon and flesh contestants but the negotiation of their alterity.

The Turing Test and both its technological and cultural import have been popularly conceptualized as pivotal steps toward the modeling of machines on the basis of human behavior and the reflexive erasure of embodiment: if a machine can think, or at least fool an interrogator in the very pragmatic performance of the Turing Test, then surely the difference between the two can be reduced to so much vitalist hand-waving, and intelligence and consciousness have nothing to do with the very human quality of embodiment and the very humanist quality of self-sovereignty. The author who most aggressively pursues this line of thinking and its implications is N. Katherine Hayles, who, in the prologue to How We Became Posthuman, depicts "Computing Machinery and Intelligence" as ground zero for cyberneticists' supposed dismissal of embodiment from considerations of consciousness. Hayles negates the significance of Turing's substitution by collapsing the imitation game back into the "meaningless" question dislocated by Turing at the beginning of his essay, insisting narrowly on the definitional capacity of the Turing Test: "If you cannot tell the intelligent machine from the intelligent human, your failure proves, Turing argued, that machines can think" (xi). [1] Though this reading certainly taps into an important cultural legacy of Turing's work, by focusing narrowly on Turing's transformation of the imitation game that begins "Computing Machinery and Intelligence" — neglecting both the twenty-odd pages that follow this description and detail the complex affinities between humans and machines and the larger cybernetic context within which Turing was writing — it occludes the novelty of Turing's ideas, particularly insofar as they apply to current forms of machine-human interaction. It is these neglected elements I want to recuperate and use as a preliminary toolbox in investigating contemporary mediation technologies.

We might begin unpacking the broader implications of Turing's test by focusing on three interrelated topoi. Most importantly, critical readings of the test largely neglect the telos driving the majority of cybernetic research, a bundle of divergent investigations that largely have been characterized as a reduction of the human to a machinic construction and consciousness but which were primarily goaded by a very pragmatic concern with what might be achieved by harnessing the similarities between the biological and mechanical. Consider the following (in)famous statement by Arturo Rosenblueth and Norbert Wiener published the same year as Turing's piece; it was written in defense of an earlier paper ("Behavior") that Wiener retrospectively describes as the "statement of a program for a large body of scientific work" (Cybernetics 15) that would later be named cybernetics:

We also wish to explain why we use the humanistic terms purpose and teleology in the description of the behavior of some machines. The question of whether machines are or can be like men or the higher animals does not guide our choice. This question is on the main irrelevant for scientific objectives. We believe that men and other animals are like machines from the scientific standpoint because we believe that the only fruitful methods for the study of human and animal behavior are the methods applicable to the behavior of mechanical objects as well. Thus, our main objective for selecting the terms in question was to emphasize that, as objects of scientific inquiry, humans do not differ from machines. (36)

On the one hand, Rosenblueth and Wiener largely beg the question they ostensibly are attempting to answer; in defending their viewpoint as scientifically viable — the immediate exigence for their paper — the duo will persuade few peers merely by stating that these conceptual territories are appropriate for the purposes of "scientific inquiry." [2] On the other hand, Rosenblueth and Wiener dismiss (like Turing) the "irrelevant" question of whether machines can be like humans and speak to a largely forgotten goal of first-wave cybernetics through their focus on the pragmatic consequences of such a comparison, a new understanding of the machinic consciousness and behavior operating in and through humans. More importantly, like Turing, the two explicitly localize the behavioral similarity between machines and humans within a networked context. Rosenblueth and Wiener refuse to consider attempts at determining the cybernetic capacities of any object in isolation where "the degree of coupling of an object with its surroundings" is elided (323). The only context in which the cybernetic viewpoint makes sense and cybernetic practices can be deployed is one entangled with an "object which forms part of a larger system, i.e., to an object that is coupled to other objects or features in the environment in such a manner that changes in these objects or features will modify its behavior" (324).

Such a consideration hails not so much an epistemological resolution (human = machine) as a mobilization of applications consequent to a "programmable" view of human subjectivity. This current in cybernetics research was explored most relentlessly (and personally) by John C. Lilly, who writes of the possibilities for self-experiment with the human biocomputer through human-machine interaction, biotechnology, and manipulations of machinic consciousness. Lilly's unparalleled commitment to self-experiment led him alternately to splice his own body and consciousness into circuit with machines, animals, drugs, and animal drugs [3] in order to observe and alter human behavior through "metaprogramming" (Programming ix). These metaprogramming sessions and the unprecedented networks between humans, animals, and machines birthed through them led to Lilly's influential research into dolphin communication and the invention of the isolation tank, as well as a host of other medical and diagnostic technologies.

Secondly and similarly, Turing's test transfers the question of intellectual equivalence between humans and machines to a matter of response rather than deliberation (a process also performed both conceptually and materially by the itineraries of Lilly, Rosenblueth, and Wiener detailed above). He writes in reference to his complication of the imitation game:

We now ask the question, 'What will happen when a machine takes the part of A in this game?' Will the interrogator decide wrongly as often when the game is played like this as he does when the game is played between a man and a woman? These questions replace our original, 'Can machines think?' (434)

This transformation of equation into game foregrounds the interactivity required to negotiate the original query. It also gestures towards not so much a reduction of the human to the machine as a growing indiscernibility between the two resulting from the mediation of the imitation game: a questioning not only of the capacity of a "thinking machine" to mime human response, but of humans' capacity to be affected by and respond to such machines. In other words, although the Turing Test begins with a question of the similarities between human and machine intelligence, it quickly shifts focus to how humans will respond to technologies that mimic their behavior — and how human behavior itself will be affected by the possibilities and practices of this development.

Thirdly and finally, we might consider the modern repetition of Turing's test. Though Turing turned a popular game into a thought experiment, the material application of this experiment (made possible by advances in technology to which Turing's hypothesis looks forward) has instantiated a shift back into a game "proper": a collapsing of critical distance (already endangered by the textual Turing Test) that makes the experiment more a matter of affect and interactivity. Pop-science writer David Berlinski zeroes in on this phenomenon while discussing the ubiquity of the Turing Test mentioned above:

Such is Turing's Test. Enacting this test has actually become a ritual in recent years, with a row of solemn stalwarts facing a collection of curtained booths, trying to determine from the cryptic printed messages they are receiving — My name is Bertha and I am hungry for love — whether Bertha is a cleverly programmed machine or whether, warm and wet, Bertha herself is resident behind the curtain, stamping her large feet and hoping for a message or massage in turn.

The gender dichotomy of the original imitation game forecasts the more intersubjective aspects of Turing's adaptation and its subsequent performances: the immediacy of the test and the inevitable turn in its participants towards the personal and emotional make the test as much about affects as effects, [4] and anticipate a slew of contemporary technologies based on this (sometimes pleasurable, sometimes annoying) often momentary indiscernibility — "No really, who is behind that curtain?"

In what follows, I consider the practices invoked over a half-century ago by the Turing Test in conjunction with three relatively recent revolutions in human-machine interfacing: EverQuest, a Massively Multiplayer Online Role-Playing Game (MMORPG); "emotional avatars," voice recognition software that detects and approximates emotional responses; and the emerging technologies of selectively-mediated reality devices — "EyeTap" systems — most commonly identified with Steve Mann. These phenomena are all "revolutions" in both the technical and populist sense, indicating not only advances in technology but also the expansion, with varying degrees of rapidity, of the number of individuals who are by choice or chance affected by and imbricated in their practice.

I also want to utilize the same sets of practices introduced by the Turing Test — interaction, mediation, and response — on another level, that of how we approach these technologies and their endless iterations from ethical, political, and/or social standpoints. I dwelt earlier on a certain party-line reading of the Turing Test and the cybernetic movement in general not only as background for the test's initial reception, but because I find them emblematic of a certain occupational psychosis [5] concerning new forms of mediated communication and interaction. This last question, particularly in a cybernetic context, is itself knotted with others concerning agency and transformation, a point I will return to at the end of this paper. Quite possibly, the reason contemporary technologies such as artificial intelligence, online role-playing games, and mediated reality devices cause such a problem for contemporary critical tools is that these technologies create and demand a new subject (as in the user, participant, or critic) to interact with them — one that is itself not so much decided or detected as programmed.

Heroinware

EverQuest screenshot. An ogre and other players dope it up online.

...im [sic] kind of relieved to find im not the only one who is losing someone to this game.
post to EverQuest-Widows newsgroup

EverQuest is currently the most popular MMORPG running, possessing a subscriber list in excess of 400,000 who pay a $14.95 monthly subscription rate and exhibiting an average of 90,000 players simultaneously interacting through digital avatars in its virtual environment at any given time. [6] Although Sony will not release discrete information on usetime, several independent surveys of player activity clock average playing time at around twenty hours per week. [7] This prolonged immersion in the game has spawned not only a large assortment of satellite services for EQ users but also multiple support groups for those who have "lost a loved one to EverCrack addiction," such as the newsgroup listed above. The sheer magnitude of EverQuest's user base, the massive amount of time dedicated towards conquering its constantly expanding landscape, and the game's unparalleled industry growth (embedded within an industry that is, as a whole, overtaking traditional media — video games currently generate around $17.9 billion annually, more than film and television combined) suggest a watershed moment in the genealogies of virtual embodiment, networked communication and interaction, and their related social practices. How can one respond to such phenomena?

Video Game Studies (and the larger academic, yet still overwhelmingly "sub" discipline within which it is embedded — Cyber Studies) certainly has not grown to the extent that its output has overrun the study of more traditional media (even as the production and consumption of its subject matter outpaces the same). However, it has reached the point where one might gauge emerging modes of critical response. To summarize greatly, Video Game Studies has provided methods for borrowing the tools of literary and cultural analysis to interpret games along ideological lines (Douglas), for viewing games as early training in distinguishing the real from the simulated in other aspects of life (Turkle), for analyzing the entities and activities of video games as substitutes and sublimations for "real life" (RL) practices and impulses (Bernstein), and for investigating the affinities between the particular mediation created by video game playing and more traditional media such as film and television (Bolter/Grusin).

These methods are joined by an exclusive interest in the representative capacity of video game technology, one that follows a similar trend in Cyber Studies. By now it has become almost commonplace for critiques of the World Wide Web and its related technologies (particularly those written within the humanities) to compare the realities of relationships between people and technologies and between people using these technologies against a certain utopian conception of the Internet or the "Information Age"; therefore, critical analysis has shifted towards demystifying claims that online interaction is a step towards the abolition of such societal ills as unequal access to information and education between and within countries and economic groups or that it will lead to the elimination of acts and discourses of discrimination that might be obviated by the "disembodiment" promised by virtual representation. Consider, for example, the approach taken by both Lisa Nakamura (97-100) and Andrew Herman/John H. Sloop (91-95) in reference to MCI Internet Television. Both of these analyses critique the service through a semantic deconstruction of the popular 1997 commercial campaign advertising it, arguing that MCI's claims to create an egalitarian cyberspace - no "race," "genders," "age," or "infirmities" on the Internet, only "minds" - can only come at the expense of eliminating all markers of difference in its users.

And yet, the twinned aspects of immediacy and immersion that characterize games such as EverQuest seem to shred any conception of the efficacy of these responses. Although networked practices are very much a wetware phenomenon and one inseparable from a unique form of mediation, critical approaches largely have been concerned with confronting these forces from multiple levels beyond their immediate application (i.e., their circulation within a particular cultural or ideological context rather than the immediate interaction between a player and media [8]) and through further mediations of this original mediation (for instance, the tendency to critique the promotional materials or the marketing strategy surrounding a technology). In other words, the growing ubiquity of new forms of mediation (and the subsequent proliferation of their remediation in popular discourse) has seemed only to draw critics further away from their ostensible subject of analysis. Although such a perspective might foreground both a certain logic at play in the marketing of a game like EverQuest and its conceptual debt to earlier media technologies ("board-based" role-playing games being the most obvious one), it leaves us several steps away from the (often ecstatic) practices and compulsions of interactivity written over the glazed eyeball of the user on her twentieth hour and condemned by the spouse left behind in RL.

Brian Massumi is helpful here in unpacking the affective qualities inherent in such practices, identifying the gap between form and content that mirrors the gap between responding to a technology's practices of mediation and responding to mediations of (or about) these practices:

...it may be noted that the primacy of the affective is marked by a gap between content and effect: it would appear that the strength or duration of an image's effect is not logically connected to the content in any straightforward way. This is not to say that there is no connection and no logic. What is meant here by the content of the image is its indexing to conventional meanings in an intersubjective context, its sociolinguistic qualification. This indexing fixes the determinate qualities of the image; the strength or duration of the image's effect could be called its intensity. What comes out here is that there is no correspondence or conformity between qualities and intensity. If there is a relation, it is of another nature. (24; emphasis in original)

Massumi keys in on an element often forgotten in critical analysis, the affective intensity of media that operates beneath the level of signification and necessarily is occluded when a phenomenon is approached exclusively on the plane of representation.

Massumi derives this distinction from an experiment conducted by Hertha Sturm, a researcher who has devoted the majority of her work to the "unique cognitive, emotional, and social effects" generated by different forms of mediation regardless of content ("Television," "Missing" 39). In the experiment Massumi cites, results were generated by assaying children's verbal and autonomic responses to a German television program, and the two measures diverged: what the children said did not match what their physiology intimated. Sturm's work is joined by a series of experiments performed around the same time by the American physiologist Benjamin Libet; the two share an interest in surveying (and juxtaposing) physiological "effects" — measured by skin stimulus and brain activity — with the verbal reports of their experimental subjects. [9] Libet's investigations into the role of conscious will in voluntary action yielded the conclusion that "consciousness" as we experience it does not initiate physical volition, but acts to select and control prompts generated by unconscious cerebral processes (529). However, the real novelty of Libet's conclusions, as well as Sturm's, is their joint effort to (re)discover the unconscious generally (and unconscious motivation specifically) not within the conceptual territories of representation, ideology, or traditional psychology, but in and on the human body.

One would imagine that the intense interactivity of video game play, which combines and coordinates the two subjects of Sturm's and Libet's analyses, perception of media and voluntary action, respectively, would produce effects beyond either of these two phenomena. This hypothesis recently has been confirmed through experiments initiated by a research group at Hammersmith Hospital using the more invasive technology of Positron Emission Tomography (PET) scanning (Koepp). Their conclusions suggest that the phrase "faster than thought" — the title of an early anthology on digital computers to which Turing contributed a piece on computer-mediated game playing ("Digital") — applies as much to the responses generated in the human player as to the calculations of her machinic opponent or mediator. The study found that subjects generated significantly higher levels of the neurotransmitter dopamine while playing a game in which tanks battle each other, and that these levels continued to increase as players advanced through the game's levels. Dopamine is naturally implicated in the control of movement, attention, and learning, but its levels are also raised by several drugs of abuse that trigger its release or block the reuptake of naturally occurring dopamine at the synapse. Its production during game play is not altogether surprising given the demand placed on attention and movement in electronic game playing. However, dopamine's involvement in reinforcing behavior — such as dopamine prompted by the quenching of thirst, or in the reinforcing effects of cocaine and amphetamines — seems to suggest that the repetitive and goal-oriented practice of game play hails an odd form of agency, that of the player dosing on the game while learning how to reach the next level. [10]

Concomitantly, it suggests a certain machinic consciousness and programmed behavior in the player produced by this dopamine pedagogy, a point made explicit in Turing's prediction of the (admittedly eerily-titled) "child-machine":

We normally associate punishments and rewards with the teaching process. Some simple child-machines can be constructed and programmed on this sort of principle. The machine has to be so constructed that events which shortly preceded the occurrence of a punishment-signal are unlikely to be repeated, whereas a reward-signal increased the probability of repetition of the events which led up to it. ("Computing" 457)
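Read as a procedure, the child-machine passage describes what would now be called a simple reinforcement scheme. The toy sketch below is my own illustration rather than anything Turing specifies: each candidate response carries a weight, a reward-signal raises the weight of whatever the machine just did, and a punishment-signal lowers it, so that rewarded behavior becomes progressively more probable.

```python
import random

class ChildMachine:
    """Toy illustration of Turing's punishment/reward principle (my sketch, not his design):
    rewarded responses become more likely to recur, punished ones less likely."""

    def __init__(self, responses):
        self.weights = {response: 1.0 for response in responses}
        self.last = None

    def act(self) -> str:
        # Choose a response with probability proportional to its current weight.
        responses = list(self.weights)
        self.last = random.choices(responses, weights=[self.weights[r] for r in responses])[0]
        return self.last

    def reward(self):
        # Reward-signal: raise the probability that the preceding event is repeated.
        self.weights[self.last] *= 1.5

    def punish(self):
        # Punishment-signal: make the preceding event unlikely to be repeated.
        self.weights[self.last] *= 0.5

if __name__ == "__main__":
    child = ChildMachine(["advance", "retreat", "idle"])
    for _ in range(200):                 # a crude teacher that only ever rewards "advance"
        child.reward() if child.act() == "advance" else child.punish()
    print(child.weights)                 # the weight on "advance" ends up dominating
```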

Felix Guattari provides the best frame for the dual nature of this kind of phenomenon in his short essay "Machinic Junkies." For Guattari, "machinic drugs" are those substances that "provide a sensation of belonging to something, of being somewhere, along with the sensation of forgetting oneself" (101). [11] EverQuest proffers the most robust sustainability for catching such a buzz; its staggered levels of "scoring" (distributed into various categories of gain through quantifiable levels of experience, power, property, currency, etc.) and its unending expansions (whenever anyone gets close to completing the game, new levels are added) provide not the limit case but a case of no limits for such machinic interaction.

Both the addict and the academic are quick to point out the importance of "set and setting" in the use of physiology-altering substances. [12] On this qualification, many have been quick to charge that the Hammersmith group's payment of its subjects (7 pounds for every level passed) unfairly skewed the test results. However, EverQuest accounts for such a possibility: the burgeoning economy of EQ has spread offline. [13] Although avatars and items produced in the game are still available for hard RL cash on eBay — at the time of this writing a level 60 cleric is currently going for $2000 — players now can buy these items direct or exchange their EQ platinum currency for real dollars (and vice-versa) at no fewer than five Amazon.com-style e-stores (delivery avatars will meet you at Greater Faydark or the Field of Bone).

This urge to capitalization threatens to intensify with the introduction of MindArk's Project Entropia (launched earlier this year), a game where players can buy items and skills direct from the server after converting their cash to "Project Entropia Dollars," invest in the stock of its virtual companies and services, or just let it ride at one of the game's casinos. On the one hand, these human-machinic interactions and their accompanying affects certainly speak to a certain novel colonization of silicon by Capital. On the other, the distinction between real and virtual economies in MMORPGs like EverQuest and Project Entropia may be not only increasingly indiscernible but increasingly unimportant. Written decades before the creation of high capacity digital computers, the Turing Test hypothesized that interactions between humans and machines or between humans communicating through the mediations of machines would produce value in relation to participants' belief and investment in their conclusions, regardless of their reality or "authenticity." The ostensibly unanticipated bleed-over of value occurring in EverQuest players' participation in more traditional structures of e-commerce and the combination of real and virtual economies explicitly planned by the creators of Project Entropia speak to a repetition of this process on the level of monetary exchange and value. As much as players must undertake a mutual investment in the reality of a game populated by virtual avatars, their decision to invest real money in virtual commodities is far from foolish and is supported by the same consensus that guarantees participation in the "real" economy of speculative capital.

In fact, the capitalization of EverQuest and Project Entropia in many ways suggests a privileged site for reading what Jeffrey T. Nealon describes as the logic of intensity at play in "real life" economies: "Such is the logic of intensity, then on both the global and subjective level: in a world that contains no 'new' territory — no new experiences, no new markets — any system that seeks to expand must by definition intensify its existing resources, modulate them in some way(s)" (82). The pixelated territories of MMORPGs create a new "frontier" for capital in two registers: a wholly constructed and virtual market, as well as a site for the production of new experiences and modulation of existing ones through the fantasy elements and role-playing practices that loom large in both these games and critical writings about them. One might ask, however, in light of the findings of the Hammersmith study and the testimony of an increasing number of EverQuest Widows, if the kind of fantasy or experience created on the subjective level by this new territory introduces a level of compulsion and immersion more intense than any earlier market (including the most often cited emblem for the new economy — the casino). And yet, if EverQuest seems less like leisure or "pastime" and more like operant conditioning or a bundle of commands, its control and communication structure also hails not negation or foreclosure ("put a warning label on it" [14]) but a redirection and manipulation of these new practices of mediation — practices that I will argue are already active in the emerging discipline of affective computing and being productively redirected in the EyeTap technology of Steve Mann.

In Love with the Machine: Hacking Emotional Bandwidth

CARA is online to assist (and affect) you. Emotional Avatar CARA fielded mortgage questions for UK bank First Direct between December 2002 and August 2003. Alternate versions can currently be demoed at Lexicle systems (http://www.lexicle.com/products/demo.html).

Even the simulation of an emotion tends to arouse it in our minds.
Charles Darwin, The Expression of the Emotions in Man and Animals (285)

Although, as indicated above, critics of video games may be largely blind to the affective operations provoked by their mediation, this point is old news to programmers. The logic of intensity abounds in design literature. One designer goes so far as to underscore the importance of physiology by noting the prevalence of the "blink rate" — the old Madison Avenue goal of making an advertisement so compelling that a viewer will not blink during its unfolding — in game planning (Quittner).

This interest (and the similar movement in MMORPG technology towards more realistic and emotive forms of embodiment) coincides with two larger trends in computer research and corporate application: "emotional bandwidth" [15] — Mitch Kapor's measure of a communication technology's capacity to transmit "subtle emotional cues" alongside informatic content — and "affective computing" — a blanket term applied to strategies for attending to the affective responses and needs of hardware and software users. [16] These technologies similarly find their most ubiquitous manifestation in the form of virtual avatars, their most consistent practice as pedagogical, and their most lucrative application in those three corporate functions that require the greatest capacity for "soft skills" — sales, training, and customer service.

The last of these three provides the most "visible" application of developing technologies of language-recognition and emotive behavior engines. Banks, in particular, increasingly are adding Responsive Virtual Human Technologies (RVHT) applications to ATM screens because the "human touch" provided by a humanoid avatar with recognizable facial expressions quickly outpaces text-only instructions, which typically go skimmed or unread. However, it is the response to the invisible functions of audio-only telephonic avatars — those that can detect the affects in a caller's speech and modulation — that is most telling. Application of these programs to both sales and customer service produced mixed results: although customer satisfaction grew, individuals proved reluctant to give up their connection to the avatar, and a move meant to save time and resources (a typical audio-avatar being able to service up to fifty callers at a time) proved counterproductive as call times increased dramatically. Either customers were hip to the machine and enjoyed performing their own iterations of the Turing Test, or, and this is the compulsion suggested by customer polling, they were averse to ending their time with the lively entity on the other end of the phone, one implicated with them in an affective feedback loop and adapting to their every cadence.
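The affect-detecting avatars described above rest on the idea that a caller's state can be estimated from how something is said rather than what is said. The sketch below is a deliberately crude illustration of that idea, not a description of any deployed RVHT system: it scores a waveform on two rough prosodic proxies, overall loudness and zero-crossing rate, and maps them to a coarse label. Real systems model pitch, tempo, and spectral features with far more care.

```python
import numpy as np

def crude_affect_estimate(samples: np.ndarray) -> str:
    """Toy illustration only: score a mono waveform on two crude prosodic proxies
    (loudness and zero-crossing rate) and map them to a coarse affect label."""
    samples = samples.astype(float)
    rms = np.sqrt(np.mean(samples ** 2))                             # overall energy
    zero_crossings = np.mean(np.abs(np.diff(np.sign(samples)))) / 2  # rough pitch/agitation proxy
    if rms > 0.2 and zero_crossings > 0.05:
        return "agitated"
    if rms < 0.05:
        return "subdued"
    return "neutral"

if __name__ == "__main__":
    t = np.linspace(0, 1, 8000, endpoint=False)
    quiet = 0.02 * np.sin(2 * np.pi * 120 * t)   # soft, low-pitched tone
    loud = 0.5 * np.sin(2 * np.pi * 400 * t)     # loud, higher-pitched tone
    print(crude_affect_estimate(quiet), crude_affect_estimate(loud))
```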

Further complicating this schema is the use of affective computing and RVHT in training personnel in soft skills. In addition to the direct line opened between the customer and telephonic avatars, research has been performed into how this technology might also be used in disciplining the responses of human telephone representatives by having them practice with "emotive virtual reality agents" (Link). This poses a somewhat thornier question than the one prompted by the direct application of RVHT in customer service: "Who's on the line again? The machine or the human trained by the machine?"

Affective computing produces a new vector for corporate interest in human-machine interaction, a point emphasized by the transnational SAFIRA project:

The goal of affective computing is to allow a humanization of the interface, to make interfaces address a wider range of human experience than those traditionally addressed by computer science. In order to accomplish this, affective computing must address a broader, culturally appropriate notion of emotion, not only as understood by the cognitive science models of emotion which motivate much computer research in the area, but also in such areas as depth psychology, the humanities, and art. (Paiva)

Although it is no longer surprising to find computer science (particularly in the register of corporate research) speaking the language of the humanities and aesthetics, we might question the implications of this second category of avatars and the replication/dissemination of affect they deploy. Both the service and training avatars run on a logic of trained response and behavioral conditioning, and both signify a new openness to the productive capacities of cybernetic programmability. And yet, they also threaten a directionless back-and-forth and a bizarre replication of the same. We will be only as emotional as our machines program us to be (as long as they promise to do the same). "Operator?"

Perspectacles

The splinter in your eye is the best magnifying glass.
Theodor W. Adorno, Minima Moralia: Reflections from Damaged Life (50)

So far I've argued that a certain reading of cybernetics has neglected the more productive aspects and strategies of the movement (namely, an emphasis on the pragmatic particulars of human-machine interaction, and a recognition of the role affect and response play in these practices); that the same mode of interpretation haunts contemporary analyses of technologies of mediation characterized by traditional methods of critical distance and critique tied to representation; and that MMORPGs like EverQuest and various modes of affective computing simultaneously produce new alliances in human-machinic interactions, but replicate a certain unproductive distance from these effects and a reduction of the constructive potentialities of these interactions. I would like to end by considering a related — but perhaps more vital — mobilization of mediating technologies' affects and effects: the EyeTap networks of Steve Mann.

And yet, this immediately seems like an odd choice. Mann, who has sported various forms of wearable computing since the 1970s, is certainly a forerunner for novel interactions between humans and machines, but his highly public ethos shouts negation and critical distance. Having assembled a computing platform "from items salvaged from dumpsters, and from curbside trash" ("Post-Cyborg"), Mann expresses a resistance to closed-source software and a boycott mentality that would make the most hardcore shareware advocates look like Microsoft dupes. Even the EyeTap technology on which I wish to focus here is cathected to two philosophies that are a far cry from the productive capacities of the "diminished" agency engendered by cybernetic self-experimentation and programmable subjectivity for which I have been trying to make a case in this essay. Mann's concept of "humanistic intelligence" (HI) is pitched as "a step toward a foremost principle of the Enlightenment, that of the dignity of the individual" through the "prosthetic transformation of the body into a sovereign space" (Mann/Niedzviecki 30). Similarly, his program for "existential computing" stipulates that the key to empowerment relies on greater control by the individual:

Existentiality denotes the degree to which the individual has control of personal technology systems...A wearable computing system that allows the user to control all inputs and outputs has existentiality. A "smart" room that automatically adjusts lighting and environment regardless of the occupant's wishes does not provide existentiality. (262)

Mann's renunciation of certain technologies (closed-source software) and attempts to reclaim the Enlightenment subject through greater control of technology might imply a certain notion of transcendence that is uncomfortable in our present, posthuman, moment — a suggestion that the appropriate response to these technologies is to somehow remain outside of them or gain greater control of them. [17]

In spite of these conceptual frameworks that might underpin them, I want to argue that Mann's "selectively mediated reality" EyeTap goggles, which utilize Image Recognition Technology to alter his immediate field of vision, create a perspective and a practice that provide strategies for surfing cybernetic flows — ones that work immanently through these technologies and configure an agency far different from the one Mann attempts through humanistic intelligence or existential computing. Despite Mann's reliance on a certain mode of separation between his agency and that of his wearable computing, he admits the degree to which it has become part of his "normal" perception:

It's like an additional sensory organ that I've had running for so long that it, it's — parts of the brain have developed around it and then suddenly when it's shut down it's like sort of riding blindfolded or with ear plugs in. You know? It's like removing some sensory capability that should be there or at least from my own perception should be there. ("Script") [18]

For Arthur Kroker, this is a dangerous familiarity, one that in practice abandons the critical distance and control Mann invokes in his writings:

Steve Mann for all that I like him I would say that he's unreflective on the extent to which he has made his body an extension of the technology itself. And he — therefore his experiment on the technology is important not in terms of its technology. But in terms of what it says culturally. When human beings passively and actively, very actively and happily accept being an extension of the technology and shut down any critical consciousness of that. ("Script")

But just what is this perspective that has become the norm for Mann and implies a loss of critical consciousness for Kroker? Consider first another pair of glasses, this one fictional. In John Carpenter's dystopian film, They Live, Roddy Piper's character Nada stumbles upon spectacles that allow him to see the subliminal import of everything around him. These glasses peer through the immediate level of representation to display their real signification, in this case a series of directives ordering humans to consume, revere money, and maintain the status quo. Billboards, for instance, are revealed "actually" to signify directives such as "conform" and "obey," and paper currency is inscribed with the message "this is your god."

The face of contemporary criticism? They Live screenshots.

When Nada becomes aware through his new vision that aliens are exploiting humans and treating the earth like "their third world," he struggles to articulate this truth to others. He struggles in the literal sense of the word with Keith David's character Frank, attempting to force the eyeglasses onto him in the midst of a violent and absurdly protracted fight. In a hyperbolic vein, one might say that these glasses represent certain prevalent critical modes of interpretation and critique (such as those cited in the second section of this essay), a constant revelation of the ideological and subliminal value of all media — "What does it mean?" Nada's reaction is mirrored as well in a concomitant attempt to make this "meaning" known and accepted by others (a process often described, perhaps accurately, as a "struggle"). Mann's glasses, working more in the mode of manipulation and response than interpretation, hail other images:

Screenshots from Steve Mann's EyeTap goggles (image from Mann's "Mediated"): "This is done using one of the applications of the WearComp system, something I call Visual Memory Prosthetic. In this function, visual information - what I see - is temporarily recorded in a memory buffer. This allows me to then instruct the computer to block out all images of that nature in the future. WearComp's Visual Memory Prosthesis can help us forget or not see at all, as well as remember and enhance vision. It can, effectively, filter out unwanted visual detritus through the EyeTap" (Mann/Niedzviecki 39-40).

The important move here is not so much a disruption of communication as the self-experimental and creative ethos that drives (the aptly named) Mann's interaction with machinery and machinic consciousness. Mann describes how the "visual memory prosthetic" of the EyeTap goggles works in Cyborg:

In this function — what I see — is temporarily recorded in a memory buffer. This allows me to then instruct the computer to block out all images of that nature in the future.... The technical process by which this is done is relatively simple: I can take an electronic snapshot of an ad and tell my computer to block that ad from my vision in the future.... In this way, I can set my programming to delete any advertisements for cars, cleaning products, or condoms — any of certain selected advertisements placed into a kill file. If I prefer, I can use the WearComp system simply to eliminate all specific billboard advertisements from my vision. (39-40)
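The passage describes the behavior of the kill file but not its machinery, so the following is only a loose analogy of my own, not Mann's implementation: each snapshotted advertisement is reduced to a small fingerprint, and every incoming frame whose fingerprint lands close enough to one in the kill file is blanked before it reaches the eye. Mann's actual WearComp pipeline is considerably more sophisticated (it must find and track an ad within a moving scene), but the sketch captures the programmable filtering at stake.

```python
import numpy as np

def average_hash(image: np.ndarray, size: int = 8) -> np.ndarray:
    """Crude perceptual fingerprint: downsample to size x size blocks, threshold at the mean."""
    h, w = image.shape
    block = image[: h - h % size, : w - w % size]          # trim so blocks divide evenly
    block = block.reshape(size, block.shape[0] // size,
                          size, block.shape[1] // size).mean(axis=(1, 3))
    return (block > block.mean()).flatten()

def mediate_frame(frame: np.ndarray, kill_file: list, max_distance: int = 10) -> np.ndarray:
    """If the frame's fingerprint is close to anything in the kill file, blank it out;
    otherwise pass it through unchanged (a stand-in for EyeTap's selective mediation)."""
    fingerprint = average_hash(frame)
    for blocked in kill_file:
        if np.count_nonzero(fingerprint != blocked) <= max_distance:
            return np.zeros_like(frame)                    # filter out the "visual detritus"
    return frame

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    billboard = rng.random((120, 160))                     # stand-in for a snapshotted ad
    street = rng.random((120, 160))                        # unrelated scenery
    kill_file = [average_hash(billboard)]
    print(mediate_frame(billboard, kill_file).any())       # False: the ad is blanked
    print(mediate_frame(street, kill_file).any())          # True: scenery passes through
```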

We might find a precursor for EyeTap in a canonical text of cybernetic theorizing, "What the Frog's Eye Tells the Frog's Brain" (1959), which analyzes the results of assaying a frog's vision by recording from electrodes implanted in its optic nerve. The experiment found that "the eye speaks to the brain in a language already highly organized and interpreted, instead of transmitting some more or less accurate copy of the distribution of light on the receptors" (251). A frog's vision was found not to present an "accurate" portrayal of the territory it surveyed, but, rather, a biologically useful filtering of that territory: items of interest, such as flies in motion, were more easily perceived and foregrounded, whereas items of little natural interest to the frog elicited weak responses. The authors concluded that this process has "more the flavor of perception than of sensation," i.e., it suggested that the outputs of the frog's eyes already presented a certain interested interpretation of reality rather than a direct picture of "reality" to be passed on to higher nerve functions for processing and decoding (253). This conclusion influenced co-author Humberto Maturana to alter his cognition studies in order to "treat seriously the activity of the nervous system as determined by the nervous system itself, and not by the external world" (Introduction xv) and to "take seriously the indistinguishability in the nervous system between perception and hallucination" (xvi). However, the material application of these findings, in accord with Maturana's theoretical extrapolations but in the service of altering (rather than analyzing) perspectives, is perhaps best epitomized in EyeTap. Although a large number of technologies have arisen around humans' attempts to filter out unnecessary or unwanted visual stimuli (sunglasses, etc.) and amplify others (eyeglasses, microscopes, etc.), Mann's technology intensifies this process to unprecedented degrees of programmability, ubiquity, and experimentation.

Preprogramming the goggles to detect and distort advertisements (in the situation pictured above, the image has been additionally written over by another user receiving the inputs from Mann's goggles from afar) prompts a sequence of actions where Mann must shift between being the operator and subject of experiment, a shift marked only by the putting on and taking off of the spectacles. Mann becomes, in a cybernetic sense, both programmer and programmed — the "my programming" of Mann's statement above becoming a reference to both his action of programming and the programming of himself — as agency and intention (let alone control) become harder to pinpoint. At the intersection of contemporary mediation technologies and the genealogies of both cybernetics and asceticism, Mann has (perhaps, and appropriately, inadvertently) (re)created a new technology of the self: an act performed on the self by the self that seeks to alter subjectivity by treating these technologies and their consequences as practices and effects rather than objects of representation or critique. Concomitantly, he shifts from the analysis of perspectives to the alteration of the same.

Coda: Surfing the Cybernetic Age

The entire concept of modern day surfing is built on a framework of creative stunts....
James Wagenvoord and Lynn Bailey, How to Surf (89)

I argue above that Mann's experiments in mediated reality create a certain fluid doubling of the programmer and the programmed. Brian Rotman approaches this phenomenon in a recent essay, one written over a half-century after the essay by Turing that began this paper, and which begins with the opposite query: whereas Turing uses the notion that intelligent machines are beginning to think like humans as a point of departure for "Computing Machinery and Intelligence," Rotman begins with a Deleuzoguattarian notion of the machinic self of humans to suggest that human subjectivity is beginning to function like a particular intelligent machine (parallel processing computers). Rotman argues that a certain multiplying is emerging everywhere, that the increasing cotemporality manifested in digital and networked technologies is always producing a post-serial subjectivity, a feeling of becoming beside oneself. He ends by questioning readers as to whether they want to return to the illusion of a cohesive single self or embrace this new creation, a question he admits is technically "fake":

do we really have much choice? Aren't we being dragged, like it or not, into a world of ever more numerous co-happenings? Isn't parallel computation and the relentless co-presencing and distribution it facilitates/demands already starting to control numerous social and cultural sites where subjects are produced? Isn't the multiple optic and its post-collage image already deeply embedded in various semiotic and art practices constituting appropriately multiple visual subjects?

And yet, he writes, in aesthetico-ethical terms, "the question couldn't be more real." Although recovering the serial self is impossible, the post-serial self either can be embraced or condemned.

The economies of human programming and programmability that I have tried to recuperate here through a certain retrofitting of cybernetic aims prompt questions of agency and transformation similar to both Turing's and Rotman's approaches to human-machine interaction. At best, this recuperation offers not "resistance" to any of the powers of control and communication, but rather a certain toolbox for "surfing" these flows, an application of them through a logic of inversion, reversal, and manipulation. Although in relation to cyberculture, the term "surfing" finds its most ubiquitous use in that greatest of contemporary time-wasting activities — surfing the net — we might do better by thinking of it in two different registers: the physiological and strategic practices of the water sport, and its relation to the acts that originally inspired Norbert Wiener to deem the new study of control and communication "cybernetics" — after the Greek term for steering or navigation. [19] Both imply not a resistance to the forces in which one is immersed, but instead strategies for the redirection of them and practiced reaction to them. It is at this vector, I'd like to suggest, that we must begin searching for our ethical responses to and material applications of cybernetic technologies. "The techniques and experiments described here have been used and are being used by agencies official and non official without your awareness and very much to your disadvantage any number can play." [20]

Acknowledgements

Thanks to Richard Doyle, Stuart Selber, and anonymous readers for Post Identity for their insightful comments on an earlier draft of this essay.

Jeff Pruchnic is a Ph.D. candidate in the English Department of the Pennsylvania State University specializing in rhetorics of science and technology. He is currently at work on a dissertation studying the emergence of cybernetics and the control sciences tentatively titled Neuropolitics. He can be contacted directly at jap334@psu.edu.

End Notes

1. In a somewhat more nuanced reading of Turing's essay, Elizabeth A. Wilson argues that "what is put into action is not the expulsion of the body, but its deliberate restraint" (Neural Geographies 110):

To this end, the body is never radically absent from Turing's field of cognition; rather, it has been fabricated and naturalized as a benignly noncognitive entity. Turing's fantasy of a discrete cognitive domain of pure intellectual communication between cognizing subjects is premised not on the eradication of the body, but rather on an attentive constraint and management of corporeal effects. (111)

Though more attentive to the importance of bodies (particularly female bodies) in Turing's thinking, Wilson, like Hayles, focuses narrowly on the definitional aspect of "Computing Machinery and Intelligence" - the conceptualization of a "cognitive domain" that might be labeled (human) "intelligence." Although her more recent "Imaginable Computers: Affects and Intelligence in Alan Turing" repeats this depiction of the "philosophically and empirically moribund" definition of intelligence in "Computing Machinery" (41), as her title suggests, it pays more attention to the role of affects in Turing's thought, focusing on Turing's tendency to be "surprised" by computers and his interest in the relationship between infant structures and computer programming.

2. The duo are responding to an objection by Richard Taylor ("Comments on a Mechanistic Conception of Purposefulness") included in the same issue of Philosophy of Science; indeed they do less than convince Taylor, whose follow-up to Rosenblueth and Wiener's response is also included in the same issue.

3. Lilly was an acknowledged researcher and user of Ketamine, a drug rumored to allow communion with the dead, but most commonly prescribed in the U.S. as an anesthetic for felines (see chapters 19 and 20 of his The Scientist).

4. The heady mix of human desire and artificial intelligence Berlinski describes is far from extraordinary in applications of the Turing Test. Turing's biographer, Andrew Hodges, relates that the Loebner Prize-winning program for 1991 competed on the topic of "romantic conversation" and the winner for 1994 on "sex." The next year's winner, competing in the first contest in which programs were not limited to a particular topic, still produced a transcript dominated by sex (see Hodges' online scrapbook).

5. Though first used by John Dewey in explicit reference to economics, "occupational psychosis" was expanded by rhetorician Kenneth Burke to describe all instances when an individual's habituation to a certain way of perceiving phenomena makes it difficult to see them another way (academics included). See Burke 37-49.

6. Sony claims their tracking of the number of players simultaneously online topped out at upwards of 120,000.

7. Nick Yee provides a thorough overview of his and others' attempts to survey average usetime (see his "Codename Blue"). Average usetime surveys are compiled on a voluntary basis and therefore likely to attract more enthusiastic players and concomitantly skew demographic results.

8. Jay David Bolter and Richard Grusin's interaction with the work of Marshall McLuhan is exemplary in this regard. Although they follow McLuhan's assertion that "the 'content' of any medium is always another medium" (McLuhan 23) for their analysis of the relationships between current and older media, they largely jettison McLuhan's overarching concerns with media's physiological import (media as a more than metaphorical extension of human nervous systems and sensory capacities) and the various levels of interactivity required by different media (foregrounded in McLuhan's famous distinction between "hot" media — those that require little rational/non-rational contribution from the receiver to be complete — and "cool" media — those that require a high degree of interactivity). Instead, Bolter/Grusin's focus on the remediating aspects of contemporary digital media speaks more about the psychoanalytic, economic, social, aesthetic, and historical contexts of new technologies than an individual's immediate interactions with them (see Bolter/Grusin's third chapter, "Networks of Remediation," 64-84).

9. Bernard Schiele and Gertrude J. Robinson provide a survey of the status of "effects research" in the study of electronic media as an appendix to Sturm's anthology. The story of its relation to similar practices in other sciences (such as Libet's contributions), and to the parallel investigations of effects carried out in the humanities over the same decades by figures such as Massumi, in the genealogical methodologies of Michel Foucault, and in the pragmatic methodologies of Deleuze and Guattari, has yet to be written.

10. One might consider other impetuses behind engaging EQ's dopamine-rich landscape; Tony Lamont Bragg, convicted of manslaughter after placing his nine-month-old son (who somehow sustained a punctured lung and broken collarbone in the process) in a closet during a 24-hour-plus EQ marathon, is usually pointed to as the cautionary poster boy in narratives of EverQuest addiction. However, Shawn Woolley, a diagnosed schizophrenic who committed suicide after a prolonged period of playing EverQuest for upwards of 12 hours a day, is perhaps the more illustrative case. Drugs acting on the dopamine system are among the most commonly prescribed for managing schizophrenia, but E-Quest outpaces L-Dopa in availability, sustainability, and ease of use.

11. Steven Shaviro's recent work has expanded on this relationship between cyberculture and drugs, emphasizing the psychedelic or synaesthetic nature of contemporary mediation technologies. For Shaviro, "Whenever we worry about drugs altering our bodies and minds, we should remember that cars and television do this too" (185):

Psychedelic drugs and electronic technologies affect the sensorium in strikingly similar ways. They both disperse and decenter subjectivity. Consciousness is scattered all across space, and yet in a strange way intensified. Concentration is no longer possible, for too many things are happening all at once. . . . Yet, despite the isolation and fragmentation, everything seems to be mysteriously connected. (188)

12. "Set and setting" appears as a trope both implicitly and explicitly in the work of many investigators into hallucinogens and other consciousness-altering substances such as Humphrey Osmond and Timothy Leary. I use it here in reference to its more popular appearances in analyses of addiction. This second type of usage enters popular parlance with the publication of Norman Zinberg's Drug, Set, and Setting: The Basis for Controlled Intoxicant Use (sections available online: <http://www.psychedelic-library.org/zinberg.htm>).

13. The "strictly virtual" economy of the game itself has attracted a large amount of attention as well, including a very popular article by University of California Fullerton Economics professor Edward Castronova.

14. The only warning label Sony has assigned to EverQuest does indeed center on its nonsignifying aspects: a photosensitive seizure warning posted on its website alerts players to the effects the game might produce on their physiology (<http://www.station.sony.com/en/services/help/photosens.jsp>).

15. This term was apparently coined by Lotus founder Mitch Kapor in a contribution to the Buddhist-influenced magazine Tricycle. Appropriately, its use and application have since drifted to less "enlightened" programmers and corporate planners.

16. Picard provides a good account of the emergence of this concept and of its current developments in her book of the same name (Affective Computing).

17. On the other hand, in commenting on his "cyborg in(ter)ventions," Mann seems to suggest a more immanent approach of working through these technologies ("Existential" 20). Through a series of installations and performance pieces, Mann (and others) do in fact diminish their agency by, for instance, allowing certain technologies they wear to be controlled by another individual, or by operating a computer mouse "ouija-board" style so that it becomes indiscernible which (if any) of the participants is liable for the actions performed. Although Mann suggests these practices attempt to make the individual more "free," this largely cashes out as a freedom from responsibility (if the actions are controlled by another or by a group, the individual cannot realistically be compelled to account for them — this in turn creates a parody, or perhaps absurdist repetition, of the modes of corporate doubletalk and responsibility-dodging that Mann is critiquing). Mann himself expresses ambivalence about these attempted inversions in a recent essay, wondering whether, at a time when it is becoming hard to tell the difference between "culture jamming" and "culture spamming," and between activism and "Gaptivism," the idea of an "inverse" to technologies such as electronic surveillance makes much sense ("Post-Cyborg").

18. Mann relates elsewhere that disconnection from the EyeTap technology has resulted in "disorientation and flashbacks" (Mann/Niedzviecki 12).

19. For the background of Wiener's derivation of "cybernetics" from the Greek term κυβερνήτης, see Watanabe's commentary on his early work (215).

20. Spaces in original. This line is from William S. Burroughs' Nova Express (115). Here Burroughs, speculating on the coming cybernetic age of communication and control, approximates our appropriate response.

Works Cited

Adorno, Theodor W. Minima Moralia: Reflections from Damaged Life. Trans. E.F.N. Jephcott. New York and London: Verso, 1974.

Berlinski, David. The Advent of the Algorithm: The Idea that Changed the World. New York: Harcourt, 2000.

Bernstein, Charles. "Play it Again, Pac-Man." Postmodern Culture 2.1 (1991). Project Muse. 23 Sep. 2003 <http://muse.jhu.edu/journals/postmodern_culture/v002/2.1bernstein.html>.

Bolter, Jay David and Richard Grusin. Remediation: Understanding New Media. Cambridge: MIT Press, 1998.

Burke, Kenneth. Permanence and Change: An Anatomy of Purpose. 3rd ed. Berkeley: U of California P, 1984.

Burroughs, William S. Nova Express. New York: Grove Press, 1987.

Carpenter, John, dir. They Live. Screenplay by John Carpenter. Universal, 1988.

Castronova, Edward. "Virtual Worlds: A First-Hand Account of Market and Society on the Cyberian Frontier." CESifo Working Paper Series No. 618 (December 2001). Social Science Research Network Electronic Library. 23 Sep. 2003 <http://ssrn.com/abstract=294828>.

Darwin, Charles. The Expression of the Emotions in Man and Animals. Eds. Paul H. Barrett and R. B. Freeman. The Works of Charles Darwin. Vol. 23. London: William Pickering, 1989. <http://www.human-nature.com/darwin/emotion/chap14.htm>.

Deleuze, Gilles. "Postscript on Control Societies." Negotiations, 1972-1990. Trans. Martin Joughin. New York: Columbia UP, 1995. 177-182.

Douglas, Christopher. "'You Have Unleashed a Horde of Barbarians!': Fighting Indians, Playing Games, Forming Disciplines." Postmodern Culture 13.1 (2002). Project Muse. 23 Sep. 2003 <http://muse.jhu.edu/journals/postmodern_culture/v013/13.1douglas.html>.

EverQuest Widows Group. 7 Apr. 2003 <http://groups.yahoo.com/group/EverQuest-Widows/>.

Guattari, Félix. "Machinic Junkies." Trans. Chet Wiener. Soft Subversions. Ed. Sylvère Lotringer. New York: Semiotext(e), 1996. 101-105.

Hayles, N. Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: U of Chicago P, 1999.

Herman, Andrew, and John H. Sloop. "'Red Alert!': Rhetorics of the World Wide Web and 'Friction Free' Capitalism." The World Wide Web and Contemporary Cultural Theory. Eds. Herman and Thomas Swiss. New York: Routledge, 2002. 77-98.

Hodges, Andrew. "The Turing Test in Practice: 1950-2001." 16 Sep. 2003 <http://www.turing.org.uk/turing/scrapbook/gsoh.html>.

Kapor, Mitch. "Emotional Bandwidth." 11 Sep. 2003 <http://www.kapor.com/homepages/mkapor/emotional-bw.html>.

Koepp, M. J., et al. "Evidence for Striatal Dopamine Release During a Video Game." Nature 393 (1998): 266-268.

Lettvin, J. Y., H. R. Maturana, W. S. McCulloch, and W. H. Pitts. "What the Frog's Eye Tells the Frog's Brain." Embodiments of Mind. By McCulloch. Cambridge: MIT Press, 1965.

Libet, Benjamin. "Unconscious Cerebral Initiative and the Role of Conscious Will in Voluntary Action." The Behavioral and Brain Sciences 8 (1985): 529-566.

Link, Michael W., et al. "A Test of Responsive Virtual Human Technology as an Interviewer Skills Training Tool." Proceedings of the 2002 Annual Conference of the American Association for Public Opinion Research. 2002. 28 Sep. 2003 <http://www.cs.duke.edu/~cig/papers/AAPOR02_Link_Armsby.htm>.

Lilly, John C. Programming and Metaprogramming in the Human Biocomputer: Theory and Experiments. New York: Julian Press, 1972. <http://www.city net.com/~mbt/pamithb.html>.

—. The Scientist: A Novel Autobiography. Philadelphia: J. B. Lippincott, 1978.

Mann, Steve. "Existential Technology: Wearable Computing is Not the Real Issue!" Leonardo 36.1 (2003): 19-25. Project Muse. 20 Sep. 2003 <http://muse.jhu.edu/journals/leonardo/v036/36.1mann.pdf>.

—. "Mediated and Deliberately Diminished Reality." 28 Sep 2003 <http://about.eyetap.org/library/weekly/aa012301a.shtml>.

—. "The Post-Cyborg Path to Deconism." CTheory 18 Feb. 2003. 28 Sep 2003 <http://ctheory.net/text_file.asp?pick=368>.

—, and Hal Niedzviecki. Cyborg: Digital Destiny and Human Possibility in the Age of the Wearable Computer. New York: Doubleday, 2001.

Massumi, Brian. "The Autonomy of Affect." Parables for the Virtual: Movement, Affect, Sensation. Durham: Duke UP, 2002. 23-45.

Maturana, Humberto R. Introduction. Autopoiesis and Cognition: The Realization of the Living. By Maturana and Francisco J. Varela. Boston Studies in the Philosophy of Science 42. Boston: D. Reidel, 1980.

McLuhan, Marshall. Understanding Media: The Extensions of Man. New York: McGraw-Hill, 1964.

Nakamura, Lisa. Cybertypes: Race, Ethnicity, and Identity on the Internet. New York: Routledge, 2002.

Nealon, Jeffrey T. "Empire of the Intensities: A Random Walk Down Las Vegas Boulevard." Parallax 8.1 (2002): 78-91.

Paiva, A., et al. "SAFIRA—Supporting Affective Interactions in Real-time Applications." 28 Sep. 2003 <http://gaips.inesc.pt/safira/publications/P-SAFIRA-Overview.pdf>.

Quittner, Joshua, et al. "Are Video Games Really So Bad?" TIMEasia 10 May 1999. 11 Sep. 2003 <http://www.time.com/time/asia/asia/magazine/1999/990510/video1.html>.

Rosenblueth, Arturo, and Norbert Wiener. "Purposeful and Non-Purposeful Behavior." Philosophy of Science 17.4 (1950): 318-326.

—, and Julian Bigelow. "Behavior, Purpose and Teleology." Philosophy of Science 10.1 (1943): 18-24.

Rotman, Brian. "Becoming Beside Oneself." 1998. 15 Sep. 2003 <http://www.wideopenwest.com/~brian_rotman/becoming.html>.

"Script: Cybersouls." 28 Sep. 2003 <http://www.open2.net/digitalplanet/souls/Script3/scriptp1.htm>.

Schiele, Bernard, and Gertrude J. Robinson. "Effects Research at the Crossroads." The Emotional Effects of Media: The Work of Hertha Sturm. 53-57.

Shaviro, Steven. Connected, or What It Means to Live in the Network Society. Electronic Mediations 9. Minneapolis: U of Minnesota P, 2003.

Sturm, Hertha. "The Missing Half-Second." The Emotional Effects of Media: The Work of Hertha Sturm. Ed. Gertrude Joch Robinson. Montreal: McGill UP, 1987. 37-44.

—, and Marianne Grewe-Partsch. "Television - The Emotional Medium: Results from Three Studies." The Emotional Effects of Media: The Work of Hertha Sturm. 25-36.

Taylor, Richard. "Comments on a Mechanistic Conception of Purposefulness." Philosophy of Science 17.4 (1950): 310-317.

—. "Purposeful and Non-Purposeful Behavior: A Rejoinder." Philosophy of Science 10.1 (1943): 327-332.

Turing, A. M. "Computing Machinery and Intelligence." Mind: A Quarterly Review of Psychology and Philosophy 59.236 (1950): 433-460. <http://www.abelard.org/turpap/turpap.htm>.

—. "Digital Computers Applied to Games." Faster Than Thought. Ed. B. V. Bowden. London: Pittman, 1953. 286-310.

—. "Solvable and Unsolvable Problems." Science News 31 (1954): 7-23.

Turkle, Sherry. Life on the Screen: Identity in the Age of the Internet. New York: Simon & Schuster, 1995.

Wagenvoord, James, and Lynn Bailey. How to Surf. New York: Collier, 1968.

Watanabe, Sumio. "Wiener on Cybernetics, Information Theory, and Entropy." Norbert Wiener: Collected Works with Commentaries. Vol 4. Ed. Pesi R. Masani. Cambridge, MA: MIT Press, 1985. 215-218.

Wicked, Jin. The Universal Turing Machine. 28 Sep. 2003 <http://www.jinwicked.com/en/art/drawings/turing.html>.

Wiener, Norbert. Cybernetics: or Control and Communication in the Animal and the Machine. 2nd ed. Cambridge: MIT Press, 2000.

Wilson, Elisabeth A. "Imaginable Computers: Affects and Intelligence in Alan Turing." Prefiguring Cyberculture: An Intellectual History. Cambridge, MA: MIT Press, 2003. 38-51.

—. Neural Geographies: Feminism and the Microstructure of Cognition. New York: Routledge, 1998.

Yee, Nicholas. "Codename Blue: An Ongoing Study of MMORPG Players." 23 Sep. 2003 <http://www.nickyee.com/codeblue/home.html>.

Zinberg, Norman. Drug, Set, and Setting: The Basis for Controlled Intoxicant Use. New Haven: Yale UP, 1984.