Claire Colebrook

The Death of the PostHuman: Essays on Extinction, Volume One

    Introduction: Framing the End of the Species: Images Without Bodies

    Society invents a spurious convoluted logic tae absorb and change people whae’s behaviour is outside its mainstream. Suppose that ah ken all the pros and cons, know that ah’m gaunnae have a short life, am ay sound mind etcetera, etcetera, but still want tae use smack? They won’t let yae do it, because it’s seen as a sign ay thir ain failure. The fact that ye jist simply choose to reject whit thae huv to offer. Choose us. Choose life. Choose mortgage payments; choose washing machines; choose cars; choose sitting on a couch watching mind-numbing and spirit-crushing game shows, stuffin fucking junk food intae yir mooth. Choose rotting away, pishing and shiteing yersel in a home, a total fucking embarrassment tae the selfish, fucked-up brats ye’ve produced. Choose life.

    Well, ah chose no tae choose life. If the cunts cannae handle that, it’s thair fuckin problem. (Irvine Welsh, Trainspotting, 187-88)

    There are three senses of extinction: the now widely discussed sixth great extinction event (which we have begun to imagine and witness, even if in anticipation); extinction by humans of other species (with the endangered species of the ‘red list’ evidencing our destructive power); and self-extinction, or the capacity for us to destroy what makes us human. All three senses of extinction require a nuanced conception of climate. Climate is at once an enclosing notion, imagined as the bounded milieu that is unavoidably ours, and a disturbing figure, for it is only with the recognition that there is climate that the human species becomes recognizable as a being that, for all its seeming diversity, is nevertheless bound into a unity of destructive power. This is so much so that geologists are arriving at consensus regarding an ‘Anthropocene’ epoch, in which man’s effect on the planet will supposedly be discernible as geological strata readable well after man ceases to be, even if no geologists will be present to undertake this imagined future reading (Crutzen 2000). Climate is not only, then, the surface or terrain upon which we find ourselves, but something that binds us to this time on the earth, with its own depletions and limits.

    There is, of course, the standard meteorological notion of climate which increasingly attracts our already over-taxed attention; but this concept of climate is only possible because of a broader thought-event in which humans begin to imagine a deep time in which the human species emerges and withers away, and a finite space in which ‘we’ are now all joined in a tragedy of the commons. I would suggest that just as Darwinian evolution altered the very modes of scientific and imaginative thinking, such that new forms of narrative and knowledge were required to think of man as a species emerging within time (Beer 1983), so global climate change is similarly catastrophic for the human imaginary. It becomes possible to think of climate as the milieu that is necessary for our ongoing life, and as the fragile surface that holds us all together in one web of risked life, even if we cannot practically grasp or manage the dynamics of this totality (Gardiner 2006). The concept of climate is also split between knowledge and denial: on the one hand, talk of climate draws all bodies (organic and otherwise) into a single complex, multiply determined and dynamic whole; on the other hand, any brief glance at climate change policy and politics evidences a near psychotic failure to acknowledge or perceive causal connections with dire consequences. In this respect we need to develop a notion of climate change that includes the radical alteration of knowledge and affect that accompanies the very possibility of climate. It is only possible to think of climate change in the meteorological sense—with humans now bound to volatile ecologies that they are at once harming and ignoring—if some adjustment is made to the ways in which we think about the relations among time, space and species. A necessarily expansive sense of climate change encompasses a mutation of cognitive, political, disciplinary, media and social climates.
The fact that we start to think about climate as a general condition that binds humans to an irreversible and destructive time means both that climate becomes an indispensable concept for thinking about the new modes of knowledge and feeling that mark the twenty-first century in terms of our growing sense of precarious attachment to a fragile planet, and that climate is an alibi. We talk about climate, ecology, globalism and even environment (as that which environs) even though the experience of climate change reveals multiple and incongruent systems for which we do not have a point of view. We are at once thrown into a situation of urgent interconnectedness, aware that the smallest events contribute to global mutations, at the same time as we come up against a complex multiplicity of diverging forces and timelines that exceed any manageable point of view.

    In a recent fable that allegorized the human relation among memory, destruction and the future of life, Nick Bostrom suggests that the human species would remain complacent about its catastrophic history and future as long as it continues to forget that its situation is catastrophic. We have taken the catastrophe of human existence as natural and irredeemable: only a counter-narration in which we vanquish destruction will let us see just how death-inured we have become (Bostrom 2005). More recently, climate change scientists have started to play with new strategies for awakening public affect: perhaps the focus on hope needs to give way to mobilizations of fear, whereby we learn to ‘hug the monster,’ in order to shift from inertia and quiescence to action. [1] How is it that the human species, seemingly so hungry for life and dominance, has conveniently forgotten its own self-extinguishing tendencies? We can only pose the question of human extinction—the fact that humans will become extinct, the fact that we cause other extinctions, and also that we are extinguishing what renders us human—if we locate the problem of climate change inaction in a broader terrain of ecological destruction. The very climates—cognitive, industrial, economic, affective, technological, epistemological and meteorological—that render our life possible are also self-destructive (both destructive of the self, and destructive of climate itself).

    There is a widespread lament regarding a trajectory of self-extinction occurring in the human brain. According to Susan Greenfield, in her book ID, we are losing identity: where our brains once operated by a synthesizing power of grammar, syntax and critique, we are now seduced by a culture of stimulus (Greenfield 2008). We are not just losing one of our critical powers—our power to represent or synthesize what is not ourselves—we are losing our very selfhood. For ‘we’ are—as human, as identities—just this evolved synthesizing power. Greenfield locates her diagnosis of identity within a broader argument regarding the brain and its self-forming capacities. A certain self-loss is required for stimulus and pleasure, but a certain neural extension and order is required for meaning and self. In her earlier work Greenfield had argued for a healthy or normal balance between the capacity for the joy of fleeting sensation (such as the first taste of morning coffee) and the ability to link sensations into some broader network of selfhood and significance (Greenfield 2002). If there were no capacity to enjoy the simple moment we would suffer from depression, or an extreme search for meaning that we may never be able to fulfill; drugs that treat depression enable a release from the grip of significance. But today—perhaps—it is the fleeting insignificance that is taking over twenty-first century neural architecture. The diagnostic dimension of Greenfield's work lies in its lament regarding the new modes and temporalities of visual culture, where the transient ecstasies of video games overtake the sustained focus and pleasure of complex narrative and argument. This lament of human self-loss achieved through the over-consumption of stimulus is not Greenfield's alone. Her work keeps company with Carr's The Shallows: What the Internet Is Doing to Our Brains (2010), Jackson and McKibben's Distracted (2008), Wolf’s Proust and the Squid (2007), Winifred Gallagher's Rapt (2008), N. Katherine Hayles's (2007) theory of the transition from deep attention to hyper attention, and Bernard Stiegler's (2010) lament regarding the short circuits of transindividuation (with humans having lost the orientation of care). Precisely at the moment of its own loss the human animal becomes aware of what makes it human—meaning, empathy, art, morality—yet it can only recognize those capacities that distinguish humanity at the moment that they are threatened with extinction.

    It is possible to argue, as Giorgio Agamben (1998) has done, that there has always been a sense of the human capacity for failing to be human. We can lose ourselves—extinguish ourselves—because we are nothing more than potentiality. If humans were always and already fully human, if humanity were a simple actuality, then there would never be the possibility of failing either to realize one’s reason or to recognize rational humanity in others. This is why Agamben has isolated a last chance for redemption precisely at this point in our history when it becomes apparent that what we are is not something essential that will necessarily come into being: our humanity is not an actuality from which we can draw grounds for action. The fact that we forget our impotentiality—that we treat humans as factual beings with a normality that dictates action—has reached crisis point in modernity, especially as we increasingly suspend the thought of our fragility for the sake of ongoing efficiency. Both totalitarianism and democratic hedonism are, for Agamben, forms of deadening managerialism. Both act on the basis of man as an actuality. It is at this point of exhaustion, when we have become frozen spectators in a world in which images appear as ready-mades, that we can see both that there is no guarantee that we will be human and that it is human to forget oneself. For Agamben it is both the modern horrors of totalitarianism (where humans are reduced to so much manageable and disposable matter or animality) and modern democratic hedonism (where we become nothing more than the targeted consumers of dazzling spectacle) that demonstrate human impotentiality, our essential capacity not to actualize that which would distinguish us as human.

    Most importantly, this highly human inhumanity seems to centre strangely on the organ that organizes the human organism; for it is the same eye that reads and theorizes—that looks with wonder at the heavens—that is also seduced, spellbound, distracted and captivated by inanity. Immanuel Kant already drew on a tradition of philosophical wonder when he isolated man’s capacity to look into the heavens as both a source of delusion that would draw him away from grounded knowledge into enthusiasm, and as the necessary beginning of a power of thinking that would not be tied solely to sensation (Kant 1999, 269-70). The eye is geared to spectacle as much as speculation, with speculation itself being both productively expansive in its capacity to imagine virtual futures and restrictively deadening in its tendency to forget the very life from which it emerges. Indeed there is something essentially self-destructive about the human theoretical eye: our very openness to the world—the very relation that is our life—is precisely what seduces us into forgetting that before there is an eye that acts as a camera or window there must have been something like an orientation or distance, a relation without relation. I would suggest that we ought to think, today in an era of climate change, about moralizing laments regarding human reason's self-loss alongside various post-human theorizations that human reason is constituted by a certain self-forgetting. The human animal or human eye is torn between spectacle (or captivation by the mere present) and speculation (ranging beyond the present at the cost of its own life).

    There are two directions this criticism of the embodied eye can take: one is to expand the sense of the body, to imagine a receptive or perceptive power that is not a simple snapshot of the world but a full and expansive openness. Here we might identify a pseudo-Heideggerian criticism of Descartes that was taken up by cognitive science. Heidegger had already diagnosed Western metaphysics with Descartes as its fulcrum: Descartes is able to establish man as the ‘subject’ (or as that which remains present) because Western thought has always proceeded by forgetting the temporality through which all being comes into presence (Heidegger 1968). By the time Descartes establishes the subject as that which precedes and provides a foundation, ‘humanism’ has definitively forgotten that there is no such thing as man as a simply existing thing with an essence. For Heidegger what is required is not a retrieval of some pre-Cartesian connectedness to the world, with man and world being co-present; rather, before there is the dyad of man and world there is something like disclosure or revealing. Contemporary cognitive science and certain philosophies of the human have drawn upon this anti-Cartesianism to insist that man is not a camera, not a computer, and the eye is not a window (Wrathall and Malpas 2000; Thompson 2007; Wheeler 2005). Where such contemporary uses of Heidegger differ from Heidegger is in their diagnosis of Cartesianism as an accidental lapse rather than as evidence of humanity's self-forgetting ‘essence.’ These pseudo-Heideggerian diagnoses suggest that Cartesianism can be overcome by returning man to the richer expansive life from which he has become detached. The subtitle of Andy Clark’s book says it all: ‘putting brain, body, and world together again’ (Clark 1997).
For Heidegger, though, there is a necessary forgetting in any disclosure of being: to experience the world as present for me, and to begin questioning—as we must—from this already given world, relies upon a hiddenness or non-revealing that we must leave behind in living the world as our own. We begin in media res, always already thrown into a world that appears as so many natural and separate things. Our tendency to forget, and to live life inauthentically—not recognizing Being as the site for all clearing, as though the world were just this way naturally—is not something one can simply place behind oneself as an unfortunate philosophical error. For Heidegger, inauthenticity or humanism (where we simply take ourselves to be a privileged thing among things) is not an external and unfortunate event but has to do with the very mode of being’s appearance: we see being appear, but do not attend to its coming into being. One mode of phenomenology after Heidegger has, however, taken the form of a correction or adjustment: we should overcome the deep problems of how we know or arrive at having a world and accept that the world just is that which is always already given and meaningful for living beings. Phenomenology should be naturalized and tied to a process of embodied knowledge. We are not minds who represent a world, but organisms from which the capacity and figure of knowing mind emerged.

    But there is another path, another way of dealing with man's tendency to reify himself. This other departure from a restricted subjectivism proceeds not by broadening the self to include emotions, dynamism and the non-cognitive, but by tearing the eye from the body. Rather than restore the human to some unified and expansive vision it might be possible to think of the eye as a machine. This machine would not be a computer, for a genuine machine does not have a central organizing program but is put to work through connections; one could consider synthesizers as computers receiving inputs and turning out data, or as machines in their creation and recreation of connections. For Deleuze and Guattari, the reference to synthesizers is not another metaphor for thinking, where we substitute one machine for another. Thought is a synthesizer: just as musical synthesizers take the sounds of the world and repeat, create and mutate various differences, so thought can maximize rather than diminish the complexity of sensations:

    A synthesizer places all of the parameters in continuous variation, gradually making ‘fundamentally heterogeneous elements end up turning into each other in some way.’ The moment this conjunction occurs there is common matter. It is only at this point that one reaches the abstract machine, or the diagram of the assemblage. The synthesizer has replaced judgment, and matter has replaced the figure or formed substance. It is no longer even appropriate to group biological, physico-chemical and energetic intensities on the one hand, and mathematical, aesthetic, linguistic, informational, semiotic intensities, etc., on the other. The multiplicity of systems of intensities conjugates or forms a rhizome throughout the entire assemblage the moment the assemblage is swept up by these vectors or tensors of flight. (Deleuze and Guattari 2004, 121)

    Before exploring the ‘multiplicity of systems of intensities’ in detail, we can go back over the relation between the eye and human self-extinction, between the eye that views the world in order to enable survival, and the eye that then becomes frozen or seduced by its own imaging power—to the point where the eye takes in a frozen image of itself. Bergson has argued for an economy of the eye and creative difference: in order to release itself from merely surviving in the world, the human eye organizes the world into conceptualized units, mastering the world by reducing difference. This intellectual process allows for increasing technologies and the furtherance of systems of order: the intellect is at home with technology and matter, or that which remains the same through time and can be mastered through repetition. What is abandoned is intensity—the infinitesimally small differences and fluxes that the eye edits out. For Bergson the problem with this difference-reducing mode of the intellect arises when mind turns back upon itself and fixes upon a static image: thought is no longer intuited (as it should be) as a dynamic creative force, but appears as a brain, representing self, thinking substance or ‘man’ (Bergson 1913, 196).

    This argument for the self as not being a substance but, rather, the condition for the organized perception of substance, has a long philosophical and moral history. If Aristotle argued that what distinguished us as humans was neither mere perception of the world nor consumption of the world, but the capacity for perception and consumption to go beyond what is to consider what ought to be, and if Plato also argued that we should not merely perceive but think about that which gives itself to be perceived, this moral distinction becomes formal in modernity. That is, Plato and Aristotle concede that man is a biological being but with a capacity for reason, a capacity that distinguishes humans from other beings. But the modern theory of the subject, with Descartes positing a different substance—or res cogitans—makes a difference of kind and modality with regard to humans and their relation to images. ‘Man’ is the being to whom the world is given for representation; what man himself is can never be known in itself, but only after the event of perception of the world. For Foucault, it was this shift from a world that possessed its own order and hierarchies to some distinction between ordered world and man as representing being that marked a historical a priori: what shifted was not an event within time but the modality of time itself. In modernity historical time is that through which ‘man’ recognizes that he emerges from material conditions, at the same time as the very logic of life that requires him to labor, speak and form social wholes can only be known after the event (Foucault 1970).

    If pre-moderns sought to elevate humans among other animals, modernity increasingly rejects human superiority and refuses to see man as rational animal; for man is pure reason. Kant does not argue that we have to be more than merely biological or animal beings; he insists that we are not beings at all. Rather, there are only beings because there is something like an organizing or synthesizing power. There is a world because there is a subject to whom a world is given. It makes no sense to strive to perceive or know the self, to try to capture the self as something that might be viewed. In the beginning is a potentiality for viewing from which we constitute a viewed world. We then imagine—ex post facto—that there must be selves who would be there to be viewed. Whereas Kant argued that there must be something like the subject who existed as this condition for all intuition (even if this subject cannot be known), Bergson (1913) argued that there was no subject who intuited images, just images or perceptions from which we posit some thing—the brain—that provides the illusory image that would cause all images.

    But if pre-modern philosophy from at least Plato onwards argues that we ought not think of ourselves only as appetites, for we are responsible for our organizing relation to the world, modern philosophy argues that we are only organizing relations. There is not a self who perceives; there are perceptions, from which something like a self is constituted. We cannot explain the self’s relation to images by appeal to the interests and appetites that would be its natural base. Desires and appetites are possible only because there is imaging: in the beginning is the relation. We can think here of Freud, for whom pleasures are possible because of a prior genesis of a relation between desire and desired; the libido is a force that forms a relatively stable or ‘cathected’ pool of ongoing equilibrium, relating to the outside world in terms of its own tendency towards quiescence (Freud 2011). The desiring self is possible only because of a prior distribution that emerges from perception; a relation between self and other is formed through perception and does not precede perception.

    Something quite distinct structures modern claims for the relation between mind and image. It is not only the case that the self emerges from organizing perception, but also that perception can destroy the self. In Beyond the Pleasure Principle Freud observes his grandson throwing a cotton reel in and out of his cradle while intoning ‘Fort/Da’ (away/here), and it is from this observation that Freud argues that in addition to the self’s formation of a stable border between itself and the world, there is also a tendency to want to destroy or annihilate that distance. If pleasure consists in managing the relation between perceiver and the manageable influx of stimulus, then beyond pleasure there lies a tendency towards annihilation of distance, a dissolution of the bounded and perceiving organism.

    What if the brain that is supposedly properly (in its human mode) oriented towards synthesis were at risk of falling back, of devolution? For some time it has been noted that there is an anxiety regarding mere images: the society of the spectacle (Debord 1973), a world of simulation (Baudrillard 1994), a world of passive consumption (Adorno 2001) or mere exhibition without aura (Benjamin 2008), a world of hyper attention rather than deep attention (Hayles 2007). Such a world at once seems to destroy the brain's evolved powers, and yet also gives the lie to a certain destructive illusion regarding the brain as image. If we have lamented for so long—since Kant at least—that man tends to forget that he is a subject and tends to take himself to be just substance, then why are we so alarmed today by the brain’s tendency to destroy any image or sense of itself, to be nothing other than the stimulus it receives? After all, this loss of self seems to be the fulfillment of a long modern striving for anti-self-consciousness or pure immediacy (Hartman 1970). Have we arrived, perhaps unwillingly, at Emerson’s transformation of the self into a transparent eyeball (Emerson 1982, 39)? And yet this achievement of what was once a Romantic and existential imperative for consciousness to be nothing other than its perceptual relation to the world, a pure process without reifying ground, is being met with mourning and alarm.

    First, consider all the ways in which 'we' are now reacting with horror to our own capacity not to be ourselves.

    These reactions range from neo-Kantian claims that without the commitment to some idea of who I am—without some ongoing identification of what I would do if I were to remain true to the idea I have of myself—‘I’ am not a self at all (Korsgaard 2008, 86), to neurological claims regarding the importance of ongoing synthesis, whether Greenfield's moral anxiety over a culture of mere stimulus or Antonio Damasio's claim that the self is not, as Descartes would have it, a thing that feels, but a receptive and creative structure of feeling from which it might then be possible to have a snapshot attitude to reality. If we lose sight of that feeling self, of the emotional brain, or of the naturally affective, connected and world-oriented self, then we risk mistaking mind for mere machine or computer (Damasio 2000).

    When today—with horror—we look at young minds, we ask how they have become nothing more than cameras or computational devices. The young brains of today are not affected or world-oriented; they manipulate Facebook numbers with ruthless algorithmic force, and ingest images without digestion or rumination. We watch, with horror, as the human brain reverts to being not so much a reader of Proust as akin to a squid, or mere life (Wolf 2007). This tendency to be nothing more than a screen for images is observed as at once the brain's horrific tendency towards self-extinction (an internal and ever-present threat) and as accidental or extrinsic (something that has assaulted us from without, by way of technology and modernity).

    The diagnoses of the brain’s and humanity’s capacity to destroy itself are persistent and manifold, ranging from a supposed neural devolution caused by spectacle-stimulus culture to various anxieties about over-ingestion (where we glut ourselves on destructive images and various psychotropic drugs that diminish the brain’s synthesizing powers). And yet, at the same time, this release of the intuition of images from the organizing self of Cartesian subjectivism is hailed as redemption from the rigidity of man: no longer do we enslave ourselves to the notion of the autonomous, disembodied, affectless and world-divorced subject. One of the many and varied modes of post-humanism hails an end to human exceptionalism and cognition-oriented models, and instead begins from one already integrated, dynamic and connected world. There is no ‘really hard problem’ about the relation between mind and world, for the mind is an effect of relations, not something that has to act in order to represent a world to which it must subsequently relate (Flanagan 2007). It is not the case that we begin life as observing, representing beings who only then become receptive. Rather, in the beginning is a dynamic and world-oriented receptivity from which organized cognition, and the sense of the self as subject or man, emerge. It is from a primary openness to the world, a primarily sensed world, that there emerges the sense of one who sees.

    This ambivalent observation of the self-extinguishing tendency of the brain's capacity for imaging does not pertain only to philosophy, theory or recent theses of the brain. There are also popular accounts of our self-attrition, with our over-consumption of everything from the internet and Facebook to empty fats and calories indicating that the very mechanisms that led to our expansion are the same ones that will lead to our demise. Beyond all the laments and moral proclamations regarding our falling away from the activity of human reason, and beyond all the post-human celebrations that there is no such thing as ‘man’ and that we are really always already at one with one web of life, we might ask how it is possible for humans to have this panicked (or joyous) apprehension of self-loss. If humans really are at one with the world of which they are nothing more than living and creative perceivers, why have we felt for so long that we are disengaged and rational minds? How did ‘Descartes’s error’ take hold? Or, if mind and reason are our proper self-creating potentialities, how is it that the spectacle of the world has lured us into destroying ourselves? Why are our own creations, technologies and desires the very mechanisms that preclude us from being most properly ourselves? It is as though our excessive glutting on images—from the seduction by media labels and visual stimulus to the voyeurism of disaster porn—evidences the brain's fragile tendency to be nothing more than itself, a mere screen rather than a properly self-organizing whole. The thousands of years of evolved complexity can fall away through overconsumption.
Just as the very desire for fats and sugars that propelled the body to hunt and develop technologies for metabolic stability and survival will drive the modern body into obesity, hypertension and an early grave, so the darting eye that stimulated the brain into becoming a reading and interpreting animal may also be at the forefront of the human species' cognitive atrophy. And does this not say something profound about climate: that the human species’ damaging of its own milieu is not an accident that we might otherwise have avoided, precisely because climate—as our milieu—is something upon which we depend so thoroughly that we are precluded from ever really seeing it?

    Both of these questions—of self-destruction and milieu-destruction—are economic problems. (Both Freud and Bergson argued that the self was an effect of investment, achieved by postponing the discharge of energy and allowing a pool of force that would remain relatively stable through time.) The human animal delayed consumption of immediate resources, developed hunting and farming techniques in order to store energy, and so freed energy and resources for further technical-intellectual-moral development (Ayala 2010). The viewing eye also delayed immediate response, developing concepts and perceptive technologies that enabled greater representational sophistication. V.S. Ramachandran speculates that the self and the notion of mind emerged from a survival tendency to anticipate the actions of others (Ramachandran 2003). The viewing eye becomes a reading and organizing apparatus, allowing ‘man’ to become a subject. These same replicating technologies, and life-propelling investments—allowing us to fashion cinematic, computational, virtual reality and tele-visual technologies—would eventually sacrifice the reading brain to the merely stimulated eye.

    Apart from the general interest of observing a widespread anxiety regarding the brain's own capacity to destroy itself through the very perceptive power that generated its supposedly proper potentiality in the first place, it is possible to orient this discussion towards the perception of futurity.

    Is not the problem of both sides—the dire prediction that we are losing our capacity to synthesize ourselves and the post-human affirmation that we are really, properly, nothing more than a dynamic power to perceive—that there is still (for all the talk of loss) a reliance on a normative notion of the human, whereas what is required today is an inhuman perception? For all the talk of climate change we assume that the climate is what environs us, and that change—or the danger of change—needs to be calculated according to the degree to which it enables or precludes the ongoing existence of humans (Mann 2009). If biodiversity is a prima facie good then surely any ecosystem—even one that emerged after human extinction—would answer the requirement for ongoing life? And if biodiversity is not a prima facie good, and is only good insofar as it offers ecosystem services for humans, then the very reasons why we might finally act in order to maintain biodiversity—in order to continue to live—seem to be hampered by our drive for life. The very eye that has opened up a world to the human species has also allowed the human species to fold the world around its own, increasingly myopic, point of view. Today, we might start to question the appropriate point of view from which to observe and evaluate the human viewing eye: should it be that of our own will to survive, or would it not be better to start to look at the world and ourselves without assuming our unquestioned right to life?

    Our very narration of the brain and its emergence as the properly synthesizing milieu from which all other imaging milieus need to be considered, shelters us from the thought of the inhuman images that confront us at the limits of the embodied eye. We can recall, here, Deleuze's criticism of Bergson, which is technical and counter-vital. Bergson, like so many other early modernists, mourned the living and dynamic eye that had been sacrificed to technological expediency. For Bergson the intellect cuts up the world in order to achieve managerial efficiency and then subjects itself to that same technical calculus. The mind starts to operate with an image of itself as some type of viewing machine. Redemption, for Bergson, lies in retracing the path, regaining a vitality that would no longer be that of the bounded organism. Intuition would pass beyond its enclosed self-interests to arrive at the perception of life's duration or élan vital. For Deleuze, by contrast, the problem is that the eye remains too close to the lived. (So, today, when we demand 'reality' of television and cinema, or when we criticize cultural production for being too irrelevant or divorced from everyday life, we do so because we think there is such a thing as life and reality to which vision ought to be subordinate.) Rather than asking the eye to become organic once more, and to re-find its place in life, Deleuze asks for an inhuman perception: can we imagine the world without us, not as our environment or climate? Drawing on Bergsonism, rather than Bergson's concrete example of the fallen nature of cinematic perception, Deleuze calls on philosophy to 'open us up to the inhuman and the superhuman durations (durations which are inferior or superior to our own), to go beyond the human condition' (Deleuze 1988, 28). It is the cutting power of the eye that needs to be thought: the eye would be approached as a form of synthesizer, but as an analog rather than digital synthesizer. 
That is: the eye does not need to free itself from imposed distinctions to return to the flow of life, but should pursue ever finer cuts and distinctions, beyond its organic thresholds.

    How might we imagine a world without organic perception, without the centred points of view of sensing and world-oriented beings? Is there such a thing as perception without a world? (Think, here, of Heidegger's remark that a stone 'has' no world, which is a way of saying that a stone has no climate, for a stone has no concern for 'its' world or environment.) This would not be a world without reading, as though abandoning the eye of grammar would return us to an inhuman lifelessness. Instead, the reading would take a radically different form. After humans have ceased to be present on the planet, their history will remain readable in a quasi-human sense: the earth's strata will be inscribed with scars of the human capacity to create radical and volatile climatic changes. But one might consider a form of reading beyond this quasi-human and discerning mode: if, following Heidegger, the stone has no world, how do we account for the fossil records or archives borne by the stone? What might be thought is the extinction of the climatic eye: can we imagine a mode of reading the world, and its anthropogenic scars, that frees itself from folding the earth's surface around human survival? How might we read or perceive other timelines, other points of view and other rhythms? The fossil record opens a world for us, insofar as it allows us to read back from the brain's present to a time before reading; strata will continue beyond human reading, but if inscription continues is it too much of a stretch to say that the earth will remain as a 'reading' of at least one point of the universe? We use this term frequently in literary and art criticism, saying that a certain film offers 'a reading' of a certain event: we do not simply mean that the author was reading an event, for that reading may never have taken place. 
The earth, after humans, will offer 'a reading' of a species' history, just as we might say that Robinson Crusoe offers 'a reading' of race, empire and capitalism, even if neither Defoe nor his readers ever actualized the sense of that reading.

    Why have we assumed that reading and readability should take syntactical forms?

    Here I want to refer to what geologists have posed as the new Anthropocene era, where it is imagined—after humans—that our scar on the earth would be readable for something like a future geologist. Not only do we imagine what would be readable for a world without readers, we also have to deploy and imagine (from within geology) a different mode of stratigraphic imaging. Stratigraphy, at present, is a mode of reading past layers, but the positing of the Anthropocene era relies on looking at our own world and imagining it as it will be when it has become the past. In imagining this world after humans we are reading what is not yet written or inscribed. We can see, now, from changes in the earth's composition that there will be a discernible strata that—in a manner akin to our dating of the earth's already layered geological epochs—will be readable. This strata or text of the earth does not yet exist; we abstract from the human eye and its reading of the inhuman past, to imagine what would be readable, after humans, in a mode analogous to the human eye. One can only open up to this post-Anthropocene point of view if we start to view this world beyond the bounds of climate, and see climate as one expression—among many—of a broader time and broader (inhuman) life. Perhaps, then, the moral outrage about the death of active and synthesizing vision, or the premature hailing of the world as already post-human, needs to be tempered by the thought of the seeing brain that looks beyond itself. What we should not do is try to retrieve or repair a proper human vision; nor should we think, too easily, that we have abandoned human myopia once and for all.

    This allows for a new thought of the brain's self-extinguishing tendency. If there is an anxiety regarding the eye-brain’s seduction by images to the point of distraction, is not the figure of the evolved, self-organizing, connected and connecting brain also a lure or figure that precludes us from questioning the worth of the imaging brain? There is a strange torsion operating between the shrill cries lamenting the brain’s captivation by spectacle, and the supposedly opposing counter-image of the good flourishing mind-brain. In response we might ask seriously what all these diagnoses of the reading brain and its atrophy amount to for a thought of art and climate change.

    First, if the reading eye did have a proper mode—if the human brain had as its proper potentiality the mode of syntactical, synthesizing and world-ordering vision—how would we evaluate the last centuries of aesthetic judgment, which have relied on destroying the brain's capacity for comprehensive consumption? One does not have to be a fan of Duchamp and the avant-garde to note that there is something interesting, at the very least, in visual productions that short-circuit recognition. Indeed, one might say that climate change should not require us to return to modes of reading, comprehension and narrative communication but should awaken us from our human-all-too-human narrative slumbers.

    In Danny Boyle's 127 Hours (2010), in the film's revelatory final quarter, the central character provides a voiceover declaring that all moments in his life have been leading up to this point. The screen is split into three panels, one of which depicts the depleting battery indicator on his camcorder. At this stage, and for all the character knows, his self-made film and testimony will never be viewed, and yet—even so—he proclaims a moment of destined union between the end of his own life and the earth's history: from the first comet that struck the earth to create life to this final point of self-narration, all was destined to converge on this filmed present (or so he believes). This temporal point—one of the film's peaks—is at once a moment of human heroism, confirmed by the final scene in which the protagonist and his family, seated on a suburban sofa, view the cinematic audience just as we have viewed his triumph. And yet at the very moment that this central character's destiny is related, the film's visual field explodes into a geological vision—the camera eye is taken over by the dazzling sun, which in turn dissolves into layers of rock and water beyond human time and perception. This cinematic seduction is quite different in kind from our tendency to be captivated by faces, bodies, and objects of consumption and order.

    This geological eye operates alongside the lulling eye of the forming human power; it is not the simply destructive eye of the visual avant-garde. Nor is it merely the willful assertion of the human desire for mastery, achieved by freeing itself from instrumental and comforting images. It is a positively geological vision, a seduction not by the light that warms and illuminates but by a radiation that moves beyond organicism. This light appears through the cracks of our own survival mechanisms: in Danny Boyle's cinema alone we can see it in Sunshine (2007), where the sun towards which the space mission travels is figured as that which must be viewed but which remains unviewable, and in the sublime scene of the Sydney Opera House as a frozen wasteland in 28 Weeks Later (2007). The very titles of these films—hours, weeks, days—name intensive lived periods in which something like the unlived and unlivable takes hold.

    Jacques Rancière has commented on a certain double nature of the image that defines art. Commenting on Roland Barthes's Camera Lucida, Rancière notes that Barthes (who had begun his career by aiming to strip images of their myth or lure, reading what appeared to be enigmatically frozen as in fact the outcome of human history and labor) reversed this in Camera Lucida by affirming a dazzling power of the image as such, a power that occurs when photography becomes an art (Rancière 2009). For Rancière this is noteworthy not because of some interest in Barthes's biography, but because it discloses art's double relation to the image, a doubleness inherent not only in photography but also in the novel. For Rancière, the novel as art at once describes and images, and draws attention to (while also destroying) any simple notion that the image is secondary and effaces itself before that which it indexes. Rancière is not as indebted to the French avant-garde as Barthes, or Deleuze, for whom art is the release of affects and percepts beyond the lived, or Derrida, for whom literature is an absolute precariousness that has no referential outside other than that which it traces from itself. But there is a sense, in Rancière's notion of art as a release of the image from anything other than its own dazzling materiality without reference or relation, of a surmounting of a certain anthropocentrism of the aesthetic image. It is as though a release from systems of human reference would somehow yield the shining of light in itself.

    To this extent all art leading up to the avant-garde would be art and image only to the degree to which it was anti-mimetic, or other than any form of reference, as though art somehow were god-like, freed from any necessity to be anything at all, liberated from all constituting relations. This, for Rancière, is the two-sided nature of the image.

    I would suggest that something like a third side of the image is prompted by the thought experiment of extinction. By referring to extinction as a thought experiment I want to move in two directions. If we think of the experimental passage to extinction as thought—if we imagine thinking as a variation that takes place from function but essentially risks all function—then thinking of life as mindful requires thinking of mind as intrinsically destructive. Thought occurs when relations between terms are destructive, when there is a not knowing or misprision. Life occurs not with ongoing self-sameness but with an experimental variation that could be construed as risk, except that risk implies betting, strategy or even the venturing of some being, whereas it is only after variation that one might refer ex post facto to a mutation that is interpreted as good for some being or some environmental fit. And this is also why environment (like climate in its narrow meteorological sense) is not such a helpful term, given the notion of surrounding or environing—as though beings varied to fit a world. Extinction—as thought experiment—destroys such notions; there is just variation that is not variation of any being. So if extinction is thought experiment, it is because the process of extinction is a variation without a given end determined in advance; thinking possesses an annihilating power.

    A certain thought of delimited extinction, the extinction of humans, opens up a variability or intrusion of a different side of the image. This is a geological, post-Anthropocene or disembodied image, where there is some experimental grasping at a world that would not be the world for a body, nor the world as body. This mode of impersonal imaging differs from an avant-garde immanence of aesthetic matters or sensations, for such notions tend towards a god-like self-sufficiency. The avant-garde sought to think the liberation of the image from man, but in doing so it created a heightened subjectivism where 'we' might liberate ourselves from function and become pure perception or pure becoming. In the era of extinction we can go beyond a self-willing self-annihilation in which consciousness destroys itself to leave nothing but its own pure non-being; we can begin to imagine imaging for other inhuman worlds. That is to say: rather than thinking of the post-human, where we destroy all our own self-fixities and become pure process, we can look positively to the inhuman and other imaging or reading processes.

    What happens if one thinks of the vision of no one, of the human world without humans that is still there to be seen? What remains might still take the form of 'a' vision or referring eye—the scene of a human world as if viewed without any body. The positing of an Anthropocene era (or the idea that the human species will have marked the planet to such a degree that we will be discernible as a geological strata) deploys the idea of human imaging—the way we have already read an inhuman past in the earth's layers—but does this by imagining a world in which humans will be extinct. The Anthropocene thought experiment also alters the modality of geological reading, not just to refer to the past as it is for us, but also to our present as it will be without us. We imagine a viewing or reading in the absence of viewers or readers, and we do this through images in the present that extinguish the dominance of the present. The figure of a frozen Sydney Opera House, a London where Trafalgar Square is desolate, layers of rock distorted through a camera lens that is not the point of view of any body, an underwater Manhattan, or a sunlight so bright it would destroy the eye—all these experiments strive to image a world as image (as referential) but not referential for any body. These images cannot be sustained, and are unsustainable; they—like the thought of extinction itself—will always be for us, and are always co-opted by the narrative lures they fragment. They nevertheless indicate an era or epoch that has begun to sense, if not have a sense of, a world without bodies.
