Three Species Challenges: Toward a General Ecology of Cognitive Assemblages

N. Katherine Hayles

As many are beginning to realize, Planet Earth is in trouble.1 The STEM disciplines, supported by funding agencies, are organizing to identify and address a series of Grand Challenges, among them Global Climate, Hunger and Thirst, Pollution, Energy, and Health.2 The challenges are “global” not only in their reach and scope but also because their effects cannot be contained by geographical boundaries. If someone contracts bird flu in Beijing, chances are it will show up in Paris and New York; if Californians dump plastic into the ocean, it washes up on the shores of Japan and Easter Island. Moreover, the challenges involve cultural, sociopolitical, and ethical issues as well as scientific and technical problems, so the STEM disciplines alone will not be sufficient to solve them; input from the humanities and qualitative social sciences will be necessary as well. Effective action on these global issues requires large-scale consensus among different regions, nationalities, and ethnicities – yet the mechanisms to achieve such consensus are woefully lacking. Only a few come to mind, such as the Paris Climate Accords, the prohibitions against nuclear, chemical, and biological weapons, and constraints on altering the human genome. As humans, we desperately need a sense of solidarity and shared purpose that can help create these global mechanisms. Even to write such a sentence, however, risks bringing howls of protest from humanists and social scientists, because of the historical baggage of false universalisms that have been so effectively deconstructed over the past several decades. Hence the challenge this chapter addresses: is it possible to arrive at conceptual foundations for human solidarities that do not reinscribe oppressive ideologies and discriminatory practices? I will propose three such foundational concepts: species-in-common, species-in-biosymbiosis, and species-in-cybersymbiosis.

The Challenge of Species-in-Common

Immediately problems arise with the concept of species, because biologists have been unable to arrive at a rigorous definition of what constitutes one. All of the proposed criteria – morphology, reproductive success, genetics – have fallen short in some aspect or another. Consider, for example, the widely used criterion that individuals count as the same species if they can mate and have fertile offspring (this leaves out mating between donkeys and horses, whose mule offspring are sterile). The problem here can be illustrated with ring species. Consider squirrels, for example: individuals in adjacent geographical regions can mate and have fertile offspring (New Yorkers with Pennsylvanians, Pennsylvanians with Missourians, Missourians with Coloradans, Coloradans with Californians). Insert enough geographical distance, however – say, matching Californians with New Yorkers – and mating is not successful. Problems like these notwithstanding, most biologists nevertheless share a general understanding of species and find it indispensable for their work.

For the humanities, a more serious issue is speciesism, the ideology that humans are morally more important than other species and therefore entitled to exploit or dominate them. A founding document is the 1970 privately printed pamphlet Speciesism by Richard D. Ryder.3 Arguing against animal experimentation, it equated speciesism with racism: just as speciesism considers humans morally superior to other animals, so racism judges one ethnic group morally superior to others. Contemporary commentators on speciesism include Timothy Morton, who recently argued that speciesism is more fundamental than racism and that anyone who is a speciesist must perforce be a racist as well. Given that racism is one of the most virulent charges one humanist can level against another, such arguments virtually guarantee that if someone asks, “Who here is a speciesist?” there would be a thunderous absence of response.

Why, then, do most biologists continue to find species a necessary concept, even with all of its problems? The answer seems obvious: different species have distinctively different capabilities and potentialities. The human species notably differs from others in its ability to predict the future and form intentional plans to address anticipated problems. Which brings us back to the Grand Challenges: only humans could have conceived of these as global concepts, and only humans can devise technological, cultural, and ethical solutions to them. Here it may be useful to invoke a term used by Donna Haraway (2016): human response-ability. Humans respond through an empathic bond with other humans and nonhumans, and because of our abilities to conceptualize and anticipate the future, we bear a special responsibility for working toward ensuring the welfare of others and the planet in general. That we so far have failed miserably in meeting this challenge does not negate our potential to do so. Indeed, writers such as Haraway, Bruno Latour (2018), Brian Holmes (2018) and many others are now urging us to embrace our response-abilities. For this kind of Grand Challenge, a reconceptualized notion of species may be helpful – not one that implies speciesism with its imperialistic heritages of exploitation and racism, but rather what I call species-in-common, a notion of human solidarities and purposes that can work to mitigate the damages we have so far wreaked upon our common and only home, the Earth.

So reconceptualized, species-in-common can serve as a bulwark against racism rather than a facilitator of it. For virtually all of human history, people have believed that their own group is fully human, while those in the next valley are somehow less or other than human. Indeed, when genocide raises its horrible head, one of the first (predictable) rhetorical moves is to equate the despised others with rats, vermin, cockroaches rather than with the human species (another indication of the close historical tie between racism and speciesism). Richard Rorty put the matter into useful perspective:

Most people live in a world in which it would be just too risky – indeed, would often be insanely dangerous – to let one’s sense of moral community stretch beyond one’s family, clan, or tribe. Such people are morally offended by the suggestion that they should treat someone who is not kin as if he were a brother, or a nigger as if he were white, or a queer as if he were normal, or an infidel as if she were a believer. They are offended by the suggestion that they should treat people whom they do not think of as human as if they were human. (Rorty 1998, 125)

He cautions that saying these benighted others should simply become more rational will not solve the problem (indeed this way of thinking is part of the problem). The necessary prerequisites, he suggests, are security (“conditions of life sufficiently risk-free to make one’s difference from others inessential to one’s self-respect, one’s sense of worth”) and what he calls “sympathy,” here denoted as empathy (ibid., 128).

This pragmatic approach makes clear the relevance of the Grand Challenges, particularly Global Hunger and Thirst and Global Security, to the species-in-common concept. Solutions to each of these challenges reinforce and depend on the others. Species-in-common, with its focus on human solidarity, insists that every individual of the human species counts as human, but such a potently anti-racist vision can be effective only if everyday life for the world’s peoples includes enough of the necessities to ensure some measure of relief from danger, famine, drought and other catastrophic urgencies. In its reconceptualized form, species-in-common articulates a vision that has taken literally thousands of years of human history to achieve. Still in its infancy throughout most of the world, it calls for us to take response-ability for working toward the global conditions that will enable us to see the people in the next valley, living, feeling, cognizing people, as human like us.

Moreover, the concept of species-in-common offers new clarity for media theory as well. This aspect is implicit in the move that John Durham Peters (2016) makes when he upends media theory by proposing that elemental processes such as clouds and ocean currents function as media interfaces through which communications are processed. One may be tempted to object that this stretches the concept of “media” so far as to render it meaningless as an analytical category, since in this view almost everything can count as media. Here it may be useful to return to John Guillory’s exploration of the genesis of the media concept (2010), where he argues that almost from the beginning, media has implied both “mediation” and “communication through a distance.” If we accept these as the two essential components of media as a concept (which Peters suggests we do), then there is no reason why mediation has to involve technical apparatuses. The result has been an explosion of media theory in a number of new directions, including Melody Jue’s Wild Blue Media (2020), exploring the ocean as a medium complete with databases and communication circuits. An investigation of coral reefs, with their long histories of sedimentation and interlocking life forms, in this view could count as media archeology, which typically involves such archaic technical media as stereoscopes, magic lanterns, and cycloramas.

To evaluate what is gained (and lost) by this paradigm shift, we can compare Peters’s “elemental” scheme with Claude Shannon’s famous diagram of the communication situation (Shannon and Weaver, 1949; diagram available at http://people.seas.harvard.edu/~jones/cscie129/nu_lectures/lecture1/Shannon_Diagram_files/Shannon_Diagram.html). Recall that Shannon’s diagram begins with a sender, who composes a message that is then encoded into a signal and sent through a circuit to a decoding apparatus, which reconstitutes the message and conveys it to the receiver. Intervening in this process is noise, always threatening to degrade the signal and compromise the message’s legibility. Shannon made a point of emphasizing that the sender and receiver need not be humans; they could be technical apparatuses instead. In either case, however, implicit in the diagram is the idea that both the sender and receiver have sufficient cognitive capabilities to perform the actions required of them.

In Peters’s “elemental” model, the signal to be communicated over distance need not originate with a cognitive entity; the movements of clouds, which he argues communicate information to humans and nonhumans (birds, animals, and plants, for example), are material processes that do not require cognition to function. However, I would argue that there must be a cognizer at the end of the process for the two necessary components of mediation and communication over distance to function. Otherwise there are only material processes, distinguished as I argued in Unthought (2017) from cognition because there is no choice and no interpretation, only chemico-material events that are the resultant of the forces acting upon and through them. Communication, unlike material processes, always requires interpretation and choice – choice in determining which phenomena will be considered as media, for instance, and interpretation in the decoding and reception of the message.

This leads immediately to one of Peters’s finest insights: “media are species and habitat-specific and are defined by the beings they are for” (2016, 56). Of course! Only our anthropocentric biases can account for why the field called “media theory” remains almost exclusively about human communication, while communication within and between other species is relegated to the relatively marginalized field of biosemiotics. With a multitude of examples, Peters gives a vivid sense of what media mean, for example, to whales and dolphins, including seawater sonic waves and ocean currents. As he argues, once the species-specificity of media is explicitly recognized, many new kinds of inquiries are opened, different vocabularies become possible, and novel theoretical frameworks can be developed.

As an example, suppose that I am sitting on the couch with Monty (my dog), watching a rerun of the classic Lassie TV series. I see Lassie coming to the rescue, defeating bad men, helping the good. What does Monty see? He notices flashing lights and, when Lassie barks, momentarily looks at the screen, but he quickly loses interest because the images are contained in a box and have no smell, so he knows they are not real. Compare that with a trip to the dog park, where Monty comes across fresh urine. Smelling it, he notices the specificity of its chemicals and associates them with the handsome poodle that has just left the area. The urine smell-signature matches up with other smells coming from her anus, which he trots over to sniff. These are media for him because they communicate information and messages, although not for me. Similarly, dissolved chemicals in water function as media for redwood trees, salt-tinged air for seagulls, blood in water for sharks. What can count as media is therefore tied to the specificities of the sensory apparatus of the receiving cognizer, whether human, nonhuman, or computational, and is constituted within and through the environments of the cognizing species.

It is not enough, then, to insist on media-specific analysis, for which I have been arguing for some time to encourage literary critics in particular to attend to the specific forms in which texts are instantiated; we must also attend to the specificities of the species that engage in communicative acts. This is another reason why the concept of species remains an essential analytical tool, despite its problems and historical baggage. Without it, we could scarcely formulate the mediations and communications through distance that comprise what Hoffmeyer calls the semiosphere, the totality of signs and messages passing between and within living organisms.

Taking species into account has the additional advantage of restoring some of the specificity that opening media to elemental communications had dissolved. Even though virtually anything can be seen as originating a communication, such signals must be received and interpreted by cognizers to count as acts of communication, and the meanings extrapolated from the signals are specific to the sensory and neuronal capacities of the species that receives them. In addition to connecting species to their environments, these communicative acts help to construct and expand the deep interdependence of living organisms.4

The Challenge of Species-in-Biosymbiosis

Species entangle and interpenetrate. In 1967 when Lynn Margulis finally had her revolutionary paper published, she upended current biological dogma by arguing that mitochondria had descended from bacteria, and chloroplasts from cyanobacteria; these once freely living organisms became symbionts of eukaryotic cells in a process of endosymbiosis. Indeed, she went on to argue that endosymbiosis, rather than natural selection, is the primary driver of evolution: “Natural selection eliminates and maybe maintains, but it doesn’t create,” she argued in an interview (Teresi 2011). Humans have likewise acquired symbionts in our evolutionary history, for example, the gut bacteria essential for the proper digestion of food.

Recently Donna Haraway has extended this work to global scope through the concept of sympoiesis. “Sympoiesis,” she writes, “is a word proper to complex, dynamic, responsive, situated, historical systems. It is a word for worlding-with, in company. Sympoiesis enfolds autopoiesis and generatively unfurls and extends it” (2016, 58). “What happens,” she asks,

when the best biologists of the twenty-first century cannot do their job with bounded individuals plus contexts, when organisms plus environments, or genes plus whatever they need, no longer sustain the overflowing richness of biological knowledges, if they ever did? What happens when organisms plus environment can hardly be remembered for the same reasons that even Western-indebted people can no longer figure themselves as individuals and societies of individuals in human-only histories. (Haraway 2016, 31)

She continues, “poiesis is symchthonic, sympoietic, always partnered all the way down, with no starting and subsequently interacting ‘units’” (ibid., 33). In this view, organisms do not precede the relations they enter into with one another but reciprocally produce one another from lifeforms that for their part have already emerged from earlier involutions.

This is a compelling vision, which has been extended by Jason Moore into the socioeconomic realm (2015, 2014). He urges a shift of perspective in which we turn from considering “the environment as object to environment as action. All life makes environments; all environments make life” (Moore 2014, 12). His focus is specifically on capitalism, which he argues has radically exploited and reconfigured environments to extract profits, a process that continues into the 21st century with the transformation of human behaviors into dataflows that can be commoditized.

Nevertheless, there are counter-narratives to these strong arguments. Countering endosymbiosis, for example, is the continuing tendency toward speciation, in which species occupy new niches or otherwise become isolated from one another and, over time, develop into new species distinct from their ancestors. The existence of Homo sapiens is testimony to the power of speciation to effect tremendous changes, given an evolutionary timespan. And temporality here is key. Given enough time, glass flows, mountains erode, continents drift apart. But seen from the measure of a human lifespan, windows abide comfortably in their frames, Mount Everest remains more or less the same for generations of aspiring climbers, and African shores measure the same distances from South America. Similarly, organisms carry through time their inheritances of DNA, the great conservation mechanism, so the extent to which they can be shaped by their present environments is tempered by the inertia of all the past environments they have inhabited. Even if DNA itself is constantly in motion over the generations, for example through horizontal gene transfer among bacteria that facilitates resistance to antibiotics (Gyles and Boerlin 2014), these events also take place along temporal scales that both moderate and enable the potentiality of all living things to change.

In similar fashion, a counter-narrative to endosymbiosis is the fact that all life forms depend on membranes (skin, surfaces, cell walls) that at once distinguish them from and connect them to their milieux. That these surfaces change over time, admitting what was previously exterior into the interior, is an observation also subject to perspectives that can range from the (relatively) stable to the (relatively) porous, depending on the temporal scale of the chosen viewpoint. In the same way, whether the lifeform’s membrane connects or divides is a matter of whether one emphasizes its protective function or its activity as a surface across which multiple kinds of communications occur. Jesper Hoffmeyer makes this point in distinguishing between endosemiotics, “semiotic processes taking place inside the organism,” and exosemiosis, “biosemiotic processes going between organisms, both within and between species.” The distinction, he warns, “should not be taken to signify any privileged role in biosemiotics for either side of the interface, or boundary. In fact, semiotics is in principle always connected with some kind of inside–outside interaction” (2008, 6).

A surprising counter-narrative to endosymbiosis is chimerism, as Margrit Shildrick notes (2019). Whereas a mule is a hybrid, with DNA consistent throughout its body, the geep (a rare offspring of a sheep and goat) is a chimera, with discrete areas of sheep and goat DNA mixed together within its body. If endosymbiosis is like one bubble encased within a larger one, chimerism is two bubbles sitting side by side, both encased by another membrane. There have been documented cases of human chimeras, as in the case of Lydia Fairchild, whose DNA did not match that of her biological offspring. Shildrick explains:

the most likely explanation is that the woman was the result of a dizygotic twin conception that had disappeared from knowledge when her embryonic self had absorbed the other twin in utero. The resulting singleton carried both her own original DNA and that of the non-identical twin, thus creating a chimera. (2019, 14)

The extent to which (micro)chimerism is prevalent among microorganisms is still being investigated, but it is thought to play an important role in whether a transplanted organ is accepted or rejected. Granted that chimerism seems to be a much rarer phenomenon than endosymbiosis, it nevertheless illustrates the dazzling complexity of trying to form generalizations about biological processes.

These complexities notwithstanding, distinctions nevertheless persist and play useful analytical roles. In the view argued here, species-in-common and species-in-biosymbiosis co-constitute each other, each delineating the contours of the other, as in the famous yin/yang symbol. Like the white dot nestled in a field of black and a black dot in white, the symbol hints that a push too far in either direction will set an opposing tendency into action. The clear implication is that extremes risk distorting the world’s complexities. All flow and no structure is as distorting as its apparent opposite, all structure and no flow. Such oppositions can be found in many forms: only symbiogenesis and no speciation or only speciation and no symbiogenesis, only environment-organism becomings and no individuals or only individuals and no environment-organism becomings – each extreme diminishes our ability to understand the world’s complexities and construct useful frameworks to account for evidence.

In summary, we need both species-in-common and species-in-biosymbiosis (and one more too) to meet the challenges of our times and create new openings for speculative thought. Species interpenetrate each other’s domains both physically in processes such as endosymbiosis and semiotically as their life patterns entangle through exchanging signs and contextualized meanings with each other. Species-in-biosymbiosis, connoting both physical and semiotic interdependence, works together with species-in-common to create a nuanced sense of how “species” can indicate both a specific kind of entity and a web of entangled reciprocities between species. Together, the two concepts open possibilities for a revitalized and expanded media theory that builds on the essential insight that media are species-specific.

The Challenge of Species-in-Cybersymbiosis

Until now, I have been considering only living organisms (in their manifestation as species-in-common and species-in-biosymbiosis) as the interpreters who act upon mediated communications. Clearly, however, technical devices can also perform these functions. Humans are in the process of entering into deep symbiosis with computational media. Over the last 50 years, virtually every complex technological system has either incorporated, or been controlled by, computational media, including transportation networks, energy grids, food production chains, and so on. Short of complete environmental collapse or apocalypse, everything points to this trend continuing and intensifying in the new millennium.

Recent efforts by Erich Hörl to analyze the effects of this transformation under the rubric of “general ecology” provide a useful starting point for this discussion. Deeply versed in media theory as well as philosophical traditions, including phenomenology, Foucauldian genealogy, deconstruction, and Deleuze and Guattari’s rhizomatics, Hörl provides a comprehensive synthesis of what he calls the “absolute prioritization of mediation” (2013, 124), including ubiquitous computing, intelligent environments, data analytics, and the microtemporal regimes of computational systems. Foucault’s governmentality, he suggests, has now morphed into “environmentality,” a term that implies the control and governance of a population through mediated conduits of power (2018, 154). The new regime, however, has distinctive aspects only just coming into existence in the 1970s when Foucault first introduced the term. Chief among these differences is the ability of contemporary media to access human cognition under the temporal horizon of consciousness, an effect that Luciana Parisi and Steve Goodman have termed “mnemonic control” (2011). The effect has also prompted Mark Hansen to designate 21st-century media as “atmospheric” (2015), implying not only their ubiquitous presence through cell phones, social media sites, internet searches, and so on, but also their inescapability, their permeation into virtually every aspect of human sensations and experiences.

Reverberating through Hörl’s rhetoric is what I might call, appropriating a phrase from Brian Massumi, a shock to thought, especially to thought as it is understood in the phenomenological tradition. The phenomenological emphasis on consciousness, intentionality, and temporality is subverted by atmospheric media, which short-circuit reflective thought by addressing human cognition through Libet’s “missing half-second” (1985), the interval between the onset of sensory sensations and conscious awareness of them. Hansen makes this implication explicit. Pointing out that consciousness in the context of mediated microtemporal regimes takes on the “more humble role as modulator of a sensory presencing that takes place outside its purview” (2015, 24), Hansen goes on to observe that this development “sounds the final death knell for phenomenology’s animating dream, the dream of a consciousness capable of experiencing its own presencing in the very moment of that presencing – consciousness’s dream of living in the ‘now’ of its happening” (ibid., 25). I drew similar conclusions about the belated role of consciousness in Unthought (2017), working however through neuroscientific research rather than phenomenology.

For Hörl, and to some extent for Hansen, these developments make the reconceptualization of human subjectivity an urgent task, an implication that I have sought to address as well. Hörl’s approach is to posit a “general ecology” based solely on human–computational interactions, which leaves nonhuman and noncomputational entities out of the picture. There is a certain irony in calling this approach “environmental,” precisely because it pays no attention to what used to be called “the environment,” meaning the natural world of bacteria, plants, insects, birds, animals, and fungi that go about their business with only minimal and occasional interactions with humans or computers.

Indeed, he critiques this kind of environmentalism for its “reaction to the machine” and its invocation of “the undamaged and unscathed, the unspoiled, intact and immune, the whole and holy” (2013, 128). I think this kind of critique is justified, for several reasons. By idolizing “wilderness,” for example, this view of unspoiled nature makes it harder for us to see that weeds in a vacant urban lot are also nature; by focusing on the unspoiled, it deflects attention from issues of environmental justice that arise when we send our most polluted contaminants to communities too impoverished and powerless to object; and so on.5 Hörl proclaims that the “general ecology” at which he aims is “an unnatural, non-natural, and, one might say, subtractive ecology; an ecology that eliminates the immunopolitics of ecology” (ibid., 128). He continues, “It is an ecology of a natural-technical continuum, which the general environmentalization through technology and the techno-sciences and the concomitant explosion of agency, schematizes as the core of our current and, even more, of our future basic experience” (ibid., 128).

He defends his use of “ecology” by insisting that it is not merely a metaphor adopted from the environmental movement and therefore “bound to strictly biological, ethological, or life-scientific references” (2013, 126). On the contrary, he asserts,

it is more likely the case that the traditional concept or discourse of ecology causes a breakthrough and imparts a principle form to the conceptual constellation, which as a consequence in the course of techno-medial development, ascends to the level of a critical intuition and model for the description of the new fundamental position. (Hörl 2013, 126)

This principle, of course, is relationality. “Being is relation,” he states (ibid., 122).

For Hörl, crucial aspects of the new relation of humans to the computational media that surround and interpenetrate human complex systems are the reconfigured subjectivities that result when agency is distributed throughout the system. He writes that such technological systems are “currently driving the ecologization of sensation, with the additional consequence, however, of ecologizing cognition, thought, desire, and libido, as well as power and governmentality” (2013, 127). Citing Guattari, he refers to this as “non-subjective subjectivity that is distributed in a multiplicity of relations” (ibid., 129). This is a vision similar to my own framework of cognitive assemblages, with a crucially important difference: unlike them, cognitive assemblages emphatically do not exclude nonhuman lifeforms. In Unthought, I locate human cognition along a continuum with both the cognitive capabilities of nonhuman life and the artificial cognition of computational media. All of these entities perform cognitive acts consisting of interpreting information in contexts that connect it with meaning. Although they may perform as individuals, more frequently they function as cognitive assemblages, collectivities through which information, interpretations, and meanings circulate.

A farm, for example, would count as a cognitive assemblage. It likely involves computational components, for example in the tractor and other automated equipment that the farmer uses and in the computer he powers on to access current market prices for his crops. But it also includes all the lifeforms necessary for the farm to produce its harvests, from the bacteria in the soil to the plants in the fields to the livestock those plants and bacteria help to feed. From microbes to the farmer and his cell phone, all count as cognizers interpreting information and engaging in meaning-making practices specific to their capacities and milieux.

By appropriating the term “general ecology” for interactions that do not include 95 percent of the world’s biota, Hörl risks exacerbating trends already too prevalent in our anthropocentric cultures. His approach makes it more difficult to see how and why humans should take response-ability for the welfare of other species on the planet. However, it also has real strengths in its large scope of reference, synthesis of diverse material, and articulation of how computational media are impacting the very idea of human subjectivity. These are contributions that should be celebrated. Moreover, the very concept of a “general ecology” is a fine insight that, with due credit to Hörl, I would like to develop along lines not limited by his exclusion of the lifeworld of more-than-human cognizers and cognition.

A General Ecology of Cognitive Assemblages

To illustrate the usefulness of a cognitive assemblage framework, I will consider three topics that Hörl also analyzes, but now with an emphasis on an integrated approach to cognition: 1) the ability of computational media to address humans in the microtemporal regime, underneath the temporal horizon of consciousness; 2) the distributed agency that human enmeshment in cognitive assemblages implies; and 3) the prevalence of machine–machine communication over human–machine and human–human communication.

From a cognitive assemblage perspective, the idea that consciousness is not the whole of cognition is a fundamental premise. In Unthought (2017), I presented a timeline showing that nonconscious cognition starts significantly before conscious awareness, on the order of 200 milliseconds as opposed to 500 milliseconds for consciousness. As I noted there, nonconscious cognition is a level of neuronal processing inaccessible to conscious introspection and yet responsible for performing tasks essential for consciousness to operate, including constructing a coherent body image, processing information for patterns too complex and “noisy” for consciousness to process, and forwarding or suppressing the results to consciousness depending on the context.

Here is a major point from Unthought: nonconscious cognition can suggest that consciousness pay attention, but it cannot by itself initiate intentional action. Consciousness is always able to ignore such nudges if it considers other information to be more important at the moment. Once consciousness decides to pay attention and sends down reinforcement in the form of activation waves, “ignition of the global workspace” takes place, as Stanislas Dehaene puts it, and then consciousness can continue to meditate on a given thought indefinitely. Nevertheless, the suppression function of nonconscious cognition is also crucial; much of the work it does in creating a coherent body representation, for instance, never enters consciousness at all. Indeed consciousness, with its slower processing speed and limited information capacity, could not function without the pre-processing done by nonconscious cognition; it would simply be overwhelmed. It depends on both the suppression and representation of information from the anterior work of nonconscious cognition, and in this respect it is always belated.

From this perspective, it is no surprise that consciousness is temporally vulnerable to phenomena that enter under its half-second delay; indeed, this is the normal way that all sensory information is processed. What is different in computational microtemporal addresses is that the messages are not simply coming from the body’s sensory interfaces with the outside world (as well as from internal sensing mechanisms) but rather are targeted by corporate interests specifically to create a propensity toward certain kinds of information, for example the kind of branding information that Parisi and Goodman discuss in “Mnemonic Control” (2011). When this kind of targeted information reaches consciousness, for example when one is looking at a web page with side boxes displaying certain commodities, consciousness has already been predisposed to pay selective attention to some of them because of the information that had previously entered through nonconscious cognition, whether or not at that point it entered conscious awareness.

This kind of approach was already denounced in Vance Packard’s 1957 The Hidden Persuaders, where it went by the name of “subliminal” advertising, but now that the mediascape has enormously expanded and the technologies of micro-address have become much more sophisticated, it returns with a virulence to direct human attention to the products that corporations want us to purchase. It is difficult to know how to protect ourselves against this informational onslaught, given that it exploits how the human neuronal system works. The key, no doubt, lies in the fact that only consciousness can initiate intentional action such as clicking on a “Buy” icon. That allows time for reflection and resistance to come into play.

A similar issue is the distributed agency that human enmeshment in cognitive assemblages implies. No doubt consciousness has always tended to exaggerate its own importance within human cognition, through its internal monologue that typically dominates human awareness. As stand-up comedian Emo Philips has joked, “I used to think the brain was the most wonderful organ in the body, but then I asked myself, ‘Who’s telling me this?’” Meditation techniques, mindfulness exercises, and other body-awareness practices aim to stop the internal monologue and empty the mind of conscious thoughts so that another kind of awareness can enter. In this sense, distributed cognition is a hallmark of human being, central to the body’s functioning as a semiotic entity through which external and internal messages are constantly passing and interacting, with only a small part of them available to consciousness.

With the advent of computational media, however, distributed agency takes on a different sense as human–computational assemblages perform tasks that human cognition alone could never accomplish. Already problematic, in my view, is the notion of “free will,” because it over-simplifies the body’s complex interplays between sensory inputs, neuronal processes, and conscious awareness, tilting the matter entirely too far toward conscious intentionality, as if that were all that is involved. With cognitive assemblages, “free will” becomes hopelessly muddled, to the extent that it is rendered virtually useless as an analytical concept.

Consider, for example, an officer standing next to a drone pilot as they scrutinize images from the drone’s camera, and the pilot waits for the officer’s decision to strike or not. Is the officer making his decision based on “free will”? The images he relies on to distinguish a disguised enemy combatant from a woman on her way to the market have already been highly processed by the drone’s computerized visual system, where innumerable software decisions have been made about which information to accept and which to reject – decisions that have significant consequences and that the officer himself cannot access or evaluate. Moreover, the officer acts in a highly regulated environment with its own complex constraints on what kinds of decisions and actions he can take. Finally, the drone pilot and the drone itself are part of this cognitive assemblage, and they both have behaviors they can initiate independent of what the officer decides.

This situation is unusual in that it involves a life-or-death decision, but it is not at all unusual in the interplays between complex cognitive components, the information they can and cannot access, the constraints on their actions, and resultant actions that the assemblage as a whole will enact. This is what distributed agency looks and feels like in the computational era.

The cognitive assemblage framework, with its emphasis on the information, interpretations, and meanings circulating throughout the assemblage and the cognizers, human and nonhuman, that comprise it, provides a way to talk about distributed agency that does not succumb to the panic implicit in some of Hörl’s rhetoric, the shock of discovering that humans alone are not in control (if we ever were). At the same time, the cognitive assemblage approach creates many possible openings for analytical inquiry, from the construction of software algorithms to hardware configurations of computational platforms to the interfaces with humans, and in turn to the bureaucratic and institutional constraints under which they may operate.

Within cognitive assemblages as a whole, machine–machine communication is growing exponentially faster than human–human or human–machine communication. The software company Cisco estimated that by 2017, the number of things connected to the internet was 20 billion; by 2020, that number was estimated to rise to 31 billion, and by 2030, 500 billion (Cisco 2019). The human population of the planet, by contrast, is about 7.4 billion and, although it too is predicted to rise (sparking concerns about resource scarcity), it is nevertheless increasing much more slowly than the numbers of smart machines. As machines communicate more with each other than with us, the intervals and pervasiveness of machine autonomy increase – areas where machines make decisions that affect not only other machines but also humans enmeshed in cognitive assemblages with them.

Proponents of driverless cars, for example, argue that this is a good thing, because there will likely be, on the whole, fewer traffic accidents than when humans alone are in control. Nevertheless, the prospect of machine autonomy is concerning because there are many instances where humans, with their wider context-awareness, are better able to judge the consequences of actions. Cases in point are the two tragic airplane crashes of Lion Air flight 610 on October 29, 2018, and the Ethiopian Airlines flight that crashed in Ethiopia on March 10, 2019, both involving the Boeing 737 Max 8 aircraft. As details emerge, it appears that the cause of both crashes was a malfunction in the Maneuvering Characteristics Augmentation System (MCAS), which is connected to external angle-of-attack (AoA) sensors. If these sensors indicate that the plane is flying too slowly or at too steep an angle, the MCAS software will force the plane nose down to prevent stalling. This software system is obviously not robust, because it depends on only one set of sensors; if these are not correct, then the software will dictate actions that could (and apparently did) result in the plane crashing.
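The fragility at issue can be sketched in a few lines of code. This is a hypothetical illustration, not Boeing's actual flight software: it contrasts control logic that trusts a single angle-of-attack reading with logic that cross-checks redundant sensors, so that one faulty outlier is outvoted. The threshold value and function names are invented for the sketch.

```python
# Hypothetical sketch (not actual MCAS code): why depending on a single
# angle-of-attack (AoA) sensor is fragile, and how cross-checking
# redundant sensors can reject a faulty reading.

STALL_THRESHOLD_DEG = 15.0  # illustrative stall-warning angle, not a real spec


def pitch_command_single(aoa_deg: float) -> str:
    """Trusts one sensor: a stuck-high reading forces the nose down."""
    return "NOSE_DOWN" if aoa_deg > STALL_THRESHOLD_DEG else "HOLD"


def pitch_command_voting(aoa_readings: list) -> str:
    """Takes the median of redundant sensors, so a single outlier is outvoted."""
    ordered = sorted(aoa_readings)
    median = ordered[len(ordered) // 2]
    return "NOSE_DOWN" if median > STALL_THRESHOLD_DEG else "HOLD"


# One stuck sensor reports 22 degrees while the plane actually flies level.
readings = [2.1, 22.0, 1.9]
print(pitch_command_single(readings[1]))  # NOSE_DOWN (erroneous intervention)
print(pitch_command_voting(readings))     # HOLD (outlier rejected)
```

The point of the sketch is structural: a system that acts on one sensor has no way to distinguish a dangerous flight condition from a broken instrument, whereas even minimal redundancy restores that distinction.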

A day before the fatal October 29 Lion Air flight, a different set of pilots reported problems with the MCAS; they were fighting to lift the plane nose after the software turned it down, frantically going through their checklists to solve the problem. Fortunately, a pilot not on duty but riding in the cockpit jump seat (a “deadhead,” as such pilots are called) knew how to disable the motor that powers the MCAS system and conveyed this information to the pilots, who were able to disable the system and (presumably) save the flight from the same fate that occurred a day later on the very same aircraft with a different set of pilots (Bloomberg Report 2019).

As machine autonomy increases, issues arise about how to program algorithms to make ethical decisions when some degree of harm is inevitable. This is the premise for the “Moral Machine Experiment,” a computer game designed by an international team of researchers headed by Edmond Awad and recounted in “The Moral Machine Experiment” (Awad et al. 2018). The game used stick figures and asked users to make binary choices about how to distribute harm, choosing, for example, to spare more people rather than fewer, a pregnant woman rather than an elderly man, a person rather than a dog or cat, and so forth. The game elicited input from millions of users from 233 countries and regions, and sophisticated statistical techniques were used to analyze the results. Regional and ethnic differences revealed interesting systemic differences in preferences, but perhaps more surprising was the large amount of consensus, for example, responses sparing children over adults, humans over nonhumans, women over men.
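The aggregation step can be made concrete with a toy sketch. This is not the study's actual analysis pipeline, and the data here are invented; it simply shows how millions of forced binary spare/sacrifice choices can be tallied into the aggregate preference shares that Awad et al. (2018) report.

```python
# Hypothetical sketch (invented toy data, not the study's actual pipeline):
# tallying forced binary spare/sacrifice choices into preference shares.

def preference_share(responses, a, b):
    """Fraction of a-versus-b dilemmas in which respondents chose to spare a."""
    relevant = [spared for spared, sacrificed in responses
                if {spared, sacrificed} == {a, b}]
    return relevant.count(a) / len(relevant) if relevant else None


# Each tuple records (who was spared, who was sacrificed) in one dilemma.
responses = [
    ("child", "adult"), ("child", "adult"), ("adult", "child"),
    ("human", "pet"), ("human", "pet"),
]

print(preference_share(responses, "child", "adult"))  # 2 of 3 dilemmas spare the child
print(preference_share(responses, "human", "pet"))    # unanimous in this toy sample
```

Computed over millions of real responses and broken down by region, shares like these are what make both the cross-cultural divergences and the broad consensus visible.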

The game has been rightly criticized on several grounds, ranging from its forced binary selections to such problematic choices as the fit versus the overweight, or those crossing streets legally versus those crossing illegally. Nevertheless, looking at the results, I was struck by how many of the preferences coincided with choices in favor of species survival – young over old, pregnant woman over elderly man, humans over nonhumans. Humans in extreme circumstances have made such wrenching choices in reality, from the abandonment of old people in Eskimo cultures when food becomes scarce, to the sacrifice of sled dogs to conserve food for humans in the Shackleton expedition to Antarctica. Here is an example where species-in-cybersymbiosis (the computer game, designed by humans and played by them but executed by algorithms with the data collected and analyzed by more algorithms) becomes entangled with species-in-common on multiple levels, leading to the kinds of complexities that a general ecology of cognitive assemblages is designed to address.

Activating Species Concepts for Global Challenges

The major issue confronting those of us who want to address the Global Challenges is immediately and starkly evident: why, when it seems clear that we as a species could make significant progress on several of the challenges with technologies already readily available, is it so difficult in fact to do so? Peter Haff, in his analysis of large technological systems, gives us a partial answer (2014: pdf available from author at Researchgate.net). Arguing that large-scale technologies such as communication systems, transportation networks, and energy and resource extraction industries comprise a “technosphere,” Haff proposes to analyze “the dynamics of this newly emerged Earth system and the consequences for humans of being numbered among its parts” (2014, pdf 1). Rejecting the idea that humans control the technosphere, he instead takes the approach that “the workings of modern humanity are a product of a system that operates beyond our control and that imposes its own requirements on human behavior” (ibid., pdf 2). From this premise, he deduces rules for how the technosphere operates, based on coarse-grained scale distinctions between micro-scale (small compared to humans, which he calls Stratum I), meso-scale (human-sized, Stratum II), and macro-scale (very large compared to humans, Stratum III) (ibid., pdf 6). His rules include ones specifying that macro-objects cannot directly affect humans, and that humans cannot directly affect macro-objects.

If we consider the reality that the human species, through its activities, is accelerating climate changes and endangering the future of humans on this planet, along with risking extinction of myriad nonhuman species, it is easy to agree that humans are not in control of the technosphere: otherwise, why would we rush headlong to our own destruction? Yet Haff surely indulges in misplaced agency when he argues that the technosphere as such can carry out intentional actions (in this respect his approach resembles that of Kevin Kelly in What Technology Wants [2011]). For example, one of his six rules is that the technosphere acts to protect the humans that are some of its components. It is as though his approach has carried out a vector analysis of the global totality of competing/cooperating forces that comprise the technosphere, and then, by designating their resultant with a singular noun, has created a fantasy-object that has intentionality, agency, and rule-based behaviors. The key to his flawed approach can be found here: “We analyze the role of technology in the Anthropocene by examining basic physical principles that a complex dynamic system must satisfy if it is to persist, i.e. continue to function” (Haff 2014, pdf 3). This imparts to the technosphere an imperative similar to the biological imperative, “survive and reproduce.” But plenty of technical systems have failed to function and consequently have not persisted in human history – narrow-gauge railways in Great Britain, Jacquard looms throughout Europe, horse-and-buggy transportation networks in the US, samurai cultures in Japan. There is no will within a technical system to persist, only the desires of individual humans that it do so. Even these desires are not homogeneous throughout the system; for example, many humans abandoned making buggies (including wheels, axles, whips, traces, and so forth) when they saw there was no future in them.

These objections notwithstanding, some of his analysis yields insight into our present predicament. In comparing the technosphere to a navy ship, he argues that the captain (meso scale) can control the ship (macro scale) as well as the cams on the ship engine (micro scale) because connecting all three scale levels are mediating mechanisms that allow interactions between levels, from the mechanical linkages in the engine, to the computational media in the navigation and communication systems, to the hierarchical structures that regulate the captain’s relations with his junior officers and crewmen. Only because of these multi-scalar and highly articulated mechanisms, developed over centuries of navy tradition and engineering, can one person control the actions of the ship as a whole. He contrasts the navy ship with the technosphere:

the technosphere is not an engineered or designed system and during its emergence has not relied on nor required an overall leader, and in consequence lacks the infrastructure needed to support leadership. In this regard the technosphere resembles the biosphere – complex and leaderless. (2014, pdf 10)

Therein lies the relevance to Global Challenges: one way to understand our present situation is through the need to create more mediating mechanisms that connect different scale levels to each other, creating the possibility for humans, in all their diversity as well as species-specificities, to begin to affect the direction of events. We might think of the Paris Accords as a successful attempt to create a series of such mechanisms linking together individual governments, corporations, and human actors within them to achieve global-scale results.

Another experiment in creating multi-scale linkages is “The Moral Machine Experiment”; for all its faults, it succeeded in gathering data from millions of participants across the globe, analyzing the results and presenting them in easy-to-understand graphs illustrating overlaps and divergences between different regions and cultures. In effect, it provides linking mechanisms between the programming of self-driving cars (species-in-cybersymbiosis), millions of human users (species-in-common), and the complex bureaucratic and technical systems that will determine how driverless cars are actually developed (the technosphere). Such experiments suggest that it may be possible to develop other multi-scalar mechanisms to address the Grand Challenges facing us and nonhuman species (species-in-biosymbiosis).

A general ecology of cognitive assemblages provides a framework through which to map existing linkages and, equally important, points to possibilities for developing more linkages across multiple scale levels. For such an ecology to succeed, it is necessary to have some way to talk about the enmeshments of humans, nonhuman others, and our computational symbionts without obliterating important distinctions and yet also acknowledging commonalities. The relational thinking characteristic of a general ecology includes the necessity to discover and specify the mechanisms through which relations are established and also to help bring other mechanisms into existence where they are lacking. Grand Challenges need species challenges to achieve effective action, just as species challenges need Grand Challenges to direct attention and effort toward our most urgent problems. A general ecology of cognitive assemblages provides a framework within which both can happen.

Notes

  • 1  The UN Intergovernmental Panel on Climate Change (IPCC) warns that the planet will reach the crucial threshold of 1.5 degrees Celsius above preindustrial levels by as early as 2030, leading to extreme weather events, sea level rise, food shortages, and increased human misery.
  • 2  More information on the Grand Challenges can be found at the Gates Foundation, https://gcgh.grandchallenges.org/.
  • 3  For a reprint of the original by Richard Ryder, see https://www.veganzetta.org/wp -content/uploads/2013/02/Speciesism-Again-the-original-leaflet-Richard-Ryder.pdf.
  • 4  An example of this interdependence and the shared semiotic space it creates is recounted in Lorimer (2006), about the re-establishment of a reindeer herd in Scotland and the herders who tend them.
  • 5  More than two decades ago I participated in a residential research group at the University of California, Irvine that included such well-respected scholars as the eminent environmental historian William Cronon, historian Richard White, landscape architect Anne Whiston Spirn, and South American ethnographer Candace Slater, as well as several emerging scholars. Our semester-long discussions, which included such wry comments as Richard White remarking that we know something is a wilderness because there is a 300-page book of regulations governing what you can and cannot do, culminated in the anthology Uncommon Ground: Rethinking the Human Place in Nature (Cronon 1996). After the book was published, Gary Snyder sent a bitter email to Cronon saying that our book had set back the environmental cause 20 years. In hindsight, however, a stronger, more enlightened environmentalism has since emerged that acknowledges the interpenetration of natureculture and does not rely on romanticized tropes of solitude, awe, and grandeur for its effectiveness.

References

Awad, Edmond, Sohan Dsouza, Richard Kim, Jonathan Schulz, Joseph Henrich, Azim Shariff, Jean-François Bonnefon, and Iyad Rahwan. 2018. “The Moral Machine Experiment.” Nature 563 (24 October): 59–64.

Bloomberg Report. 2019. “Off-Duty Pilot Saved Lion Air 737 Max One Day Before Doomed Flight.” March 20, 2019. https://www.bloomberg.com/news/videos/2019-03-20/off-duty-pilot-saved-lion-air-737-max-one-day-before-doomed-flight-video.

Cisco. 2019. “The Internet of Things.” https://www.cisco.com/c/dam/en/us/products/collateral/se/internet-of-things/at-a-glance-c45-731471.pdf.

Cronon, William. 1996. Uncommon Ground: Rethinking the Human Place in Nature. New York: W.W. Norton.

Gabrys, Jennifer. 2016. Program Earth. Minneapolis: University of Minnesota Press.

Guillory, John. 2010. “Genesis of the Media Concept.” Critical Inquiry 36, no. 2 (Winter): 321–362.

Gyles, C. and P. Boerlin. 2014. “Horizontally Transferred Genetic Elements and Their Role in Pathogenic Bacterial Disease.” Veterinary Pathology 51, no. 2: 328–340.

Haff, Peter. 2014. “Humans and Technology in the Anthropocene: Six Rules.” The Anthropocene Review 1, no. 2: 126–136. https://doi.org/10.1177/2053019614530575.

Hansen, Mark. 2015. Feed-Forward: On the Future of Twenty-First-Century Media. Chicago: University of Chicago Press.

Haraway, Donna J. 2016. Staying with the Trouble: Making Kin in the Chthulucene. Durham, NC: Duke University Press.

Hayles, N. Katherine. 2017. Unthought: The Power of the Cognitive Nonconscious. Chicago: University of Chicago Press.

Hoffmeyer, Jesper. 2008. “Semiotic Scaffolding of Living Systems.” In Introduction to Biosemiotics: The New Biological Synthesis, edited by Marcello Barbieri, 149–166. Dordrecht: Springer. http://jhoffmeyer.dk/One/scientific-writings/semiotic-scaffolding.pdf.

Holmes, Brian. 2018. “Learning from Cascadia.” https://deptofbioregion.org/department-of-bioregion/2018/12/11/ecotopia-today-learning-from-cascadia.