Contingency and Plasticity in Everyday Technologies (Media Philosophy) 1538171570, 9781538171578


Contingency and Plasticity in Everyday Technologies

MEDIA PHILOSOPHY

Series Editors: M. Beatrice Fazi, Reader in Digital Humanities, University of Sussex; Eleni Ikoniadou, Reader in Digital Culture and Sonic Arts, Royal College of Art

The Media Philosophy series seeks to transform critical investigations about technology by inciting a turn towards accounting for its autonomy, agency, and for the new modalities of thought and speculation that it enables. The series showcases the 'transcontinental' work of established and emerging thinkers whose research engages with questions about the reshuffling of subjectivity, of perceptions and of relations vis-à-vis computation, automation and digitalisation, as 21st century conditions of experience. The books in this series understand media as a vehicle for ontological and epistemological transformation, and aim to move past their consistent characterisation as pure matter-of-fact actuality. For Media Philosophy, it is not simply a question of bringing philosophy to bear on what is usually considered an object of sociological or historical concern, but of looking at how developments in media technology pose profound challenges for the production of knowledge and conceptions of being, intelligence, information, temporality, reason, the body and aesthetics, among others. At the same time, media and philosophy are not viewed as reducible to each other's internal concerns and constraints, and thus it is never merely a matter of formulating a philosophy of the media. Rather, the series aims to create a space for the reciprocal contagion of ideas between the disciplines and new mutations from their transversals. With their affects and formalisms cutting across creative processes, ethico-aesthetic experimentations and biotechnological assemblages, the media events of our age provide different points of intervention for research. The series is dedicated to pushing the thinking of media through projects looking for uncertain, unknown and contingent rhythms that inflect and change the world.
—The Editors, M. Beatrice Fazi and Eleni Ikoniadou

Software Theory: A Cultural and Philosophical Study, by Federica Frabetti
Media after Kittler, edited by Eleni Ikoniadou and Scott Wilson
Chronopoetics: The Temporal Being and Operativity of Technological Media, by Wolfgang Ernst, translated by Anthony Enns
The Changing Face of Alterity: Communication, Technology and Other Subjects, edited by David J. Gunkel, Ciro Marcondes Filho and Dieter Mersch
Technotopia: A Media Genealogy of Net Cultures, by Clemens Apprich, translated by Aileen Derieg
Contingent Computation: Abstraction, Experience, and Indeterminacy in Computational Aesthetics, by M. Beatrice Fazi
Recursivity and Contingency, by Yuk Hui
Sound Pressure: How Speaker Systems Influence, Manipulate and Torture, by Toby Heys
Contingency and Plasticity in Everyday Technologies, edited by Natasha Lushetich, Iain Campbell and Dominic Smith

Contingency and Plasticity in Everyday Technologies

Edited by Natasha Lushetich, Iain Campbell and Dominic Smith

ROWMAN & LITTLEFIELD

Lanham • Boulder • New York • London

Published by Rowman & Littlefield
An imprint of The Rowman & Littlefield Publishing Group, Inc.
4501 Forbes Boulevard, Suite 200, Lanham, Maryland 20706
www.rowman.com
86-90 Paul Street, London EC2A 4NE

Copyright © 2023 by The Rowman & Littlefield Publishing Group, Inc.

All rights reserved. No part of this book may be reproduced in any form or by any electronic or mechanical means, including information storage and retrieval systems, without written permission from the publisher, except by a reviewer who may quote passages in a review.

British Library Cataloguing in Publication Information Available
Library of Congress Cataloging-in-Publication Data Available

ISBN 978-1-5381-7157-8 (cloth); ISBN 978-1-5381-7158-5 (paperback); ISBN 978-1-5381-7159-2 (ebook)

The paper used in this publication meets the minimum requirements of American National Standard for Information Sciences—Permanence of Paper for Printed Library Materials, ANSI/NISO Z39.48-1992.

Contents

List of Figures  ix

Prologue: Normalising Catastrophe or Revealing Mysterious Sur-Chaotic Micro-Worlds? (Natasha Lushetich, Iain Campbell, and Dominic Smith)  xi

Acknowledgments  xxxi

PART I: SOCIAL-DIGITAL TECHNOLOGIES  1
Chapter 1: Information and Alterity: From Probability to Plasticity (Ashley Woodward)  3
Chapter 2: Transcendental Instrumentality and Incomputable Thinking (Luciana Parisi)  19
Chapter 3: Digital Ontology and Contingency (Aden Evens)  35
Chapter 4: Blockchain Owns You: From Cypherpunk to Self-Sovereign Identity (Alesha Serada)  53
Chapter 5: The Double Spiral of Chaos and Automation (Franco 'Bifo' Berardi)  71

PART II: SPATIAL, TEMPORAL, AURAL, AND VISUAL TECHNOLOGIES  87
Chapter 6: Allagmatics of Architecture: From Generic Structures to Genetic Operations (and Back) (Andrej Radman)  89
Chapter 7: Computation and Material Transformations: Dematerialisation, Rematerialisation, and Immaterialisation in Time-Based Media (Oswaldo Emiddio Vasquez Hadjilyra)  107
Chapter 8: How the Performer Came to Be Prepared: Three Moments in Music's Encounter with Everyday Technologies (Iain Campbell)  125
Chapter 9: The Given and the Made: Thinking Transversal Plasticity with Duchamp, Brecht, and Troika's Artistic Technologies (Natasha Lushetich)  143
Chapter 10: Ananke's Sway: Architectures of Synaptic Passages (Stavros Kousoulas)  163

PART III: EPISTEMIC TECHNOLOGIES  181
Chapter 11: Outline to an Architectonics of Thermodynamics: Life's Entropic Indeterminacy (Joel White)  183
Chapter 12: Irreversibility and Uncertainty: Revisiting Prigogine in the Digital Age (Peeter Müürsepp)  201
Chapter 13: 'At the Crossroads . . .': Essence and Accidents in Catherine Malabou's Philosophy of Plasticity (Stephen Dougherty)  219
Chapter 14: Ugly David and the Magnetism of Everyday Technologies: On Hume, Habit, and Hindsight (Dominic Smith)  235
Chapter 15: Adjacent Possibles: Indeterminacy and Ontogenesis (Sha Xin Wei)  251

Epilogue: Schrödinger's Spider in the African Bush: Coping with Indeterminacy in the Framing of Questions to Mambila Spider Divination (David Zeitlyn)  271

Index  289
About the Authors  299

List of Figures

6.1  Axes of reference and consistency based on Guattari's Schizoanalytic Cartographies.
7.1 and 7.2  Stills from Harun Farocki, Inextinguishable Fire (1969).
15.1  Turing machine.
15.2  State diagram (algorithm) for a Turing machine.
15.3  Tangent spaces (e.g., planes) over a manifold (e.g., sphere).
15.4  'Lifting.'
E.1  Stylus tablet 836, one of the most complete examples excavated at Vindolanda.
E.2  Palm tree cards: positive and negative (approximately actual size), 2022.
E.3  Divination setup, 2022.
E.4  Basic result patterns, 2022.
E.5  Two unusual results in which the cards are propped up on each other, 2022.

Prologue
Normalising Catastrophe or Revealing Mysterious Sur-Chaotic Micro-Worlds?

Natasha Lushetich, Iain Campbell, and Dominic Smith

Over all things stand the heaven accident, the heaven innocence, the heaven chance, the heaven prankishness.
—Friedrich Nietzsche1

I do not understand why, when I ask for grilled lobster in a restaurant, I'm never served a cooked telephone.
—Salvador Dalí2

Can we say that technology—understood as a host of social, epistemic, material, and immaterial transformation techniques, tools, and methods—is contingent and indeterminate? If so, how does this manifest? As operational instability? As unpredictability or unknowability? As creativity and the production of novel otherness? In 2006, the US Congress established an expert cross-disciplinary commission consisting of anthropologists, molecular biologists, neuroscientists, psychologists, linguists, classical scholars, and artists. The commission's purpose was to develop a language of warning against the threats posed by nuclear waste in ten thousand years' time. The problem to be solved was not only what symbols to use to communicate with thirty-first-century humans (who are likely to be more different from us than the prewriting humans of 8000 BC were), but how to understand the evolution of potential catastrophes given the accelerated proliferation of new technologies and the rapidly changing environmental conditions, due, in part, to the proliferation of new technologies. Catastrophe is by definition beyond human comprehension. It is also beyond the technologies developed to control accidents (usually perceived as locally manageable). Since the shift in planetary interdependence induced by globalisation and the rise of the risk society where 'the unknown and unintended consequences' of complex global, technologically mediated interactions are 'the dominant force in history and in society,'3 crisis and the mapping of catastrophe have become a necessary means of understanding the future. The paradoxical twist is that the conceptualisation, visualisation, and management of crisis and catastrophe are themselves contingent on technology. For example, a system known as Total Information Awareness, built by the US military in the wake of 9/11 as a counter-terrorism weapon, has been adapted to programmes such as the Risk Assessment and Horizon Scanning System (RAHS), which is widely used in Asia. The problem with RAHS, however, as with many other so-called early warning systems, is that it has high false alarm rates and creates almost as many accidents as it manages to prevent.4 Crisis is not—or is no longer—a historical event, a state of locally observable social breakdown. It is not a condition to be observed—as, say, a failure of operationality or loss of meaning—but rather, as Janet Roitman notes, a 'transcendental placeholder' that signifies techno-social contingency itself.5 But how should we understand this complex phenomenon? Should it be seen as the inevitable result of difference, observation, and/or acceleration? For Niklas Luhmann, observation is an indication in a field of difference.
When the observing agent (human or artificial) perceives ‘something,’ the ‘something’ it perceives is the differential relation to everything else.6 This is very similar to the Derridean play of semiosis7; both structure the world’s undecidables. Luhmann speaks of the ‘marked’ and ‘unmarked’ side of the field of observation, where ‘unmarked’ defines the blind spot of observation; he further suggests that everything becomes contingent whenever what is observed depends on who or what is observing.8 Second-order observation— the observation of (human or artificial) agents doing the observing—is thus doubly contingent.9 This means that the intelligibility of accidents, crises, and/or catastrophes is contingent on the parameters, techniques, and technologies of observation, be they material or immaterial: there is no such thing as an event that first occurs, and is then observed and subsequently analysed; rather, events are co-produced in and by the observation techniques, technologies, and agents.10


The early Paul Virilio saw technology as inseparable from speed and acceleration: 'there is no industrial revolution, only a dromocratic revolution . . . no strategy only dromology . . . "dromological progress" is that which "ruins progress".'11 In an Aristotelian vein, Virilio associates accident with the revelation of substance,12 which may not manifest fully without the accident. In other words, the accident serves the purpose of knowledge. However, in his 2002 (post 9/11) exhibition Ce qui arrive at the Fondation Cartier in Paris—ce qui arrive being the French translation of the Latin accidens (that which happens)—Virilio is no longer concerned with the accident as that which reveals substance. He is concerned, first, with the accident of knowledge, and second, with the multiple arborisations of these accidents of knowledge: '[t]he shipwreck is the "futurist" invention of the ship, and the air crash the invention of the supersonic airliner, just as the Chernobyl meltdown is the invention of the nuclear power station.'13 Importantly, Virilio considers computer science as an 'accident of knowledge due to the very nature of its indisputable advances but also, by the same token, due to the nature of the incommensurable damage it does.'14 But Virilio is not talking about programming errors or oversights, which is the impression we get from Norbert Wiener's writing on the subject:

A goal-seeking mechanism will not necessarily seek our goals unless we design it for that purpose, and in that designing we must foresee all steps of the process for which it is designed. . . . The penalties for errors of foresight, great as they are now, will be enormously increased as automatization comes into its full use.15

Though Wiener acknowledges the 'penalties' for technological complexity, he suggests that careful and knowledgeable programming can overcome contingency. For Virilio, by contrast, the accident is not a dysfunction of one or more parts of the means-to-goals trajectory. It is far more similar to structural instability. Indeed, in The Accident of Art Virilio calls the accident 'a profane miracle.'16 A miracle is not a revelation of an object's substance, or an aspect of human knowledge. Rather, a miracle reveals the structuring principles of reality. Like Bruno Latour, Virilio suggests that the operation of the world reveals itself in moments of rupture.17 We could understand this in two ways: accident as methodology, and accident as the manifestation of technological normalisation. The former was first proposed by Harvey Molotch, who, in his 1970 study of the accidental oil spill off the Californian coast—which no amount of lobbying from the wealthy and influential Santa Barbara community could stop or even mitigate—argues that what the oil spill revealed was not only the underlying power dynamic but also a mode of governance.18 In the post-1990s period, this mode of governance, which deploys contingency, instability, and disorder as facilitators of governance itself, became, along with disaster capitalism,19 a sine qua non of neoliberal governance. The second way to understand the accident is as technological normalisation, which refers to the normalisation of incompatibilities and consists of three elements: institutional, contextual, and systemic.20 Institutions sometimes develop practices that differ from written regulations; technologies are situated in specific, rather than generic, contexts, which are often at odds with the contexts they were designed for (for example, most nuclear reactors in Japan were designed in the United States, thus not with earthquakes and tsunamis in mind21). These and similar slippages are further exacerbated in large-scale systems whose various (human and technological) sub-systems and multi-national regulatory structures make uniform operation impossible.22 Already in 1984, Charles Perrow argued that post-industrial catastrophes were to be understood as routine outcomes of normalised—yet utterly unmanageable—technological arrangements; his conclusion was that complex techno-social systems should be loosely, rather than tightly, coupled, and that systems where the consequences of accidents were on a catastrophic scale, such as nuclear power, should be abandoned altogether.23 All these narratives view the accident as the result of human-technological interaction and/or the non-compensable irruption of external difference. While it is certainly true that post-industrial accidents reveal what Martha Nussbaum has called 'the fragility of existence,' in addition to revealing the inapplicability of pre-industrial technical-epistemic principles, namely universality, commensurability, precision, and explainability,24 understanding accidents (and contingency more generally) as the obverse of these principles is problematic.
Equally problematic is the notion of the accident as an external occurrence, regardless of its indisputable connection to knowledge. For Michel Foucault, identify[ing] the accidents, the minute deviations—or conversely, the complete reversals—the errors, the false appraisals, and the faulty calculations that gave birth to those things which continue to exist or have value for us is to discover that truth or being lies not at the root of what we know and what we are but the exteriority of accidents.25

While this is certainly true, and acknowledged in other fields, such as literature, where the experimental accident of form (in, say, chance operations), considered nonsensical in one generation, becomes a new literary genre in the next,26 the technological accident—understood in its literal meaning, as an event—cannot be seen as external to technology.


THE INTERNALITY OF CONTINGENCY

The work of Bernard Stiegler and Cornelia Vismann, as well as that of more contemporary authors such as Yuk Hui and Beatrice Fazi, shows contingency to be internal to the operation of technology, as method and material support. As is well known, for Stiegler, the originary relation between the human and the technical is both contingent and temporal.27 In the first volume of Technics and Time, Stiegler relates the story of Prometheus's lesser-known brother, Epimetheus, whom Zeus had put in charge of distributing traits and qualities to animals and humans. However, Epimetheus (whose name means afterthought and is related to the past) mistakenly used up all available traits—hooves, claws, and fangs—on animals and forgot to keep any in reserve for humans. In order to remedy this error, his brother Prometheus (whose name means forethought and is related to the future) stole fire from the gods and gave it to humans. Famously, this gesture incurred the wrath of Zeus, who chained Prometheus to a rock in the Caucasus where a vulture pecked his liver for the rest of his eternal life. Noting in passing that Prometheus had thereby effectively become the clock of the Titans (the measure of time's passing), Stiegler interprets this allegory as suggesting that the origin of technology resides in oversight and forgetting.28 More important than Stiegler's intriguing mythological account, however, is the fact that fire is not a claw or a fang, that is, not a physically incorporable technology or organology. Fire is an element.
For Gaston Bachelard, fire is simultaneously subject, object, and a ‘hormone of the imagination.’29 It is both actual and virtual; its warmth lies at the bottom of human notions of comfort since ‘the origin of every animism’ is ‘calorism.’30 Both intimate and universal, fire is entwined with potentiality: it ‘hid[es] in the entrails of substance, latent and contained.’31 As an element, fire is also imbricated in the human body in a virtual manner, through the flesh, which, as Maurice Merleau-Ponty has argued, is the ‘fifth element.’32 Like all other elements, fire changes micro-temporally and this change is a spatial one, since the intensity of fire changes its reach. Although it’s not our intention to theorise the origin of technology here, it’s important to note that ‘fire as first technology’ is useful for understanding the micro-spatio-temporal operation of all technologies, and the extent to which this operation is contingent and/or plastic. Vismann doesn’t make an explicit connection with fire; however, her conceptualisation of the gadget or tool’s agency is profoundly spatio-temporal. First, a tool’s features are not independent from their conditions of production, material properties, and the spatial and temporal circumstances of their coming into being. This is why, in Vismann’s view, we need to differentiate between ‘the agency of persons, who de jure act autonomously,’ and


the 'agency of objects and gadgets, which de facto determine the course of action.'33 The question of tool agency is here not one of 'feasibility, success, chances and risks of certain innovations and inventions but one of the auto-praxis [Eigenpraxis] of things, objects and tools.'34 In German, Eigenpraxis has the connotation of 'particular' or 'own' and refers to the agent-thing's iterative (i.e., non-programmed) steering of emergent processes in new and, for humans, often unfathomable directions. The fact that all tools and gadgets engage in Eigenpraxis35 means that technological relations are dynamic actualities-virtualities. This is similar to Gilbert Simondon's notion of individuation; for Simondon, all techniques and technologies are formed through evolutionary layering and the modification of functionalities, much as in the case of living organisms.36 Like living organisms, mechanical and automated objects have an internal dynamic. Space-time, likewise, is processual and mutational. Individuation unfolds in the (organism or machine's) field of potentiality which affords the mutational qualities of an individual organic or machinic existent. Potentiality—the realm of the virtual—is a futurity that is enveloped in the present. In a recent work, Hui places contingency in dialogue with recursivity.37 Recursivity is the system's transformational interaction with the environment, which is often, or at least to a degree, incorporated into the system. Cutting across the living organism-machine dichotomy through a historical analysis of the concept of the organic via Immanuel Kant, Friedrich Wilhelm Joseph von Schelling, and Georg Wilhelm Friedrich Hegel, where contingency defines the impossibility of the knowledge of final ends and is supplanted by an adaptive teleology of organisms predicated on contingency, Hui suggests that machinic systems, like organic ones, act recursively on themselves.
The reason why recursion cannot be programmed is that systems are exposed to contingencies they can neither mitigate nor incorporate. Instead, contingent events interfere with the system’s recursive loops, which is what triggers new adaptive tactics. Hui’s connection between the Hegelian sublation, cybernetic feedback, and Kurt Gödel’s recursive algorithms suggests an onto-epistemology similar to Gregory Bateson’s organic-machinic, human, and other-thanhuman epistemology of eco-systems.38 Fazi’s book Contingent Computation by contrast focuses on abstraction as immanently constitutive of computational processes, through an aesthetic of the indeterminate, seen as a real function of computation. Drawing on Gilles Deleuze, Fazi conceptualises the virtual as a continuum that can never be completely actualised.39 However, she argues that this virtual potentiality does not pertain to the discretising operations of computation. Turning to the philosophy of A. N. Whitehead, Fazi theorises a form of potentiality that is not predicated on the continuum of the virtual but is instead specific to the abstractive, quantitative character of


digital computation. Computers are determined by their formal structure and their deductive system. However, they are also indeterminate. Indeterminacy is both a process and a quality that arises from the system's final openness; for example, the Turing machine operates through finite processes of computation, but this operation is nevertheless open as the truth claim of a statement cannot be determined before the actual operation. In other words, the infinite (potentiality) acts on the finite process of computation (actuality). These works of Hui and Fazi echo the long history of indeterminacy in the arts, where indeterminate procedures have, since the beginning of the twentieth century, been variously a response to or an elaboration of scientific and philosophical notions of indeterminacy, or have alternatively explored the actual-virtual indeterminacy of specific materials and processes. Many Dadaist and Surrealist practices, such as those of Tristan Tzara, Francis Picabia, and Marcel Duchamp, were a direct response to the prominent mathematical and quantum-mechanical theories of the time, those of Henri Poincaré and Niels Bohr. John Cage, Luigi Nono, and Iannis Xenakis's compositional strategies likewise engaged early probability theory, stochastic procedures, and Ilya Prigogine's theory of non-linear dynamics, much as the work of contemporary artists engages systemic and algorithmic indeterminacy. For example, Pierre Huyghe's UUmwelt (2018) and After ALife Ahead (2017)40 stage organic-machinic interactions between complex systems, while Tom White's Perception Engines (2018) engages in epistemic experiments with indeterminate neural network learning patterns.41 What these and similar works address is radical contingency, which was recently (re)formulated by Quentin Meillassoux as the amalgam of two notions. First, the fact that 'any entity, thing, or event . . .
could be, or could have been, other than it is,’42 and second, that ‘facticity’ is not ‘the index of thought’s incapacity to discover the ultimate reason of things’ but instead ‘the index of thought’s capacity to discover the absolute irreason of all things.’43 Meillassoux calls irreason ‘surchaos’ in a gesture similar to the Surrealists’44 and the theory of non-linear dynamics, where chaos doesn’t refer to disorder but to the (unpredictable) emergence of order from disorder and disorder from order.45 That said, everyday technologies—those we use on a daily basis—are often experienced as far from (auto)poietically sur-chaotic. Rather, they are experienced as over-determined. Theoretically, over-determination can be understood through Deleuze’s 1995 essay ‘Postscript on Control Societies.’46 For Deleuze, and for theorists following him such as Maurizio Lazzarato, ‘hypermodern’ techniques of governance, like those enacted through communication technologies and finance, reciprocally enable ‘neoarchaic’ mechanisms of subjection47—racism and class division—while enacting a ‘micropolitics of insecurity.’48 In everyday experience, over-determination is felt in automatic account termination, automatic health insurance claim


refusals, or criminal recidivism prediction algorithms which equate poverty and low education levels with criminality. These produce conclusions like: if you're underprivileged, uneducated, and your family members have been to prison, you're bound to be a criminal, and a criminal for life.49 Such short-cutting practices, profoundly problematic on an ethical level, and welding acceleration to dataism on the ontological and epistemological levels, are accompanied by all too frequent examples of unnecessary complexity in matters that could hardly be any simpler; for example, changing the address associated with your bank account, which results in hours of time-wasting conversations with human and machinic agents in an effort to fathom why the programme 'can't take' an address with two numbers. As noted in much recent scholarship, the widespread use of predatory algorithmic procedures that automate difference control and anomaly detection perpetuates racism, sexism, and classism.50 Sequence- and logic-locked procedures translate directly into pre-emption or 'future from structure,' reducing ethical questions to technical management, and continuing the mantra of industrial rationality: progress, increased productivity, and efficiency, in a far worse—because automated—way. 'Future from structure' manipulates possibility into probability, and probability into necessity, reducing relationships of relevance to those of causation. As Franco Berardi has extensively argued, automation is 'the submission of the cognitive activity to logical and technological chains,' a 'form of engendered determinism,' and, as such, the 'fundamental act of power.'51 While it's important to understand that the power of automation is, at the same time, the automation of power, it's equally important to acknowledge that the mid-twentieth-century computational procedures—predecessors of what we understand computation to be today—did not develop on their own.
A key term in neuroscience, plasticity played an important role in the mid-twentieth-century co-development of computers and neurosciences. Discussing the indeterminate element present in Turing’s thinking machine— which developed amid theories such as Gödel’s undecidability theory—David Bates and Nima Bassiri refer to Donald Hebb’s famous phrase ‘neurons that fire together wire together’ to establish a connection between plasticity and deviance from set routes and routines.52 Pointing to the fact that contingency exists in human and computer synapses alike, they suggest that at the time when the first computer was being conceptualised, the digital was not yet fully aligned with automaticity.53 The co-development of computer software, hardware, and infoware with experimental neuroscience meant that the plastic brain offered an insight into unpredictable leaps in human behaviour, related to hidden capacities that go beyond habit or norm. In machines, this meant unpredictable leaps in functional mechanisms, which were often treated as errors, but which were not errors, merely different


developments. Neuropsychological discourses focusing on the disorders of the injured brain and its ability to recover functioning after injury showed the brain to be simultaneously a ‘site of openness’ and a space of artificial, repetitive ‘mechanisms.’54 Quoting William James, Bates and Bassiri conclude that ‘[p]lasticity means the possession of a structure weak enough to yield to an influence, but strong enough not to yield all at once.’55 Errance—wandering or movement away from the established or programmed path or course—is, in other words, inherent in and to computational procedures. Or, as Simondon put it: ‘the true perfection of machines does not correspond to an increase in automation, but on the contrary to the fact that the functioning of a machine harbours a certain margin of indetermination.’56 Seven decades on from Simondon we know, as Hui has argued in a development of Simondonian concepts, and N. Katherine Hayles has noted in relation to Wiener’s cybernetic paradigm of circular feedback, that, in machinic and algorithmic processes and operations, feedback is recursive and spiral, rather than circular.57 Feedback does not reinforce self-same operations but creates an internal dynamic which opens onto the novel and the ‘undecidable.’58 Furthermore, contemporary machine learning uses back propagation to train multi-layer architectures, which makes feedback much less relevant than aggregation, de-aggregation, and re-aggregation, all of which create internal change and cue emergent behaviours. More precisely, there are at least three reasons why computer and machinic processes could be considered contingent, plastic, and indeterminate: the essential incomputability of all computing systems, their constant production of new temporalities or temporal swarming, and neural network contagions leading to unpredictable output. 
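The Hebbian principle that Bates and Bassiri invoke can be made concrete in a few lines of code. The toy simulation below is our illustration, not code from any of the works discussed, and the learning-rate and decay constants are arbitrary assumptions; it shows a connection weight that, in James's phrase, yields to an influence without yielding all at once, strengthening gradually under repeated co-activation:

```python
import random

def hebbian_update(w, pre, post, lr=0.1, decay=0.01):
    """Hebb's rule in miniature: the weight grows when the pre- and
    post-synaptic units are active together; a small decay term keeps
    it from growing without bound."""
    return w + lr * pre * post - decay * w

# Two units that tend, but are not guaranteed, to fire together.
w = 0.0
random.seed(0)
for _ in range(1000):
    pre = 1 if random.random() < 0.8 else 0
    post = pre if random.random() < 0.9 else 1 - pre  # mostly correlated
    w = hebbian_update(w, pre, post)

print(round(w, 3))  # the weight has settled well above zero
```

Lowering `lr` makes the 'structure' stronger (slower to yield to any single coincidence of firing); raising it makes it weaker. The point of the sketch is only that the trajectory of `w` is shaped by the contingent history of co-activations, not fixed in advance by the rule.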
As this volume will show, moreover, these themes are far from remote: they are integral to the everyday technologies that populate the lifeworlds of the twenty-first century.

INCOMPUTABILITY, TEMPORAL SWARMING, NETWORK CONTAGIONS

Combining, on the one hand, Turing's question of the limit of computability and, on the other, Claude Shannon's information theory, where information doesn't apply to the individual message but to a signal crafted from noise,59 Gregory Chaitin suggests that computation consists of unknowable probabilities.60 Data entropy (the fact that the output always exceeds the input) leads to algorithmic randomness resembling an infinite series of coin tosses where the outcome of each toss is unrelated to the previous one. Chaitin's name for this process is Omega—an infinitely long number whose digits have no repeatable pattern whatsoever. Related to the halting problem—the


question of whether a programme will halt after a thousand, million, or billion years—Omega is ‘the concentrated distillation of all conceivable halting problems’61—a number which can be known of, but not known through human reason. As a sequentially ordered computational processing of zeros and ones, it shows that there is an intrinsic dynamic at work in every computation process negating the logic- and sequence-locked view of computation where randomness is seen as an error. In other words, incomputability is not merely the impossibility of computability or prediction. It’s the very real possibility of an indeterminate computational coming-into-being, which does not operate in time but is temporal in nature. As is well known, there is a temporal gap between human and technical perception.62 The most frequently used examples come from high frequency trading where, as Donald MacKenzie has argued, behaviours like ‘queuing’ (where existing bids are altered on the basis of temporal advantage, according to the first-come-first-served rule), and ‘spoofing,’ which refers to the placement and cancellation of orders, based on the millisecond temporal advantage and price drops caused by cancellations,63 are produced. High frequency trading is, of course, a specific domain of human-machinic endeavour. However, the reason why these behaviours are relevant to a discussion of technological contingency is that they show, in qualitative terms, that informational-algorithmic ecologies do not consist of pre-formed, immutable interfaces, but of complex ‘swarm behaviours.’64 These swarm behaviours are predicated on temporal processes that brim under the surface of all machinic operations, for example, accelerated pattern recognition, or syntheses of diverse inputs. 
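Chaitin's point that Omega's digits exhibit 'no repeatable pattern whatsoever' can be given a rough, hands-on illustration. The following sketch is an editorial aside for orientation, not part of the source argument; it uses Python's standard zlib and random modules, with compression standing in, very loosely, for the length of the shortest description of a sequence:

```python
import random
import zlib

# A highly regular bit sequence versus a pseudo-random one of equal length.
patterned = b"01" * 4000
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(8000))

# Regularity compresses drastically; (pseudo-)randomness resists compression,
# which is the intuition behind treating randomness as incompressibility
# rather than as error.
size_patterned = len(zlib.compress(patterned))
size_noisy = len(zlib.compress(noisy))
print(size_patterned, size_noisy)
```

Compressed size is only a crude upper bound on description length, but the asymmetry it reveals motivates Chaitin's view of a random number as one whose digits admit no shorter description than the digits themselves.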
Despite expressions like ‘webpages,’ which would suggest a static object (both an object and static), the internet is an interpenetration of multiple ‘temporal latencies.’65 In asynchronous scripting techniques such as AJAX (asynchronous JavaScript and XML), applications ‘continually respond to input and work through interrelated scripts, style-sheets and mark-up.’66 Their ‘geographically dispersed operations’ do not ‘resolve into a uniform, mechanical rhythm’; on the contrary, they “propagate a fluctuating momentum based on highly dispersed ‘data-pours’.”67 By definition, information is never first ‘composed’ then presented. It’s always already operationally active, which is to say that it is changing all the time. As Cécile Malaspina’s recent study of the epistemological consequences of Shannon’s account of information has shown, distinctions between information and noise in the transmission of a signal are external to the process of transmission itself.68 This means that the boundary between information and noise is shifting all the time in tandem with our knowledge practices. Dieter and Gauthier call the medium-inherent process tertium quid (third something), a form of subterranean interpenetration and communication—in Shannon’s sense of the word—through the intersection and binding of signals into reiterative sequences of action in the ‘milieu intérieur


of machines.’69 Micro-sensors, computational processors, and algorithmic operations environmentally transform the very possibilities for perception. This means that the temporal dimension of technical environments has a performative effect: it triggers new behaviours through plastic connections and transformations in and of different registers. For example, neural networks, in which connections are modulated through a (re-)distribution of weights that contribute to the tendency of neurons to fire through a function of the strength of the connection, are co-constitutive.70 Neural networks create media based on the mechanisms configured during training on input data. In supervised training, the model of emergence is consecutively monitored and modified, which has both empirical and significational relevance—understanding under what circumstances the networks change. An auto-productive developmental logic, which occurs in unsupervised learning, is fundamental to all neural networks. As Catherine Malabou has argued, in machinic operations, any notion of invariant repetition (automaticity) is accompanied by spontaneous movement, given that the ‘automatic’ in auto-production comes from the ‘double valence’ of automatism: as ‘involuntary repetition and spontaneous movement,’ as both ‘constraint and freedom.’71 Interactive algorithmic ecologies are not contingent or plastic in a consecutive, easily observable way but as perpetual oscillations between intelligibility and unintelligibility. In deep learning network architectures, neurons are connected through synaptic weights to neurons in deeper layers, which are connected to other neurons, in still deeper layers. In supervised and semi-supervised learning, the adjustment of weights forms part of processual programming; here human intervention alternates with the generative aspect of the networks. 
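The supervised adjustment of weights described above can be reduced to a toy example. The following is a deliberately minimal illustrative sketch, assumed for exposition and not any contributor's model: a single artificial neuron whose connection weights are strengthened or weakened in proportion to its error until it reproduces the logical OR function.

```python
# Minimal sketch of supervised weight adjustment (illustrative only):
# one neuron learning logical OR by redistributing its weights.

def step(x):
    """Threshold activation: the neuron 'fires' when its input is non-negative."""
    return 1.0 if x >= 0 else 0.0

def train(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]  # connection weights
    b = 0.0         # bias
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = step(w[0] * x1 + w[1] * x2 + b)
            err = target - out
            # Each connection is strengthened or weakened as a function of
            # its contribution to the error (supervised weight adjustment).
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Labelled training data: the logical OR function.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train(data)
```

In deep networks this same error-driven redistribution is propagated backwards through many layers at once, which is what makes the resulting configurations traceable at the micro level yet opaque in aggregate.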
The difference between such operations and what may be called ‘network contagions’ is that the former are semi-knowable, the latter unknowable. ‘Unknowable’ here means that in deep learning architectures, the various activations and weighted connections between thousands of nodes can be traced at the micro level, but there is, at this moment in time, no macro explanation. Rather, complex behaviours emerge from interactions between millions of cells. These interactions are programmed, but the combination of unit-level learning algorithms and their exposure to data, which allows them to configure themselves, are not. Algorithmic ecologies are therefore a chaotic operation in a state of almost-equilibrium. The levels of abstraction we have to toggle between in order to engage and make sense of phenomena like neural networks and the media they create and inform are considerable. As this volume both thematises and demonstrates, however, instructive strategies for coming to terms with the structures and processes operative here turn out to be hidden in plain sight. Take the ‘loose resemblance’ between neural networks and the human brain we indexed earlier. For all this image is hackneyed, dwelling with it turns out to be a useful


way of opening channels between the contemporary everyday and levels of abstraction that appear more remote from it, yet that are in fact deeply implicated as constitutive conditions: neural networks, for all the complexity they imply, are deeply embedded in sculpting much of the contemporary ‘everyday’ in networked societies—they obtain in the background across manifold interactions with digital technologies; as the cliché captures, they are modelled, at least in part, on processes obtaining in the human brain; and, by virtue of this relation of modelling/resemblance, they recursively feed back into how we conceptualise human beings and the brains, nervous systems, and artefacts that constitute them.72 It is a feature of highly specialised work in epistemology to thematise the key issues at stake here: recursivity, levels of abstraction, modelling, the relationship between propositionally expressed and ‘tacit’ knowledge, and links between the general and the particular, to name but a few.73 It is something further still, however—perhaps akin to a conjuring trick—to undertake the task of demonstrating how these ostensibly abstract, remote, and specialised matters are folded into our most ‘everyday’ technological artefacts and practices, in ways of which we can scarcely afford to be ignorant. If there is a key challenge that each of the chapters assembled in this volume undertakes it is precisely this one. The conjuring trick turns out to have to be a conjoining trick: that is, a way of shedding light on sometimes familiar phenomena through novel forms of assemblage, exemplification, analogy, and combination.
Viewed through these novel lenses, what hides in a contemporary phenomenon like the ‘loose resemblance’ between a brain and a neural network are not just further clichés, nor mere anthropomorphic or anthropocentric projections; it is rather a margin of indeterminacy and difference that requires the conjunction of a particularly focused set of themes and problems in order to be explored. In this volume, we have attempted such a conjunction under three headings: Social-Digital Technologies; Spatial, Temporal, Aural, and Visual Technologies; and Epistemic Technologies.

THE STRUCTURE OF THIS VOLUME

Part I, ‘Social-Digital Technologies,’ juxtaposes arguments for machinic and algorithmic indeterminacy to those of (over)determination in cognitive automation, blockchain, and digital ideology. The section opens in an existential register, with Ashley Woodward’s historical overview of the idea of information, traced from the metaphysical catastrophe of the death of God to the informational simulation of God in the Leibnizian Monad. In a move that enables both a philosophical and social reflection on the imbrication of information technologies in the visceralities of human existence, individual


and social, Woodward analyses the relationship of the probabilistic aspect of information to alterity, engaging, along the way, with the work of Luciano Floridi and Gilbert Simondon. Luciana Parisi continues the discussion of the (algorithmic and machinic) production of alterity through a comparison of contemporary digital brutalism with the aesthetic, functional, and architectural strategies of New Brutalism. Using digital decisionism as an example (where there is no difference between true and false, only between a faster and a slower—often equally illogical—decision), she proposes that transcendental instrumentality is rooted in the materiality of indeterminate machinic processes. For Parisi, the (social and operational) construction of technology as ‘Man’s means-to-an-end continuum’ de-values tools under the pretext that they have no soul. What is needed instead is a reappraisal of the ontological implications of the actual, material machinic processes. Aden Evens’s chapter opposes this view. Addressing Parisi’s notion of computational indeterminacy (as well as those of Fazi and Hui), he argues that the digital is deterministic. Digital determinism is, for Evens, rooted in an elaborate ideology based on positivism, rationalism, and instrumentalism. This ideology not only erodes all conditions for novelty but, more worryingly, also has a significant social dimension. The deterministic view is further elaborated in Alesha Serada’s analysis of blockchain technology. Here Serada argues that although contemporary blockchain technologies (on which emerging projects of digital governance are based) began as a reaction to algorithmic surveillance and control, they have now morphed into a form of ‘blockchain governmentality’ where repression and invisible violence are hidden behind the façade of democratic decision-making.
Similarly to Woodward’s opening chapter, part I closes in an existential register, with Franco ‘Bifo’ Berardi’s analysis of cognitive automation, and, in particular, its relation to the politics of financial indeterminism, the post-COVID-19 supply chain disruption, and a new crisis of abstraction. Highlighting the plasticity of the general intellect, Berardi argues that recent neuro-scientific advances in human-machinic intelligence applications should be seen in the context of the re-concretisation of biological matter and the derailment of cognitive productivity, in cognitive labour but also in panic and fear. Reassessing the role of all these factors in cycles of in- and re-determination, Berardi proposes a new methodological approach based on the techno-political determination of governance, and on tuning into contingent, chaotic events.

Part II, ‘Spatial, Temporal, Aural, and Visual Technologies,’ delves into received ideas about non-digital technologies such as those used for building spatial structures, manufacturing instruments, and constructing the aural and visual space. Bookended by Andrej Radman’s and Stavros Kousoulas’s architectural analyses, this section studies the technical and aesthetic stakes of the temporalities and spatialities of physical environments. Radman, to begin,


takes as his target a perspective on the built environment where the relation between agent and architecture is grounded in the supposed unities of space, time, and consciousness. Through a theoretical apparatus developed from Simondon’s notion of allagmatics and via figures including Félix Guattari and Rem Koolhaas, Radman proposes to treat architecture as an ecological practice that facilitates the production of collective subjectivities. Radman’s call to reinvest discourses of digitality with a pathic dimension is echoed by Oswaldo Emiddio Vasquez Hadjilyra, who provides a transhistorical juxtaposition of some means by which material reality has been treated as an object of measure and computation. Vasquez’s studies, stretching from Pythagoras’s account of an aural-mathematical harmony to contemporary digital image making, highlight how the temporalities of computation are at the same time techniques of material transformation. Iain Campbell and Natasha Lushetich, meanwhile, treat the diverse modes by which twentieth- and twenty-first-century artistic technologies and social-scientific technologies have met. Campbell explores a movement between transparency and opaqueness that has been characteristic of how musical instruments, understood as technologies, are conceived. Beginning with the insertion of objects into an everyday piano that rendered John Cage’s ‘prepared piano’ as a challenge to the aesthetic and social standing of that instrument, Campbell follows the thread of contingent musical technologies as they come to intersect with large-scale technological research. Lushetich, in turn, presents artistic dialogue with science as a means for challenging socio-scientific dogma around space, time, and change. 
As with Vasquez Hadjilyra’s account of computation, Lushetich treats the ‘artistic technologies’ developed by Marcel Duchamp, George Brecht, and the artistic collective Troika as attempts to challenge given orderings of the world, evoking a plasticity and indeterminacy of space-time. Such work, Lushetich shows, does not only perform a set of interventions into diverse fields (socioscientific, artistic, political), but suggests a transversal formulation of being to come. In dialogue with Radman’s concerns with the indeterminacies of architecture, Kousoulas closes the section and points toward the subsequent chapters on epistemic technologies with his exploration of ‘synapses,’ a notion through which architecture can be understood as a kind of delimitation—constraint—of the possible. Sharing with many of the authors here an explicit future-orientation, Kousoulas characterises architecture in terms of its capacity not only to produce forms, but to enact a sensitivity to outside information, and with this to intuit kinds of space and types of subject that do not yet exist.

Part III is entitled ‘Epistemic Technologies.’ Joel White’s chapter engages the work of Bernard Stiegler to develop an innovative reading of Immanuel Kant’s architectonic approach to regulative ideas. Applying this


methodological framework to the implications of thermodynamics, White offers a new way of unpacking the implications of the notorious ‘heat death of the universe’ for human and non-human forms of life. Peeter Müürsepp’s chapter takes up and develops the potential he sees in nuce in tantalising remarks that the physicist/chemist Ilya Prigogine made toward the end of his life, on the ‘bifurcation point’ for humanity implied in the digital revolution. Like White, Müürsepp shows how issues relating to entropy, dissipation, and the irreversibility of time can only remain ‘irrelevant’ or ‘abstract’ for forms of common sense (whether pre-philosophical or philosophical) that remain bound to anachronistic forms of Galilean/Newtonian classical physics. Operating at a more familiar level of abstraction, Stephen Dougherty offers an engagement with Catherine Malabou’s work. He charts two main axes of development: the theoretical sense of plasticity in Malabou’s work, as it develops out of her early work on Hegel (The Future of Hegel), through engagements with neuroscience, then through ‘plastic’ close readings of philosophers including Kant (Before Tomorrow), Heidegger (The Heidegger Change), and Derrida (Plasticity at the Dusk of Writing); but Dougherty also charts how another side of Malabou’s work (the engaged political work of The New Wounded, Ontology of the Accident, and What Should We Do With Our Brain?) relates to what, borrowing a term from Stephanie LeMenager, he calls our ‘petromodernity,’ of which plastic is a ubiquitous material manifestation. Continuing in this vein, Dominic Smith’s chapter adopts a phenomenological technique to investigate ‘everyday technologies.’ Smith considers everyday technologies in light of the COVID-19 pandemic, in a chapter that develops through a critical engagement with Benjamin Bratton’s controversial 2021 book, The Revenge of the Real: Politics for a Postpandemic World. 
Against Bratton, Smith contests the scope and purpose of three terms: ‘philosophy,’ ‘everyday technologies,’ and ‘the personal.’ Part III then concludes with a chapter that dialectically relates both highly abstract and highly concrete levels of abstraction: Sha Xin Wei’s account of mathematical, algorithmic, and social-aesthetic operations. This chapter continues the discussion of an integrated (human-machinic) existence but re-directs it to the conditioning occasions in which ensembles of people and machines produce sense-making. Engaging with Agamben’s concept of ‘destituent power,’ and distinguishing between the deterministic (as defined by algorithm and information), the ‘unpredictable’ (chaotic), the random (as modelled by stochastic arithmetic), and the irreducibility of life to evolutionary physics determined by pre-statable rules, Sha argues for a third space between material causality, language (as a social technology) and experience via Deleuze’s notion of sense and differential heterogenesis.

In the epilogue, David Zeitlyn distils many years of anthropological fieldwork among the Mambila people of Cameroon. In contrast to naïve


tendencies toward celebration of indeterminacy tout court, Zeitlyn’s epilogue offers an important example of a traditional practice—spider divination—that seeks to mitigate indeterminacy and uncertainty. Seen in light of the concern with thermodynamics and non-classical physics offered by White and Müürsepp, Zeitlyn’s epilogue offers a much-needed sense of both continuity and difference: as Zeitlyn shows, concerns with indeterminacy, death, and the (ir)reversibility of time are, on the one hand, manifest across diverse human cultures; on the other hand, Zeitlyn offers the volume a much-needed anthropological focus on a precise and localised non-Western practice, where the status of indeterminacy is moot.

Contingency and Plasticity in Everyday Technologies renders visible indeterminate ontologies—and their correlates, determination and over-determination—in and of historical architectural, sonic, visual, spatio-temporal, social, epistemic, and ontogenetic practices, situating digital indeterminacy in the wider context of technological transformation.

NOTES

1. Friedrich Nietzsche, Thus Spoke Zarathustra: A Book for All and None, translated by Alexander Tille (London: H. Henry and Co Ltd., 1896), 183.
2. Salvador Dalí quoted in Terry Riggs, ‘Salvador Dalí Lobster Telephone,’ Tate, 1998, np, public domain: https://www.tate.org.uk/art/artworks/dali-lobster-telephone-t03257.
3. Ulrich Beck, Risk Society: Towards a New Modernity (London and Los Angeles: Sage, 1992), 52.
4. See Edna Tan et al (eds.), Thinking about the Future: Strategic Anticipation and RAHS (Singapore: National Security Coordination Secretariat, 2008).
5. Janet Roitman, Anti-Crisis (Durham: Duke University Press, 2014), 39.
6. Niklas Luhmann, Risk: A Sociological Theory (New York: De Gruyter, 1993).
7. Ferdinand de Saussure and Jacques Derrida have called semiosis ‘the play of language’; for both, meaning functions independently of its reference, as language is governed by arbitrary conventional and differential aspects of signs that define it as a system. See Jacques Derrida, Margins of Philosophy, translated by Alan Bass (Chicago: University of Chicago Press, 1982).
8. Niklas Luhmann, Observations on Modernity, translated by William Whobrey (Stanford: Stanford University Press, 1998), 48.
9. Ibid.
10. See Jonathan Crary, Techniques of the Observer: On Vision and Modernity in the Nineteenth Century (Cambridge, MA: MIT Press, 1990).


11. Paul Virilio, Speed and Politics: An Essay on Dromology, translated by Mark Polizotti (New York: Semiotext(e), 1986), 46.
12. For Aristotle, non-essential properties of substances, those that manifest sporadically, are accidents. See Aristotle, Categories and De Interpretatione, translated by J.L. Ackrill (Oxford: Oxford University Press, 1975).
13. Paul Virilio, The Original Accident, translated by Julie Rose (London: Polity, 2007), 5.
14. Ibid, 6.
15. Norbert Wiener, God and Golem, Inc.: A Comment on Certain Points where Cybernetics Impinges on Religion (Cambridge, MA: MIT Press, 1966), 63.
16. Sylvère Lotringer and Paul Virilio, The Accident of Art, translated by Mike Taormina (Cambridge, MA: MIT Press, 2005).
17. See Bruno Latour, Reassembling the Social: An Introduction to Actor-Network Theory (Oxford: Oxford University Press, 2005). Such an emphasis on rupture is, of course, also present in Martin Heidegger’s work, and that of his contemporary inheritors, such as Graham Harman (see Martin Heidegger, Being and Time, translated by John Macquarrie and Edward Robinson [Oxford: Blackwell, 2001 (1962)]; Graham Harman, Tool Being: Heidegger and the Metaphysics of Objects [Peru: Open Court, 2002]).
18. Harvey Molotch, ‘Oil in Santa Barbara and Power in America,’ Sociological Inquiry, 40, no. 1 (1970): 131–44.
19. See Naomi Klein, The Shock Doctrine: The Rise of Disaster Capitalism (London: Picador, 2008 [2007]).
20. Brian Wynne, ‘Unruly Technology: Practical Rules, Impractical Discourses and Public Understanding,’ Social Studies of Science, 18, no. 1 (1988): 147–67.
21. Dahr Jamail, ‘Full Meltdown: Fukushima Called the “Biggest Industrial Catastrophe in the History of Mankind,”’ Al Jazeera, 16 June 2011, http://www.alternet.org/world/151328/full_meltdown%3A_fukushima_called_the%27biggest_industriaal_catastrophe_in_the_history_of_manking%27_/. See also Andrew Feenberg, Between Reason and Experience: Essays in Technology and Modernity (Cambridge, MA: MIT Press, 2010).
22. Wynne, ‘Unruly Technology.’
23. Charles Perrow, Normal Accidents: Living with High-Risk Technologies (New York: Basic Books, 1984).
24. Martha Nussbaum here refers to Aristotle’s explanation of technē, as based on the four axioms. See Martha Nussbaum, The Fragility of Goodness: Luck and Ethics in Greek Tragedy and Philosophy (Cambridge: Cambridge University Press, 2001).
25. Michel Foucault, ‘Nietzsche, Genealogy, History,’ in Donald Bouchard (ed.), Language, Counter-Memory, Practice: Selected Essays and Interviews (New York: Cornell University Press, 1977), 139–64, 146.
26. See Yuri Tynianov’s theory of literary evolution in Yuri Tynianov and Roman Jakobson, ‘Problems in the Study of Literature and Language,’ in Ladislav Matejka and Krystina Pomorska (eds.), Readings in Russian Poetics: Formalist and Structuralist Views (Dublin and Funks Grove, Illinois, 2002 [1971]), 79–80.
27. Bernard Stiegler, La technique et le temps (Paris: Galilée, 1994).


28. Ibid.
29. Gaston Bachelard, La psychanalyse du feu (Paris: Gallimard, 1949), 169.
30. Ibid.
31. Ibid, 19.
32. For Maurice Merleau-Ponty, flesh cannot be thought of as matter or substance but needs the old term ‘element’ such as water, air, earth, and fire because it is an element of Being, and, as such, both relational and transformational. See Maurice Merleau-Ponty, The Visible and the Invisible, edited by Claude Lefort, translated by Alfonso Lingis (Evanston: Northwestern University Press, 1968).
33. Cornelia Vismann, ‘Cultural Techniques and Sovereignty,’ translated by Ilinca Iurascu, Theory, Culture & Society, 30, no. 6 (2013): 83.
34. Ibid., 84.
35. Ibid.
36. Gilbert Simondon, Du mode d’existence des objets techniques (Paris: Aubier, 1989 [1958]).
37. See Yuk Hui, Recursivity and Contingency (Lanham and London: Rowman & Littlefield, 2019).
38. Gregory Bateson, Steps to an Ecology of Mind (San Francisco: Chandler Publications, 1972). See also Yoni Van Den Eede, The Beauty of Detours: A Batesonian Philosophy of Technology (New York: SUNY, 2019).
39. M. Beatrice Fazi, Contingent Computation: Abstraction, Experience, and Indeterminacy in Computational Aesthetics (Lanham and London: Rowman & Littlefield, 2018).
40. See https://www.serpentinegalleries.org/whats-on/pierre-huyghe-uumwelt/ and https://www.estherschipper.com/artists/41-pierre-huyghe/works/15049/.
41. See https://drib.net/perception-engines.
42. Quentin Meillassoux, ‘Métaphysique, spéculation, corrélation,’ in Ce peu d’espace autour: Six essais sur la métaphysique et ses limites, edited by Bernard Mabille (Paris: Les Éditions de la Transparence, 2010), 299.
43. Ibid.
44. Many of the automatic Surrealist practices, such as automatic writing, drawing, and frottage, were methods for excavating hidden layers of reality, which, when brought to the surface, formed ‘sur-reality.’ The term implied above-ness through imbrication, not elevation.
45. For a study of non-linear dynamics in the arts and science, see N. Katherine Hayles, Chaos Bound: Orderly Disorder in Contemporary Literature and Science (Ithaca: Cornell University Press, 1990).
46. Gilles Deleuze, ‘Postscript on Control Societies,’ in Negotiations, 1972–1990, translated by Martin Joughin (New York: Columbia University Press, 1995), 177–82.
47. Maurizio Lazzarato, Experimental Politics: Work, Welfare and Creativity in the Neoliberal Age, translated by Arianna Bove et al., edited by Jeremy Gilbert (Cambridge, MA: The MIT Press, 2017), 61–62.
48. Ibid, 39–40.
49. Cathy O’Neil, Weapons of Math Destruction (New York: Crown Publishing, 2016).


50. See, for example, Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (New York: St. Martin’s Press, 2018), and Davide Panagia and Çağlar Köseoğlu, “#datapolitik: An Interview with Davide Panagia,” Contrivers’ Review (2017), http://www.contrivers.org/articles/40/Davide-Panagia-Caglar-Koseoglu-Datapolik-Interview-Political-Theory/.
51. Franco ‘Bifo’ Berardi, ‘Simulated Replicants Forever? Big Data, Engendered Determinism and the End of Prophecy,’ in Big Data—A New Medium?, edited by Natasha Lushetich (London and New York: Routledge, 2020), 42.
52. Donald Hebb quoted in David Bates and Nima Bassiri, Plasticity and Pathology (New York: Fordham University Press, 2015), 195.
53. Ibid.
54. Ibid, 200.
55. William James quoted in Bates and Bassiri, Plasticity, 202.
56. Simondon, Du mode d’existence, 48.
57. N. Katherine Hayles, My Mother Was a Computer: Digital Subjects and Literary Texts (Chicago: Chicago University Press, 2005).
58. N. Katherine Hayles, Unthought: The Power of the Cognitive Unconscious (Chicago: Chicago University Press, 2017), 202.
59. Claude Shannon, ‘Communication in the Presence of Noise,’ Proc IRE 37, no. 1 (1949): 10–21.
60. Gregory Chaitin, Meta Maths: The Quest for Omega (London: Atlantic Books, 2005).
61. Cristian Calude quoted in Marcus Chown, ‘God’s Number: Where Can We Find the Secret of the Universe? In a Single Number!,’ in Randomness and Complexity: From Leibniz to Chaitin, edited by Cristian Calude (Singapore: World Scientific, 2007), 328.
62. See Mark B.N. Hansen, Feed-Forward: On the Future of Twenty-First-Century Media (Chicago: University of Chicago Press, 2015).
63. Donald MacKenzie, “How Algorithms Interact: Goffman’s ‘interaction order’ in Automated Trading,” Theory, Culture & Society, 36, no. 2 (2019): 48–49.
64. Ann-Christina Lange, ‘Organizational Ignorance: An Ethnographic Study of High-Frequency Trading,’ Economy and Society, 45, no. 2 (2016): 230–50.
65. Michael Dieter and David Gauthier, ‘On the Politics of Chrono-Design: Capture, Time, Interface,’ Theory, Culture & Society, 36, no. 2 (2019): 63.
66. Ibid.
67. Helmond quoted in Dieter and Gauthier, 63.
68. Cécile Malaspina, An Epistemology of Noise (London and New York: Bloomsbury Academic, 2019), 61.
69. Dieter and Gauthier, ‘On the Politics,’ 66.
70. Lonce Wyse, ‘Appreciating Machine-Generated Artwork through Deep-Learning Mechanisms,’ in Big Data—A New Medium?, edited by Natasha Lushetich (London and New York: Routledge, 2020), 102.
71. Catherine Malabou, Morphing Intelligence: From IQ Measurement to Artificial Brains, translated by Carolyn Shread (New York: Columbia University Press, 2019).


72. See Lambros Malafouris, How Things Shape the Mind: A Theory of Material Engagement (Cambridge, MA: MIT Press, 2013).
73. See, for instance, Luciano Floridi, The Logic of Information: A Theory of Philosophy as Conceptual Design (Oxford: Oxford University Press, 2019), and Christopher Norris, Epistemology (London: Continuum, 2005).

Acknowledgments

This publication is part of a research project entitled The Future of Indeterminacy: Datafication, Memory, Bio-Politics, funded by the UK Arts and Humanities Research Council (grant reference: AH/T001720/1). We are grateful to our funders for making the project, and this publication, possible. Our heartfelt thanks also go to our contributors for the many interesting conversations that have animated this venture, in verbal and written form. We acknowledge that Luciana Parisi’s chapter, ‘Transcendental Instrumentality and Incomputable Thinking,’ draws on previously published work. Sections of this chapter are modifications of ‘Reprogramming Decisionism,’ which appeared in e-flux, issue #85, in October 2017. We would also like to thank the Media Philosophy series editors, Beatrice Fazi and Eleni Ikoniadou, and Rowman & Littlefield’s team—Frankie Mace, Natalie Mandziuk, and Sylvia Landis—for all their help and support.


PART I

Social-Digital Technologies


Chapter 1

Information and Alterity: From Probability to Plasticity

Ashley Woodward

In this chapter, I propose to approach questions of determinacy and indeterminacy in the context of technologies from a broad philosophical perspective, taking the technical concept of ‘information’ as my central focus. There are competing theories of information, such as those proposed by R.A. Fisher, Norbert Wiener, and Andrey Kolmogorov. However, the one which has achieved dominance is that proposed by Claude Shannon in 1948.1 This theory, known as the Mathematical Theory of Communication, or simply Information Theory, is strictly speaking a theory of data transmission, where data are understood as quantitative, uninterpreted symbols (such as a 1 or a 0). However, its possible implications for a semantic theory of information—that is, what we ordinarily mean by information as ‘meaningful content’—were quickly pointed out by Warren Weaver, and subsequently developed in cybernetics and philosophy. As well as making the computer revolution possible, this theory has been extended to a variety of bold speculations, including Konrad Zuse’s digital physics and John Archibald Wheeler’s ‘It from Bit’ hypothesis, which in their own ways argue that all processes in the physical universe are informational in nature.2 The perspective I explore here is equally speculative, but is a more metaphysical one, with a dose of theology, even: following the thought of several philosophers, we will see the relevance of the idea of information from the metaphysical catastrophe of the death of God, to the elevation of human beings to the status of demiurge, to the informational simulation of God in a great Monad. While these metaphysical and theological terms might be thought fanciful, they serve the purpose of being a way to reflect on the implications of information technologies for the deepest concerns of human life.


Philosophers have disagreed radically about such implications. While some, such as Luciano Floridi and Gilbert Simondon, have seen the technical theory of information as having the potential to powerfully renovate philosophical concepts, many—from Martin Heidegger to Bernard Stiegler—have seen it as a threat to human thought and existence. In so far as Information Theory has been linked with meaning, such philosophers have seen it as a radical impoverishment of meaningfulness, a technocratic reduction of the richness of semantic quality to the abstractions of quantitative calculation. Following Friedrich Nietzsche, this threat to meaning can be called nihilism, and I will use this concept to frame the inquiry. The particular danger for meaning I will explore here is a threat to otherness, or alterity, which has frequently been pointed to by philosophers as an essential aspect of thought and life. This threat to alterity is an implication of the probabilistic nature of Information Theory: information in the technical, quantitative sense is a matter of calculating probabilities, with the apparent result that in a system with complete information—the equivalent of God—nothing unexpected, or other to the system itself, could take place. In response to these issues, I will argue here that a notion of information as indeterminate and plastic allows the preservation of alterity. To begin, let us consider a contemporary philosopher with a generally optimistic relation to information, who nevertheless links it in interesting ways with nihilism, which we will then be able to unpack: Luciano Floridi.

NIHILISM AND INFORMATION

Since the mid-1990s, Floridi has been working tirelessly to establish a new field in philosophy, the Philosophy of Information (PI).
In the first chapter of his general presentation of the topic, The Philosophy of Information (published in 2011), he outlines two approaches to PI, the ‘Analytic’ and the ‘Metaphysical.’ The metaphysical approach proposes that PI takes its meaning and relevance in the wake of a ‘metaphysical catastrophe’ which might also be called ‘the death of God.’3 While he doesn’t use the word ‘nihilism,’ following Nietzsche, this would be another name for this catastrophe. Floridi proposes that the context of contemporary philosophy is the death of what he specifies is a philosophical God, the God of René Descartes, defined as ‘a metaphysical guarantee of an objective, universal semantics that eventually harmonizes and gives sense to nature and history, culture and science, minds and bodies. . . . the ontic and rational foundation of any reality . . . the ultimate source of semantization’ needed to make the world and life ‘intrinsically meaningful and fully intelligible.’4


From the perspective of PI, epistemology can be seen as a kind of information theory, which has the task of deciphering the world, understood as God’s message. So the presumed existence of a philosophical God in this context means that ‘the message is guaranteed to make sense, at least in principle.’5 In the development of modern philosophy, God dies because the Ego (the human subject) begins to consider that it should itself be a sufficient ground for meaning. God is replaced by the Human. Yet successfully becoming that ground has proved an elusive, if not impossible, task. Floridi writes that ‘Nietzsche was right to mourn its [God’s] disappearance. Contemporary philosophy is founded on that loss, and on the ensuing sense of irreplaceable absence of the great programmer of the game of Being.’6 The main philosophical trends Floridi then visits in his potted history of modern and contemporary philosophy are German Idealism and Analytic philosophy. He presents German Idealism as ‘a series of titanic attempts to re-construct an absolute semantics by relying on very streamlined resources: the mind and its dialectics. The grand project is a naturalization of the I and an I-dealization of nature.’7 That is, German Idealism tried to give meaning to the world after the death of God by grounding everything in the mind, and by understanding everything as an evolution of mind. Without a God to guarantee the meaningfulness of the world, an abyss seemed to open between mind and world, subject and object, an abyss which seemed to many of Immanuel Kant’s immediate followers to be hypostatised in his idealism, which they sought to overcome. Johann Gottlieb Fichte, for example—to whom we will shortly return—explicitly sought to ground all meaning in the Ego. 
Floridi peremptorily judges that the project of German Idealism failed, and with it, philosophy gave up on the task of metaphysical grounding, of giving meaning to reality, and retreated to the defensive posture of Analytic philosophy: all philosophy can achieve is the analysis—the dissection and reconstruction—of the messages we already find in the world. In this way, Floridi suggests that the twentieth century’s ‘linguistic turn’8 represents the full acknowledgment of the untenability of the modern project of an epistemology that Cartesianly reads a world-message as though its meaningfulness is guaranteed. Rather, meaningfulness needs to be analysed, mistrusted, strictly adjudicated. The philosophical task becomes one of analysing ‘whatever semantics are left in a godless universe’: ‘The informee is left without informer.’9 In this context, ‘[p]hilosophers are dispatched to guard frontiers more and more distant from the capital of human interests.’10 This reaction of Analytic philosophy to the metaphysical disaster of the death of God, Floridi suggests, is like the reaction of a jilted lover. He equates it with an incomplete deicide, and beyond the letter of his text, I think we are here able to reconstruct his story in quite Nietzschean terms, invoking different types and manifestations of nihilism. Analytic philosophy then appears
as a kind of passive nihilism: that is, a recognition of the untenability of the ‘higher values’ which previously gave the world meaning, but a continued clinging, however unconscious, to those values as the only ones conceived as possible. What Floridi proposes with the emergence of PI is then equivalent to the completion of nihilism, an accomplished deicide, and the ensuing freedom and power to create new values. Floridi does not present his metaphysical story in quite these terms, but rather, in terms of the metamorphosis of the post-Cartesian Ego, from ‘an agent subject to nature and orphan of its god into a demiurge.’11 The demiurge, of course, is the creator of the cosmos in Plato’s Timaeus, but also (though Floridi doesn’t mention this) the usurper God who created our physical world in gnostic traditions. Floridi sums up as follows:

The history of contemporary philosophy may be written in terms of the emergence of humanity as the demiurgic Ego, which overcomes the death of god by gradually accepting its metaphysical destiny of fully replacing god as the creator and steward of reality, and hence as the ultimate source of meaning and responsibility.12

Floridi then tells a heroic story of the Human, who has finally begun to fully emerge from the dead God’s shadow. He calls this new image of the human homo poeticus, the human who brings forth, or in Floridi’s terms, who constructs reality. He then turns this into a story about the emergence of PI as that philosophy which is appropriate in the context of information technologies, understood as having a poetic, constructionist, semantising power, with which we can create and give meaning to our reality in unprecedented ways. He writes that one of the forces that lie behind the demiurgic turn is the Baconian-Galilean project of grasping and manipulating the alphabet of the universe. And this ambitious project has begun to find its fulfilment in the computational revolution and the resulting informational turn.13

Concomitantly, Floridi presents PI as no longer simply an analysis of existing meanings, but as having the task of semanticising reality, of making our world meaningful, through conceptual engineering, construction, or design. In sum:

Seen from a demiurgic perspective, PI can then be presented as the study of the informational activities that make possible the construction, conceptualization, semanticization and finally the moral stewardship of reality.14

To summarise, for Floridi information technologies help us respond to nihilism because they allow human beings to be constructionist demiurges, and to
‘resemantise’ reality, so filling the void of meaning left by the death of God. Floridi’s story is also highly suggestive of another way in which nihilism might be thought to manifest: one which Floridi does not seem to see, and which is posed by the very attempt to replace God. Let me introduce this other threat by way of what is often identified as the first philosophical use of the term ‘nihilism,’ in Friedrich Heinrich Jacobi’s ‘Open Letter’ to Fichte.15 Recall that Floridi construes German Idealism as an attempt to rebuild a world of meaning after the death of God by grounding meaning in the human mind, from which the real is thought to dialectically unfold. Jacobi’s letter to Fichte indicates at least one reason why this project might be thought to fail. Consequently, nihilism might be thought to result not only from the death of God, but from the attempt to rebuild a meaningful world in the wrong way. Fichte, to very briefly summarise, begins his Wissenschaftslehre [Science of Knowledge] by reasoning that while there are only two possible foundations for a complete system of philosophy, the subject and the object, we could never arrive at subjective experience by beginning with the object. Therefore, he begins his system of Transcendental Idealism by beginning with the subject—the I or the Ego—as the first principle, from which the objective world can supposedly then be derived. This derivation can be made, according to Fichte, on the basis of the mind’s necessary operations alone, so has no need to refer to the ‘thing in itself.’16 In short, Fichte’s Idealism is then a reconstruction of the world in the Ego’s own image, outside of which we can know nothing. Jacobi’s critique of Fichte is based on the former’s conception of philosophy as an excess of a certain type of reason, which he believes threatens what has true value: existence itself as more fundamental than any rational representation—the individual, freedom, and God.
Jacobi criticises Fichte for rationally reconstructing the entirety of existence on the basis of abstract concepts, and he claims that the rational representation of a thing annihilates the real existence of the thing itself that it represents. (For Jacobi, following Kant, the thing-in-itself cannot be accessed through reason, and he suggests that it can only be grasped through faith or belief). Jacobi writes:

For man knows only in that he comprehends, and he comprehends only in that, by changing the real thing into mere shape, he turns the shape into the thing and the thing into nothing. More distinctly! We comprehend a thing only in so far as we can construct it.17


A philosophical system such as Fichte’s, then, annihilates existence as something conceived as really existing outside the Ego’s own self-positing and its construction of the world on its own basis. (Note the link here with Floridi’s constructionist view of philosophy of information, a view he cites German Idealism as inspiring.) For Jacobi, the will to knowledge, understood in this way, is a ‘will that wills nothing.’18 Jacobi thus accuses Fichte of a nihilistic philosophy, which produces in him (Jacobi) a nihilistic despair. Jacobi’s condemnation of Fichte can be read as a reactionary theologism, which it certainly is. But in its general form, which may be secularised, it can also be read as an early avatar of post-Enlightenment critiques of reason: the over-extension of Reason can become nihilistic if it annihilates the Other(s) of reason which are necessary conditions (to phrase it transcendentally) of existential meaning. In terms of information, we can see the over-extension of Reason and the actualisation of metaphysics in information technologies as threatening the existence of anything outside or other to the world of its own construction. Interestingly, in a different context Floridi himself phrases the issue at stake here quite eloquently: ‘Hell is not the other, but the death of the other, for that is the drying up of the main source of meaning.’19 This theme of the eradication of otherness and the danger of closing ourselves within our own representations with information technologies was acutely presented by Jean-François Lyotard in the 1980s. Framing the issue through Kant’s theory of faculties, Lyotard understood the danger of technological nihilism as that of the eclipse of sensibility by reason (the sensible by the intelligible). On Lyotard’s account, the otherness which is being impoverished by the information revolution is essentially aesthetic or perceptual in nature. 
With the growth of science and its instantiation in technologies, we increasingly see and understand the world through a filter composed of our rational models. This casts doubt on the senses, and shuts us up more and more in images of the world which are the products of our own making. What this ‘aesthetic nihilism’ shuts us off from is the whole faculty of receptivity, the capacity to be open to something other, something coming from outside or elsewhere to the pre-processed environments of rationally and technologically mediated experience. Lyotard presents this threat to otherness by information technologies most powerfully in his image of the apparent ultimate goal of these technologies as the construction of a great Monad, in the Leibnizian sense, which would be the equivalent of God.20 In Leibniz’s Monadology, reality is conceived in terms of monads, which are simple substances. Monads contain reflections or representations of other monads that exist in the universe. They have memories, which allow them to test perceptions against past representations, to establish regularities, and to predict future occurrences. Leibniz presents a hierarchy of monads according to how clearly they represent, or have
knowledge of, the rest of the universe. He conceives God as the most perfect monad, which perfectly represents the entire universe. God exists outside of time and has knowledge of all times: His ‘memory’ is thus perfect, and His ‘future predictions’ are infallible.21 In updated parlance, monads may be perceived as storing and processing information, and God may be construed as the depository of perfect information, the ultimate databank: ‘God is the absolute monad to the extent that he conserves in complete retention the totality of information constituting the world.’22 By following this line of thought, Lyotard suggests that we may perceive the transformations of the world we are seeing with information technologies as heading toward the artificial construction of just such a great Monad, which would be the simulated equivalent of God insofar as it would achieve perfect information, including the perfect prediction of all future states and events. He writes: “[c]omputers never stop being able to synthesize more and more ‘times,’ so that Leibniz could have said of this process that it is on the way to producing a monad much more ‘complete’ than humanity itself has ever been able to be.”23 The danger that Lyotard perceives is that the tendency to stock information is the tendency to foreclose the possibility of anything unforeseen happening, or any alterity being received.24 The great Monad would be perfect, complete: nothing would happen, and there would be no new information. Or in fact, as Lyotard puts it: ‘For a monad supposed to be perfect, like God, there are in the end no bits of information at all. God has nothing to learn.’25 Far from being a new source of meaning, for Lyotard this informationally simulated God appears as a threat to meaning insofar as it threatens otherness. For Lyotard, as for many other philosophers (as we have broached previously), nihilism consists in the elimination of all true otherness.
This other is important, because it is, under its many guises, what has been presumed to be the immanent source of meaning once the transcendent Other, God, has been eliminated. If, as Floridi suggests, after the failure of German Idealism, Analytic philosophy retreated to a dissection and policing of the meaning that remained, European philosophy has continued to search out new sources of meaning, new others, often others of reason, such as the unconscious, the body, art, transgressive experience, and so on, thought in general as the event, the source of the new, the condition for the genesis of meaning. And European philosophy has continued to lament the destruction of the other, of the immanent sources of meaning that remain, often by the very Infosphere that Floridi celebrates as having the capacity to liberate us. To understand this threat to otherness, and its stakes, in more detail in the context of information, we may turn to the musings of Norbert Wiener on the metaphysical dimensions and implications of cybernetics.
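Lyotard’s conclusion that a perfect monad would receive ‘no bits of information at all’ can be restated in Shannon’s own terms, in which the information carried by an event falls as its probability rises. The following sketch is my illustration of that quantitative point, not an example drawn from the chapter:

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon's measure of the information carried by an event of
    probability p: the less probable the event, the more bits it yields."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must lie in (0, 1]")
    return -math.log2(p)

coin_flip = surprisal_bits(0.5)        # 1.0 bit: an ordinary, open outcome
rare_event = surprisal_bits(1 / 1024)  # 10.0 bits: the improbable informs most
foreseen = surprisal_bits(1.0)         # 0 bits: an event predicted with
                                       # certainty teaches nothing
```

On this measure, a system that assigned probability 1 to every event would, exactly as Lyotard says of the great Monad, register no information at all: God has nothing to learn.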


TYCHE AND ANANKE26

The nature of information as perfect knowledge and predictability contained in Lyotard’s ‘Leibnizian hypothesis’ raises issues of determinism and probability which were already given significant discussion by Norbert Wiener, the founding father of cybernetics and one of the inventors of technical information theory. In short, the issue to which the great Monad leads us is the problem of understanding how a perfect knowledge of all states and events is to be understood. Such pure prediction has historically often been treated as an issue of determinism. Wiener introduces a complication insofar as he argues that information machines are not deterministic, but probabilistic.27 He argues that this in fact makes no real difference for humanistic or religious concerns. However, we will later see, with Simondon, how conceiving information correctly as probabilistic, rather than deterministic, will allow a ‘margin of indetermination’ to persist in information technologies which will in fact place an internal limit on the formation of a great Monad and the eradication of all otherness, and so provide a response to this form of nihilistic danger.28

Let us proceed by first introducing Wiener’s argument. For modern philosophy, Newtonian physics, known as ‘dynamics,’ presented a problem as well as an opportunity. It presents a view of a clockwork universe, where every future condition is determined by and is predictable from past and present conditions. Where, in this clockwork, is there room for the qualities which we have deemed make human life meaningful, especially qualities such as free will, responsibility, creativity, and judgment? Kant recognised dynamics as a serious philosophical problem for human dignity, and so was driven to the solution of positing a ‘supersensible’ dimension, which we cannot know, but must posit as a practical ideal in order to carve out a region of indeterminacy in which freedom and autonomy can continue to live.
This is also true, in their own ways, for many other modern philosophers: some region of indeterminacy seemed to be required as a refuge for what makes human life worth living. Today, of course, we no longer live with the (complete) dominance of Newtonian mechanics. Physics has not abandoned it, but significantly complicated it. In the mid-twentieth century, Wiener told this story in terms of a shift from deterministic dynamics to statistical probability, inaugurated by thermodynamics and confirmed by his own cybernetics, the science of communication and control. He identifies Ludwig Boltzmann and J.W. Gibbs as the most important figures in this revolution: they introduced statistical probability into physics. The new science questions fixed causal laws as entirely determining systems. It is concerned with the positions and velocities of particles from which new systems start, and it recognises that the physical
measurements of these are never precise: what we know about the initial conditions of a system is their probability distribution. In Wiener’s words, ‘physics now no longer claims to deal with what will always happen, but rather with what will happen with an overwhelming probability.’29 What this means is that physics cannot escape uncertainty and the contingency of events. While cybernetics as Wiener conceived it is now in many respects outdated, aspects of his perspective remain influential, and it is especially interesting because he extended his ideas about science to what he saw as their philosophical implications. For Wiener, while deterministic sciences have been superseded by probabilistic sciences since the late nineteenth century, ‘from every point of view which has the slightest relation to morality or religion, the new mechanics is fully as mechanistic as the old.’30 That is, as he famously phrased this point, ‘Tyche is as relentless a mistress as Ananke.’31 Ananke is a Greek goddess who finds her Roman equivalent in Necessitas—she is the goddess of force, constraint, necessity, inevitability, compulsion, and, in their most deterministic sense, fate and destiny. Tyche, on the other hand, corresponds with the Roman Fortuna; she is the goddess of luck, fortune, chance, and fate and destiny in their most indeterministic sense (appeal to Tyche is made when no ‘natural’ cause for events can be discovered). Wiener uses Ananke to exemplify deterministic Newtonian dynamics, and Tyche the new sciences, which he characterises interchangeably as involving probability, contingency, chance, randomness, and incomplete determinism. Wiener ‘philosophises’ the results of the new physics by suggesting that it presents an Augustinian view, in which evil is simply a (negative) disorder in the universe (and not a positive, opposing force, as in Manichaeism).
And yet, he suggests, it does not give us Augustinian free will, and this is why ‘Tyche is as relentless a mistress as Ananke.’ The idea, then, is one of a paradigm shift: we have passed from the Aeon of Ananke to the Aeon of Tyche. Yet the philosophical problems seem to have persisted, despite changing form: the modern philosophical appeals to indeterminacy must now make appeal to improbability. If determinacy threatens human freedom, probability, I will argue, threatens otherness and the event, as we have already seen. Yet to understand how, we must pass through a critical reading of Wiener, on the way to finding a response to the threat of probability calculations with Simondon. For Wiener himself presents an image of cybernetics which threatens the same kind of nihilism as that of the Monad, outlined earlier. To see what is at stake here, we need only look to the way he draws values from the concepts of entropy and negentropy to inform his cybernetic view of the world. In thermodynamics, ‘entropy’ designates disorder, and ‘negentropy,’ order. In physical systems akin to heat engines, the second law of thermodynamics suggests an inevitable decrease in order (an increase in entropy). Translated into statistical probability, entropy or disorder is more probable, whereas
negentropy or order is less probable. Since shortly after the development of thermodynamics, this appeared as an object of concern from a broad philosophical perspective. Considering the universe as a whole, it suggests that life on earth is an exceedingly improbable and rare enclave of order, and likely a very temporary one. Moreover, the frightening idea of a ‘heat death’ of the universe—a future event in which all formed matter will disappear—seemed to make life meaningless, since all would come to naught.32 It deprives life of any final goal, purpose, or lasting value (and as such, seems to threaten a form of nihilism). From this perspective, Wiener writes:

In a very real sense we are shipwrecked passengers on a doomed planet. Yet even in a shipwreck, human decencies and human values do not necessarily vanish, and we must make the most of them. We shall go down, but let it be in a manner to which we may look forward as worthy of our dignity.33

And, somewhat more philosophically:

We are swimming upstream against a great torrent of disorganisation, which tends to reduce everything to the heat-death of equilibrium and sameness described in the second law of thermodynamics. . . . What Maxwell, Boltzmann and Gibbs meant by this heat-death in physics has a counterpart in the ethics of Kierkegaard, who pointed out that we live in a chaotic moral universe. In this, our main obligation is to establish arbitrary enclaves of order and system.34

From this perspective, then, in the Aeon of Tyche, the great threat to human meaning and dignity is the probability of disorder and nothingness (entropy), against which we must strive to create the value of improbable order and meaning. Wiener seems to anticipate Floridi in pointing to the heroic role of the Homo Poeticus, the creator of order after the death of God who is able to fashion a new reality in the Infosphere. And yet a complication in this seemingly clear-cut dichotomy of values quickly emerges. One of the main reasons that Wiener is able to present cybernetics as belonging to the physics of probabilities is that it concerns communication, and the Information Theory underlying this communication is probabilistic, its mathematics and key concepts being drawn from analogies with thermodynamics. Because of the way Wiener chooses to formulate it, he sees no complication: for him, positive information value is understood as negentropy, or the construction of order. This positive information value is something improbable: the less we expect the content of a message, the more it surprises us, and the more ‘informed’ we are. Yet Claude Shannon, who developed much of the important mathematics of the theory, didn’t quite see it this way, and insisted on calling information value entropy (rather than negentropy).35 Despite this difference, Wiener and
Shannon could agree that they were talking about the same thing, and from a mathematical and engineering perspective, it does not really seem to make any significant difference. Not so, however, if we follow Wiener and others in trying to draw out some philosophical interpretations of the significance of these sciences of probability and the values we should attach to them. Among a great variety of philosophical interpretations, a common one is largely the opposite of Wiener’s in the values attached to terms: as we have seen, for the cyberneticist, positive information value, or meaning, must be understood as negentropy. However, some philosophers have insisted that what is most valuable in terms of ‘meaning’ is the opposite, entropy, which might equate just as much with noise, distortion, and failure in communication, because these contribute to unexpectedness or surprise, and the generation of the improbable in a communicational system. As we have already broached, in the terms of much French philosophical thought, what is at stake in meaningful systems in general is their disruption and change by an event.36 This equation—the improbable = the event = entropy—contrasts markedly with Wiener’s equation—the improbable = the ordered system = negentropy. And we find such contrasts among philosophers too; for example with Lyotard, who associates the improbable with entropy, and Bernard Stiegler, who associates it with negentropy.37 What appears then is a kind of antinomy: when the improbable is understood as the valuable, should it be interpreted as entropy or negentropy, disorder or order? I believe there are a number of useful ways this antinomy might be resolved, but here I want to indicate just one. I believe we can see a kind of inversion take place at different levels in the Tychic world view of probabilities. 
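One way to see why the cliché counts as redundancy rather than entropy in Shannon’s framework is to estimate a message’s per-symbol entropy from its own symbol frequencies: total repetition scores zero, while maximal variety scores highest. The following is a rough sketch of mine, not an example from the text:

```python
import math
from collections import Counter

def entropy_bits_per_symbol(message: str) -> float:
    """Estimate the Shannon entropy of a message from its own symbol
    frequencies: H = -sum(p * log2(p)) over the symbols that occur."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

pure_cliche = entropy_bits_per_symbol("aaaaaaaa")   # 0 bits: total redundancy,
                                                    # pure order, no surprise
some_variety = entropy_bits_per_symbol("abababab")  # 1.0 bit per symbol
max_variety = entropy_bits_per_symbol("abcdefgh")   # 3.0 bits per symbol:
                                                    # maximal 'disorder'
```

On this estimate the cliché sits at the ordered, probable pole: from inside a semantic system, Information Theory treats it as redundancy that consolidates the system, not as entropy that disrupts it.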
As we have seen, at the cosmic level, and with all matter in general, the most probable thing is entropic disorder, and the least probable, negentropic order. Yet in the order of human consciousness, language, sign systems, and meanings of all kinds, the opposite seems to be true: the most probable thing is the cliché, the redundant thinking and expression of that which has already been thought and is well-structured into a sedimented system, while the least probable is the unthought, the event, the genuinely new. Wiener writes: ‘the more probable the message, the less information it gives. Cliches, for example, are less illuminating than great poems.’38 This is very true, but can we really believe, as Wiener’s theory seems to suggest, that cliches should be understood as entropic, as creating disorder in a system of human meaning? Quite the contrary: from the point of view of Information Theory, a cliché would be a redundancy, something which acts to shore up and consolidate the order of the semantic system. On this view, most systems of communication work with only a small degree of improbability, and function for the most part to establish and reinforce probabilities. We might propose that in the order of human meaning, cosmic values are reversed: order is the most probable, disorder the least
probable. Lyotard suggests that from the cosmic perspective the human brain is a highly improbable aggregate of matter, and that which is capable of producing the most complexity39: we can well imagine then that its secretions, too—its thoughts and semiotic systems—are, from the cosmic perspective, highly improbable. But considered from a perspective interior to its own semantic system, what is most probable is its own ordered structure, while the improbable are those new thoughts which have the character of contingent events, of disordered, entropic occurrences which, when they are extreme enough, force the system itself to change and become something new. It is here that we again meet Lyotard’s notion of information technologies as complexifying memories, increasing predictabilities, toward the apparent goal of realising the great Monad. If Tyche is just as harsh a mistress as Ananke today, it is at least in part because we are in fact subjected to the kind of regime of calculated probabilities that Lyotard envisaged, through the combination of information technologies and capitalist economics which has been variously theorised as algorithmic governmentality or surveillance capitalism. Nihilism is, it seems, beginning to take the form of the great Monad, and it is imperative that we try to understand how it is possible to resist it. I believe that we can find at least a partial solution, involving a limit to the great Monad, in Simondon’s notion of the margin of indetermination.

THE MARGIN OF INDETERMINATION

Simondon introduces the notion of the ‘margin of indetermination’ in his seminal book in philosophy of technology, On the Mode of Existence of Technical Objects.40 What he calls a margin of indetermination operates internally to information, as well as in machines regulated by information flow.
It is a ‘median’ concept, falling between extremes, and a variable one: the margin of indetermination in information itself or in the machines it regulates can be increased or decreased. Information requires a margin of indetermination because, as Claude Shannon suggested, and we have already noted, it is a measure of surprise: information must provide something previously unknown, something which has not been determined in advance, or it would not be informative, and in this sense, information must involve some degree of contingency. However, an excessive contingency would be indistinguishable from the complete randomness of noise, and would not be informative either. Simondon’s identification of the margin of indetermination has a solid basis in Information Theory, and the material problems of information transfer. For such transfer to be successful, information must be reliably distinguishable from the noise (random fluctuations irrelevant to the signal sent as
message) which haunts any channel. The margin of indetermination needs to be decreased when a noisy channel threatens to drown out the clarity of the message. For example, a radio receiver reduces the margin of indetermination by isolating a particular frequency as that in which the message of the radio broadcast is to be discerned. Information, then, lies between these extremes. Simondon explains: This opposition represents a technical antinomy that poses a problem for philosophical thought: information is like the chance event, but it nevertheless distinguishes itself from it. An absolute standardisation, excluding all novelty, also excludes all information. And yet, in order to distinguish information from noise, one takes an aspect of the reduction of the limits of indeterminacy as a basis.41
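Simondon’s example of the radio receiver has a precise analogue in information theory: the Shannon–Hartley theorem, under which a channel’s capacity collapses toward zero as noise drowns the signal, so that below a certain margin nothing informative can pass at all. A minimal sketch (the function name and the sample bandwidth and power figures are illustrative assumptions, not drawn from Simondon’s or Shannon’s texts):

```python
import math

def channel_capacity(bandwidth_hz: float, signal_power: float, noise_power: float) -> float:
    """Shannon-Hartley capacity of a noisy channel, in bits per second."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# A fixed signal progressively drowned by noise: capacity falls toward zero,
# i.e. the channel carries less and less that can count as information.
for noise in (0.1, 1.0, 10.0, 100.0):
    print(f"noise power {noise:>5}: {channel_capacity(1000, 1.0, noise):8.1f} bit/s")
```

The philosophical point survives the formalism: reducing the margin of indetermination (isolating a frequency, raising the signal-to-noise ratio) is what keeps the message distinguishable from the random fluctuations of the channel.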

Simondon summarises the key point here as follows: ‘Information is thus halfway between pure chance and absolute regularity.’42 Similarly, machines require a margin of indetermination if they are to receive information and be regulated by it. This mechanical margin of indetermination means that there are contingent, variable states of the machine, which can be determined according to the information it receives. Simondon poses ‘open’ machines, those with a margin of indetermination which allows them to receive information, and to form ensembles with other machines and the human operators who manage them, as an ideal superior to the automatism the cyberneticists advocate. For Simondon, the automaton is in fact an impossible fantasy: it is the idea of a fully closed, self-sufficient, determined machine which would operate like an open machine. However, without a margin of indetermination, the automaton could not receive information and could not regulate itself. The margin of indetermination that Simondon identifies might be understood as a kind of internal logical limit, which prevents the technological nightmare of the great Monad that Lyotard envisages from ever being realised. In fact, Simondon expresses this idea in Leibnizian terms himself:43 If the time bases were truly incorruptible like Leibniz’s monads, then one could reduce the synchronization time of the oscillator as much as desired; the informing role of the synchronizing pulse would entirely disappear, because there would be nothing to synchronize: the synchronization signal would have no aspect of unpredictability with respect to the oscillator to be synchronized; in order for the informational nature of the signal to subsist, a certain margin of indeterminacy must subsist.44
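Simondon’s ‘halfway between pure chance and absolute regularity’ can be glossed in Shannon’s own terms: the entropy of a source is zero when one symbol is certain, maximal when all symbols are equiprobable (indistinguishable from noise), and information-bearing sources lie between the two. A minimal sketch, with illustrative probabilities of my own choosing rather than anything from the primary texts:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the average 'surprise' of a source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Absolute regularity: one symbol is certain, nothing is ever surprising,
# so no information is conveyed.
print(entropy([1.0, 0.0, 0.0, 0.0]))   # 0 bits

# Pure chance: all four symbols equally likely, maximal uncertainty
# (2 bits), indistinguishable from noise.
print(entropy([0.25, 0.25, 0.25, 0.25]))

# A structured but still surprising source sits between the two extremes:
# Simondon's margin of indetermination.
print(entropy([0.7, 0.1, 0.1, 0.1]))
```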

Simondon’s notion of the margin of indetermination indicates why the Infosphere could not be a closed, fully deterministic system of fully calculated and predicted probabilities; it shows a necessary remainder which resists
the kind of nihilism that we have seen earlier. The margin of indetermination shows why information is a median concept, with two internal limits at each end of a spectrum, neither of which can be crossed without information ceasing to be information. The margin of indetermination indicates an irreducible ‘otherness’ harboured at the heart of information itself, which requires that informational systems must be open to a degree of otherness (unpredictability, chance, surprise, event) in order to function as informational systems. However, while the margin of indetermination indicates a remainder which must resist technological determinism at a limit point, there are good reasons to think that it is not sufficient to save us from much that the Monad threatens. And this is why, perhaps, Tyche is ultimately as relentless a mistress as Ananke. As post-structuralist studies of systems have made familiar, a small margin of indetermination may be just enough to ‘lubricate’ the workings of a system which is highly regulative and relatively unchanging (homeostatic, in cybernetic terms).45 Such otherness might be necessary, yet it might be so reduced that much of what we would hope indeterminism and contingency give room for—creativity, thought, the radically unforeseen event—is nevertheless effectively quashed. The upshot of this is that we cannot be complacent about the real difficulties seen from this metaphysical perspective on the information society. The margin of indetermination gives us a space, but it must be sufficiently wedged open and widened to allow for the things that the great Monad, with its probabilistic nihilism, threatens to foreclose. The intellectual part of this task is to reconceive or design the notion of information itself to understand its role in technologies, societies, and the relation between them. Simondon argues that technologies give us values and inform the relations between the human and the world.
Following the metaphysical and theological perspectives explored here, then, we have good reasons to exploit and develop the notion of information as involving necessary dimensions of indeterminacy, contingency, and plasticity, as Simondon already realised: One can say that form, conceived as absolute spatial as well as temporal regularity, is not information but a condition of information; it is what receives information, the a priori that receives information. Form has a function of selectivity. But information is not form, nor is it a collection of forms; it is the variability of forms, the influx of variation with respect to a form. It is the unpredictability of a variation of form, not pure unpredictability of all variation.46

It is this ‘in-between’ of contingency and regularity, unpredictability and predictability, that constitutes the ‘influx of variation with respect to a form,’ which we can call plasticity. After the death of God, then, our task is to reconstitute meaning in a way which avoids the reign of Tyche in her harshest aspects, and
the conclusion to which this peregrination in thought has led is that designing a notion of information as plasticity is an essential part of this task.

NOTES

1. Claude Shannon and Warren Weaver, The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1964).
2. Konrad Zuse, ‘Calculating Space,’ in A Computable Universe (Singapore: World Scientific, 2012); John Archibald Wheeler, ‘Information, Physics, Quantum: The Search for Links,’ in Complexity, Entropy, and the Physics of Information, edited by Wojciech H. Zurek (Redwood City: Addison-Wesley, 1990).
3. Luciano Floridi, The Philosophy of Information (Oxford: Oxford University Press, 2011).
4. Floridi, The Philosophy of Information, 20.
5. Ibid.
6. Ibid.
7. Ibid.
8. The widespread tendency of twentieth-century philosophers to make a methodological focus on the analysis of language the basis for treating philosophical problems.
9. Floridi, Philosophy of Information, 21.
10. Ibid.
11. Ibid, 22.
12. Ibid, 23.
13. Ibid.
14. Ibid.
15. Friedrich Heinrich Jacobi, ‘Jacobi to Fichte,’ in The Main Philosophical Writings and the Novel Allwill, edited and translated by G. di Giovanni (Montreal: McGill-Queen’s University Press, 1994).
16. A reference in Kant which, like many, he sees as problematic.
17. Jacobi, ‘Jacobi to Fichte,’ 507–08.
18. Ibid, 515, 516.
19. Luciano Floridi, The Ethics of Information (Oxford: Oxford University Press, 2013), 332.
20. See in particular the chapters ‘Matter and Time’ and ‘Time Today’ in Jean-François Lyotard, The Inhuman: Reflections on Time, translated by Geoffrey Bennington and Rachel Bowlby (Cambridge: Polity, 1991).
21. Gottfried Wilhelm Leibniz, ‘Monadology,’ in Philosophical Papers and Letters, edited and translated by Leroy E. Loemker (Chicago: University of Chicago Press, 1956).
22. Lyotard, The Inhuman, 60.
23. Ibid, 64. Translation slightly modified to correct an apparent typographical error.
24. On Lyotard and time see also Emiddio Vasquez’s chapter in this volume, ‘Computation and Material Transformations in Media.’
25. Lyotard, The Inhuman, 65.
26. See also Stavros Kousoulas’s discussion of Ananke, determinism, and constraint in this volume: ‘Ananke’s Sway: Architectures of Synaptic Passages.’
27. Norbert Wiener, Cybernetics, or Control and Communication in the Animal and the Machine, second edition (Cambridge, MA: MIT Press, 1961), chapter 1.
28. Gilbert Simondon, On the Mode of Existence of Technical Objects, translated by Cecile Malaspina and John Rogove (Minneapolis: Univocal, 2016).
29. Norbert Wiener, The Human Use of Human Beings: Cybernetics and Society (Burlington: Da Capo Press, 1988), 10.
30. Wiener, Cybernetics, 44.
31. Ibid, 38.
32. On heat death, see Joel White’s chapter in this volume, ‘Outline to an Architectonics of Entropy: The Entropic Indeterminacy of Life.’
33. Wiener, Human Use of Human Beings, 40.
34. Norbert Wiener, I Am a Mathematician: The Later Life of a Prodigy (Cambridge, MA: MIT Press, 1971), 324.
35. Shannon and Weaver, Mathematical Theory of Communication.
36. To mention one interesting and relevant example, Jacques Lacan, in Seminar VII, associates Tyche with the event, which for him is the chance encounter with the real. Jacques Lacan, The Ethics of Psychoanalysis, edited by Jacques-Alain Miller, translated by Dennis Porter (London: Tavistock/Routledge, 1992).
37. See Lyotard, The Inhuman, and Stiegler, The Neganthropocene, edited and translated by Daniel Ross (London: Open Humanities Press, 2018).
38. Wiener, Human Use of Human Beings, 21.
39. Lyotard, The Inhuman, 61, 62.
40. He develops this idea primarily in part II, chapter 2, section III of Mode of Existence, 147–59.
41. Simondon, Mode of Existence, 149. Translation slightly modified.
42. Ibid, 150.
43. We can take these Leibnizian treatments of information to follow Wiener, who identified Leibniz as the ‘intellectual ancestor’ of cybernetics. Human Use of Human Beings, 19.
44. Simondon, Mode of Existence, 149–50.
45. A good example, in which technologies of information and communication play a prominent role, is Jean Baudrillard’s notion of ‘cold’ or ‘ludic’ seduction. See Baudrillard, Seduction, translated by Brian Singer (New York: St. Martin’s Press, 1990), 157–78.
46. Simondon, Mode of Existence, 150.

Chapter 2

Transcendental Instrumentality and Incomputable Thinking Luciana Parisi

The New Brutalist architecture of the 1950s to the 1970s activated a programme of techno-social living that aimed to abolish the sentimental attachment to the end of the ‘spiritual in Man’ and transcend the norm of architectural expression through an uninhibited functionalism, expressing the crudity of structure and materials.1 Not only functionalism but also a-formalism and topology were part of the procedural activities that tried to surpass the contemplation of the end of structure by reimagining the spatial experience of truths and facts. With New Brutalist design all media were levelled together in order to connect their specific content in an a-formal dimension—one that could work through the incompleteness of a total image of sociality. The adjunction of the material world of animate and inanimate media preserves the dissembling complexity of these discrete parts that revise mediatic forms of truths and facts in multiple directions. The architecture of New Brutalism turns instruments of thinking into concrete, mass-modular, interconnected blocks and self-contained cells, elevated above the local territory, united by networks of corridors across blocks of buildings. Here, instrumental reasoning transforms the entropic dissolution of the post–World War II period, holding the concrete weight of the past in order to dissolve it into structural experiments of task-orientated functions and aesthetic transparency. If New Brutalism had a vision that facilitated a weaponisation of information, it was not simply to advance information technology, but rather to propose modes of instrumental reasoning that work through entropy, randomness, or noise to reprogramme codes and values, passages and bridges, contents and expressions of a united image of the social. These design attempts at reinventing the sociality of instrumental reasoning,
which can be found across histories, cultures, and aesthetics, suggest a view of instrumentality beyond the mere teleology of means-to-ends. In the current political climate of post-truth and post-facts, the New Brutalist architecture may also offer an alternative envisioning of instrumentality that responds to what has been defined as the brutality of computational thinking2 running through the algorithmic choreography of consent on the networked platforms of social media. The algorithmic stirring of affective responses becomes recorded data that is abstracted from our selection of this or that music track, this or that pair of shoes, this or that movie from streaming websites. The brutality of thinking here coincides with the lack of critical insight into the current apparatus of power whereby technology conceals the human condition, which now seems trapped in a seamless apparatus of programmed decision-making. In Infrastructural Brutalism: Art and the Necropolitics of Infrastructure, Michael Truscello discusses brutalism as ‘the historical context in which industrial capitalism has met the limits of its expansion and domination’ yet continues to invest in the production and consumption of oil, roads, and paved surfaces.3 The brutality of thinking concerns not only the crisis of truth, but also the perpetuation of the truth of a persistent hyperindustrial capitalism involving the necro-social expansion (the production of death at large scale) of its toxic infrastructure. The new brutality of algorithmic post-truth builds upon the brutality of industrial capitalism that has been alimented by the accumulation of value since the project of slavery. As Achille Mbembe points out, life itself is now treated as a computable object.
He asks: ‘[w]hat remains of the human subject in an age when the instrumentality of reason is carried out by and through information machines and technologies of calculation?’4 The brutality of computation is seen as a continuation of colonisation where the world corresponds to a data field awaiting extraction and where knowledge requires no critique but mere data collection fed into the operations of governance, military machines, and mega-corporations. And yet the computational machine of post-truth does not follow the self-determining logic of modern subjectivity that constituted the fortunes of industrial capitalism and its brutality. Instead, the computational machine has become the site for the production of an internal, binary logic of either/or as well as a predictive logic of yes/no and maybe, that first of all requires an algorithmic compression of randomness. Data is never ‘given’ but is the result of deciphering pattern-less volumes of information.5 One could argue that insofar as post-truth politics is afforded by computational machines, this machine is no longer digital, because it is no longer primarily concerned with mathematical proof or verification.6 Computation is instead meta-digital: it is a process-orientated, forward-looking mode of decision. It is ‘meta’ because it lies above and below the digitalisation of the world into binary positions
by running predictive scenarios that generate the possibilities of compressing indeterminacies into complex states. The algorithmic abstraction of social relations, affective states, and behavioural tendencies produces not a truth, but sharable socio-technical conditions determined by interactive automations. The meta-digital designates the point at which computation shifts from the automation of information compression toward the generation of information, whereby data becomes instrumental to the transformation of truths. The meta-digital machine of post-truth politics employs a heuristic testing—the inductive search for proofs—in order to learn from how conducts evolve, change, and adapt. This is not simply a statistical calculation of probabilities following already established trends in data usage; it is demarcated by a bold indifference toward the data retrieved and transmitted. Data here only serves as a propeller from which the computational machine can fish out proofs for truths that do not yet exist. The granular algorithmic analysis opens up the potential of content to be redirected for purposes that are not pre-known. In other words, this computational indifference to the binary logic of problem-solving (the negotiation of 0s and 1s) is meta-digital because it follows the logic of technological decisionism, for which making a clear decision quickly is more valuable than making a correct one. In decisionism, what is most decisive is not what is most reasonable. There is nothing to verify (wrong or right, true or false); what matters is the decisive act itself, which counts as the truth. When Benito Mussolini delivered the 1925 speech in the Italian parliament in which he took full responsibility for the murderous chaos his regime had created, he challenged his opponents to remove him from power.
Mussolini’s proposition was decisionist; it didn’t follow the binary logic of the ‘if responsible’ (0) ‘then resign’ (1) but rather ‘stay in power’ because ‘responsible.’ Decisionism bypasses the logic of contradiction: the truth and its negation are the same. Similarly, the new brutality of the meta-digital logic entails a technological decisionism, which shows that machines make these speeches for us: while automated decision is responsible for fake news, for instance, it continues to hold onto the idea of true statements, too. This chapter brings together architectural theory and critical theory of technology and computation to address the complex formation of an intelligent infrastructure of automated decision-making, which exposes the need to face the presence of inhuman thinking at the limits of Man’s reason. What is at stake here is the way Man’s reason has constructed technology as a means to an end, suppressing, abstracting, and extracting the value of mediating vessels under the premise that instruments are merely tools and have no soul. What the critique of instrumental reason overlooks is precisely this assumption and its ontological implications for the racialisation, gendering, and sexualisation of machines.
DECIDE NOT TO DECIDE AND THE RESULTS WILL COME

The history of communication that obtains results by undoing truths and by fabricating rather than discovering facts must include at least three historical moments in the development of machine intelligence and the formation of the meta-digital machine. First, the period from the 1940s to the 1960s, involving the rise of the cybernetic infrastructure of communication and the introduction of computational logic into decision-making procedures; second, the 1970s and 1980s, which saw a shift toward interactive algorithms and expert and knowledge systems; and third, from the post-1980s to the post-2000s, characterised by a focus on intelligent agents, machine-learning algorithms, and big-data logic. As these forms of automated intelligence entered the social culture of communication, they became central to a critical theory of technology that incessantly warns us against the automation of decision, where information processing, computational logic, and cybernetic feedbacks replace the very structure, language, and capacity for thinking beyond what is already known. In his 1969 essay ‘The End of Philosophy and the Task of Thinking,’ Martin Heidegger argues that since the late 1940s, the advance of cybernetics—a techno-science of communication and control—has demarcated the point at which Western metaphysics itself reaches a completion.7 This means not only that philosophy becomes verifiable and provable through testing, but also that scientific truths become subsumed to the effectiveness of results. By replacing judgment based on the supposition of categories with the efficiency of truth-states carried out by machines, the instrumental reasoning of cybernetics fully absorbs Western metaphysics. Here, ideas are not simply demonstrated or proven but processed as information leading to more information.
The new techno-science of communication activates a new language of thought embedded in the information circuits of input and output, whereby actions are programmed to achieve a series of results that become the input for more outputs. If, according to Heidegger, the end of philosophy is ‘the gathering in of the most extreme consequences,’ it’s because the development of the sciences and their separation from philosophy has led to the transformation of philosophy into ‘the empirical science of man.’8 Nowhere is this more tangible than in the advance of cybernetics and its concerns with ‘the determination of man as an acting social being.’9 Cybernetics, for Heidegger, is ‘the theory of the steering of the possible planning and arrangement of human labor’; it ‘transforms language into an exchange of news. The arts become regulated-regulating instruments of information.’10 As philosophy becomes a science that intercommunicates with others, it loses its
metaphysical totality. The role of explaining the world and the place of ‘man in the world’ is finally broken apart by technology. Under this new condition of the techno-erasure of metaphysical truth, Heidegger insists that the new task of thinking will lie outside the distinction of the rational and the irrational. Because thinking always remains concealed within the irrationality of systems, it cannot actually be proven to exist. From this standpoint, and echoing Aristotle, he poses the question of how it is possible to recognise whether and when thinking needs a proof, and how what needs no proof will be experienced. For Heidegger, however, only un-concealment—the condition in which thinking cannot be unconcealed—would coincide with the condition of truth.11 Here, truth will not imply the certainty of absolute knowledge and will not belong to the realm of scientific epistemology. From this standpoint, since the cybernetic regime of technoscientific knowledge is mainly concerned with the achievement of results, it can tell us nothing about truth, as the latter entails the unconcealment of what cannot be demonstrated (because thinking always hides in the irrationality of systems). Truths, therefore, must remain outside what is already known. This is why, in the age of meaningless communication, according to Heidegger, one must turn the task of thinking into a mode of education in how to think.12 It’s precisely a new envisioning of how to think in the age of automated decision that returns to haunt post-truth politics today. We are at an impasse: unable to return to the deductive model of ideal truths, but equally unable to rely on the inductive method or simple fact-checking to verify truth.13 How do we overcome this impasse?
It’s difficult to shift perspective on what techno-politics can be without first attempting to disentangle this fundamental knot involving philosophy and techno-science, which is still haunted by Heidegger’s proposition that the transformation of metaphysics—of the un-demonstrable condition of thinking—into cybernetic circuits of communication demands an articulation of thinking outside reason and its instrumentality. The heritage of this critique of thought seems to foreclose the question of instrumental reasoning today whereby artificial intelligence—or bots—has re-coded critical perspectives on the ideology of truth and the fact-checking empiricism of data. Instead of declaring the end of metaphysical thinking and its completion in instrumentality, it’s important to re-enter the critique of instrumental reasoning through the backdoor, by re-opening the question of how to think in terms of the means through which indeterminacy, randomness, and the unknowns have become part of the techno-scientific know-how, the ‘improper’ reasoning of machines.
LEARNING TO LEARN TO THINK

We could start by looking more closely at the historical attempts in cybernetics and computation between the late 1940s and the 1980s to bring forward models of automated intelligence that did not rely on the deductive logic of known truths. As the application of inductive data retrieval and heuristic testing shifted the focus of artificial intelligence research from a mode of validation to a mode of discovery, critical theory’s assumptions that techno-science exhausted metaphysics with the already thought had to be revised. It is precisely the realisation of the ontic limit of techno-science that pushed computation away from symbolic systems and toward experimenting with knowhows—that is, with machines learning how to learn. This meta-processing of how to learn is now central to the curatorial bot-to-bot image of social communication in the age of post-truth. With computation, the rational system of Western metaphysics is not simply actualised; the eternal ground of truth here enters the vicissitudes of material contingencies. Cybernetic instrumentality replaces truth as contemplative knowledge with the means of knowing, and announces a metaphysical dimension of machine knowledge originating from within its automated functions of learning and prediction. And yet, in this seamless rational system, one can no longer conceive of thinking as that which remains unconcealed in the invisible gaps of a transparent apparatus of communication. The problem of instrumentality is no longer that machines cannot unlearn or un-know what they have learned. Machine learning does not demarcate the reverse of un-concealment (or causal efficacy). For Heidegger, questioning the essence of technology entails above all accounting for the universality of human values, human nature, and the ethical way of living beyond the efficiency of instrumentality.
Instead of being a mere instrument that turns human communication into a series of objects that interact with one another, he takes the question of technology as central to the universal values of humanity. Instrumentality here coincides with the use of the world that conceals within itself (holds within itself) what we think and how we value this world. Instrumentality is therefore not a paradox of un-concealment but part of it. Cybernetics enframes the world through communication feedbacks and networks; however, it also reveals the limits of instrumentality—or truth of the world as made in the image of rationality: that we have delivered judgment, experience, and wisdom to automated thought beyond good and evil. Technology entails the possibility of transmuting our relationship with the world. The computational processing of randomness in automation points to another possibility for instrumental reason, outside the intricacies of concealment and un-concealment where efficiency is the motor of the ontological presence of the totality of being ready to reveal the truth about humanity. At
the edge of modern metaphysics, instrumentality has carved out a space from teleological programming and from philosophical ethical judgments. Instead of being a mirror of the human condition (whose essence is to grasp the very being of knowing), the modern question of technology points to transcendental instrumentality where technicity is enfolded in the means that do not merely execute commands. Instead, instrumentality becomes a radical condition of transformation that matches neither formal nor causal efficacy. Instead of an accelerating chain of effects, means here become conditions for change, for overturning what Sylvia Wynter has called the recursive epistemology of Western cosmogony.14 Transcendental instrumentality is the moment at which instrumentality becomes generative of the conditions of automated programming to change. Once machines become operative of thinking, they become thoughts-operators and not simply fast executors. For generative thinking to happen there must be functions converging at the edge of compression. Here instrumentality ceases to be about the essence of humanity inscribed in the modern values of progress and self-determination and rather takes the function of compression as the starting point for exposing the infinities that run away from the binaries of truth and facts. Instrumentality here exceeds the fear of mediation taking over reasoning, of machines taking over neurons, moving on with the design of New Brutalism. Transcendental instrumentality corresponds to the entanglement of the incompleteness of process with the sequential processing steps. Means are here no longer vehicles for efficient causality but have become the instruments of emergence. Likewise, instrumentality is the zone of transformation of the condition of and for thought from within the practices, processes, and know-hows. This is not because functions have a soul, in the sense of reflective consciousness.
If the transcendental can become part of instrumentality, it’s because what emerges from computational processing is an alien without a soul, the negative marker slipping outside of the dualistic oppositions (and/or mirroring) between (human) reason and (machine) intelligence. Once transcendental philosophy becomes a method of reasoning in machines—or instrumental philosophy—it loses the power to represent the world and becomes dependent on the practices of knowing and the existence of underworlds. As Gilles Deleuze and Félix Guattari remind us, if we simply react to the dominant determinations of our epoch, we condemn thinking to doxa.15 In particular, if the dominant political rules of lying and bullying are facilitated by a rampant neoheuristic trust in algorithmic search, would a non-reactive critique necessarily place philosophy outside of information technology? Deleuze and Guattari argue that philosophy must directly confront this new dogmatic image of thought formed by cybernetic communication, which fuses the past and the future, memory and hope, in the continuous circle of the present. To think outside the dominance of the present, however, does not
require a return to eternal truth—to the metaphysics of true ideas that must be re-established against false ideas. If cybernetics coincides with the information network of communication exchange subtending the proliferation of opinions and generating consensus,16 philosophy must instead make an effort to create critical concepts that evacuate the presence of the present from the future image of thinking. But how do we do this? Deleuze and Guattari’s critique of communication society is a critique of computer science, marketing, design, and advertising.17 Communication is here understood as an extension of doxa, a model of the recognition of truth, which endlessly reiterates what everyone knows, what the survey says, what the majority believes. For Deleuze and Guattari, communication has impoverished philosophy and has insinuated itself into the micro-movements of thinking by turning time into a chronological sequence of possibilities, a linear management of time relying on what has already been imagined, known, or lived. In opposition to this, Deleuze and Guattari argue that the untimely must act on the present to give space to another time to come: a thought of the future. In ‘On the New Philosophers and a More General Problem,’ Deleuze laments the surrender of philosophical thought to media where writing and thinking are transformed into a commercial event, an exhibition, and a promotion.18 He insists that philosophy must instead be preoccupied with the formation of problems and the creation of concepts. It’s the untimely of thought and the non-philosophy of philosophy that will enable the creation of a truly critical concept. But how do we overturn the presumed self-erasure of critical concepts that stand outside the techno-scientific regime of communication? Can a truly critical concept survive the indifferent new brutality of our post-truth world? 
Doesn’t the mistrust of techno-science prevent philosophy from becoming a conceptual enactment of a world to come? Why does philosophy continue to ignore thinking machines that create alien concepts? Although Heidegger and Deleuze are worlds apart, there is a thread in both philosophers that defends philosophy from techno-science, against the poverty of thought in the age of automated thinking. If the Heideggerian un-concealing of truth ultimately contemplates an unreachable state delimited by the awareness of finitude (of Western metaphysics), Deleuze’s vision of the unthought of philosophy tends toward a creative unfolding of potentialities, the construction of conceptual personae that resist and counter-actualise the doxa of the present. And yet a strong mistrust of techno-science prevails here; in particular, forms of instrumental reason embedded in computational communication continue to be identified with control as governance. Similarly, the conception of the instrument of governance—that is, information technology—is left as a black box that has no aims (a mindless, non-conscious automaton) unless these aims are politically orchestrated. While it’s not possible to disentangle the political condition of truth from the

Transcendental Instrumentality and Incomputable Thinking

27

computational processing of data retrieval and transmission, it also seems self-limiting not to attempt to account for a mode of thinking generated by the instruments or means of thought. If, with cybernetics and computation, the instrument of calculation has become a learning machine that has internally challenged techno-science—the logic of deductive truth and inductive facts—it is also because this form of instrumentality has its own reasoning, whereby heuristic testing has shifted toward hypothesis generation. One could argue that by learning to learn how to think, this instrumental form of reasoning transcends the ontic condition in which it was inscribed by the modern project of philosophy. From this standpoint, instrumentality may re-open the question of the relation between doing and thinking, and revise the core impasse of the critique of technologies. But what does it mean that machines can think? Hasn’t the critique of technology, from Heidegger to Deleuze, and even François Laruelle,19 argued that the immanence of thought passes through non-reflective and non-decisional machines? And if so, is post-truth politics indeed just the most apparent consequence of how irrational thinking pervades the most rational of systems? And yet there is the possibility of addressing the question of inhuman thinking otherwise from within the logic of machines as a marker of a machine epistemology whereby complex levels of mediation, and not an immediacy between doing and thinking, are at stake. Instead of a top-down programming of functions, adaptive algorithms in neural networks process data at increasingly faster speeds because they retrieve and transmit data without performing deductive logical inferences. 
However, according to Katherine Hayles, algorithmic intelligence, more than being mindless, should rather be understood as a non-conscious form of cognition, solving complex problems without using formal languages or deductive inference; by using low levels of neural organisation and iterative and recursive patterns of preservation, these algorithms are inductive learners—that is, they develop complex behaviours by retrieving information from particular data aggregates.20 However, Hayles also points out that emergence, complexity, adaptation, and the phenomenal experience of cognition do not simply coincide with the material processes or functions of these elements of cognition.21 Even if algorithms perform non-conscious intelligence, it does not mean that they act mindlessly. Their networked and evolutionary learning cannot simply be understood in terms of their material functions, or to put it another way, according to their executive functions. Instead, functions can also have imaginative modalities of pushing the programme to learn from what it does not know. In contrast to Lorraine Daston’s argument that algorithmic procedures are mindless sets of instructions that have replaced logos with ratio,22 Hayles’s argument about non-conscious cognition suggests that algorithmic procedures are transformed in the interaction between data
and algorithms, data and meta-data, and algorithms and other algorithms that define machine learning as a time-based medium in which information vectors constantly converge and diverge. Since, according to Hayles, machine learning is already a manifestation of low-level activities of non-conscious cognition performed at imperceptible speeds, it’s not possible to argue that cognition is temporally coherent, linking the past to the present or causes to effects. Information cannot simply be edited to match expectations. Instead, the non-conscious cognition of intelligent machines exposes temporal lapses that are not immediately accessible to conscious human cognition. This is an emergentist view of non-conscious cognition that challenges the centrality of human sapience in favour of co-evolutionary cognitive structures where algorithms establish new patterns of meaning by aggregating, matching, and selecting data. If the inductive model of trial and error allows computational machines to make faster connections, it also implies that algorithms learn to recognise patterns and repeat them without having to pass through the entire chain of cause and effect and without having to know their content. As algorithms are trained on increasingly large data sets, their capacity to search no longer remains limited to known probabilities. Instead, they increasingly experiment with modes of interpretation that Hayles calls ‘technogenesis,’ pointing towards an instrumental transformation of ‘how we may think.’23 In the last twenty years, this instrumental transformation has also concerned how algorithms may think among themselves. Since 2006, with deep learning algorithms, a new focus on how to compute unknown data has become central to the infrastructural evolution of artificial neural networks. 
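The inductive learning at issue here, in which patterns are recognised from examples without deductive inference or knowledge of content, can be sketched very schematically. The following toy nearest-neighbour classifier is given no rule at all; it generalises to unseen points purely by proximity to prior examples. The data, labels, and function names are invented for illustration and are not drawn from any system discussed in this chapter.

```python
# A toy inductive learner: no rule is programmed; the classifier
# generalises purely from prior examples (all data invented).
from collections import Counter

def nearest_neighbour(train, k, point):
    """Label a new point by majority vote among its k closest examples."""
    by_distance = sorted(
        train,
        key=lambda ex: sum((a - b) ** 2 for a, b in zip(ex[0], point)),
    )
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Two clusters of examples; the program is never told what "A" or "B" mean.
examples = [
    ((1.0, 1.2), "A"), ((0.8, 1.0), "A"), ((1.1, 0.9), "A"),
    ((4.0, 4.1), "B"), ((3.9, 4.3), "B"), ((4.2, 3.8), "B"),
]

print(nearest_neighbour(examples, 3, (1.0, 1.1)))  # → A
print(nearest_neighbour(examples, 3, (4.0, 4.0)))  # → B
```

The classifier repeats patterns it has absorbed without ever representing the ‘cause’ of the clusters, which is the minimal sense of inductive learning at stake in the argument above.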
Instead of measuring the speed of data and assigning it meaning according to how frequently data is transmitted, deep-learning algorithms rather retrieve the properties of a song, an image, or a voice to predict the content, the meaning, and the context-specific activities of data. Here, algorithms do not just learn from data, but also from other algorithms, establishing a sort of meta-learning from the hidden layers of the network, shortening the distance from nodal points while carrying out a granular analysis of data content. This focus on content-specific data is radically different from the 1940s conception of information in communication systems. For Claude Shannon, for instance, the content of data was to be reduced to its enumerative function, and information had to be devoid of context, meaning, or particularities.24 With deep learning, big data, and data mining, algorithms instead measure the smallest variations in content- and context-specific data as they are folded into the use of digital devices (from satellites to CCTV cameras, from mobile phones to the use of apps and browsing). Indeed, what makes machine learning a new form of reasoning is not only faster and larger aggregation of data, but also a new modality of quantification, or a kind of qualitative quantification based on evolving variations of data. This is already a transcendental quality
of computational instrumentality, exposing a gap between what machines do and how they think. In other words, deep-learning algorithms do not just learn from use but learn to learn about content- and context-specific data (extracting content usage across class, gender, race, geographical location, emotional responses, social activities, sexual preferences, music trends, and so forth). The consequence of this learning seems to imply more than the unmediated expression of immanent thought, or the reassuring explosion of the irrational within rational systems. Machine learning rather involves augmented levels of mediation where uncertainty is manifested in terms of the incomputable forms of algorithmic automation, as that which does not simply break the calculation, quantification, and the numerical ordering of infinities. Instead, incomputables enter the complex sizing of mediation, involving the structuring on behalf of the algorithmic patterning of indeterminacies. This implies that machine learning should not solely be considered in terms of what algorithms do as a biased model of the reproduction of data-usage, data-context, and data-meaning. At the same time, however, one should resist the temptation to consider algorithms as mere placeholders that allow the manifestation of the non-conscious or irrational potentialities of thinking. Instead, I want to suggest that the general principle of learning in machines should be critically addressed in terms of a nascent transcendental instrumentality: what machines do does not coincide with the possibilities of machine thinking. Thinking transcends mere pragmatics; more importantly, pragmatic reasoning aspires to build thought by conceding that future actions can transform the conditions of knowing-how. This is a retro-deductive transcendentality: it is only through mediatic processing, the technical know-hows of machine processing, that the condition of instrumentality can be met. 
Instrumentality here doesn’t stand for the execution of a programme, but its transformation. Indeterminacy is not outside reason but is rather disclosed in the procedures of reasoning, unleashing alien possibilities that a general practice of reasoning can offer from within mediation where techno-social changes occur.

CRITICAL THEORY OF AUTOMATED THINKING

If this aspirational critique of machine learning asks for a change in perspective on the possibilities for a critical theory of automated thinking, one cannot overlook the fact that algorithmic control and governance do involve the micro-targeting of difference through the construction of given facts aimed at reinforcing existing beliefs. At the same time, the evolutionary dynamics of learning machines show that the time of computation, including the hidden layers of a growing network, also forces algorithms to structure randomness
beyond the already known. For instance, if a machine is fed data that belongs to already-known categories, classes, and forms, when the computational process starts, these data become included in the algorithmic search for associations that bring together smaller parts of data, adding hidden levels of temporalities to the overall calculation. This results in the algorithmic possibilities of learning beyond what is input in the system. From this standpoint, one can no longer argue that computational control only results in the reproduction of the discursive structure of power that data uphold. In other words, whether it’s suggested that algorithmic and data architectures are another form of ideological design (imbued with human decisions) or that machines are ultimately mindless and can thus act empirically (simply as data checkers), what is missing is a speculative critique of the view of machine learning that sees machines as vessels of knowledge that can at best perform Western metaphysical binaries, deductive truths, and inductive fact-checking at a faster pace. The logic of abduction, introduced by Charles Sanders Peirce at the beginning of the twentieth century, may have the potential to create semiotic chains (from non-signifying signs to meanings) driven by hypotheses that propose the best explanation for unknown situations. This could be a starting point for non-inferential practices, where materiality and truth are not the same, but partake in a larger continuum of modes of reasoning (abduction–induction–deduction).25 In particular, the non-inferential use of technology seems crucial for reassessing the truth of our current situation, perhaps affording possibilities of decision and the collective determination of truths. A critical theory of automation should therefore begin with an effort to overturn the auto-poietic dyad of instrumental reasoning where machines either execute a priori reasoning or reduce reason (and truth) to the brutality of data-driven reactive responses. 
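Peircean abduction, understood as inference to the best explanation, can be rendered in a deliberately minimal sketch. The hypotheses and their explanatory weights below are invented; the point is only the logical shape of abductive selection, as distinct from deduction from rules or induction from frequencies.

```python
# A minimal rendering of abduction as inference to the best explanation.
# Hypotheses and explanatory weights are invented for illustration.

def abduce(observation, hypotheses):
    """Return the hypothesis under which the observation is least surprising."""
    return max(hypotheses, key=lambda h: h["explains"].get(observation, 0.0))

hypotheses = [
    {"name": "it rained",     "explains": {"wet lawn": 0.9, "wet road": 0.9}},
    {"name": "sprinkler ran", "explains": {"wet lawn": 0.8, "wet road": 0.1}},
]

# The surprising fact "wet road" is best explained by rain, not the sprinkler.
print(abduce("wet road", hypotheses)["name"])  # → it rained
```

Unlike deduction, the conclusion is not guaranteed; it is a hypothesis to be tested further, which is what places abduction at the generative end of the abduction–induction–deduction continuum.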
Such a critique should reject the view that techno-science completes Western philosophy’s dream of reason. It should also account for the overturning of philosophical reason activated through mediatic experiments. But how might we address the correlations between the end of truth and the transformation of cybernetic binary states into forms of non-conscious cognition and meta-digital learning processing where algorithms learn how to learn and where indeterminacy is included in reasoning? Is it enough to blame the mindless techno-scientific quantification of biased beliefs and desires, or can we engage in a materialist theorisation of technicity beginning with a close engagement with the means by which thinking thinks? One could argue that since World War II, the algorithmic means of thinking should also be regarded as a mode of reasoning. It’s true that most machine-learning algorithms, such as Netflix algorithms, actually focus on the specific use of data through a heuristic analysis of data correlations, statistically matching and thus predicting your data categories of preference, according to what you already may know. However, deep-learning algorithms—as a means of
thinking how to think—involve not only the predictive analysis of content and the micro-targeting of data use, but also define a tendency in artificial intelligence to abstract modalities of learning about infinite varieties of contextual content. These infinite varieties are not only derived from the algorithmic recording of the human use of data according to frequencies, contexts, and content, but also include the meta-elaboration of how algorithms have learned about these usages. For instance, unlike recommendation algorithms, the RankBrain interpreter algorithms that support Google Ranking are not limited to making suggestions. Rather, they activate a meta-relational level of inference; in other words, the algorithm seeks an explanation for unknown signs in order to derive information through the hypothetical conjectures of data involving algorithmic searches of indeterminate words, events, or things for which one may not have the exact search terms. As opposed to the heuristic analysis of data correlations among distinct sets, these interpreter algorithms do not just prove, verify, or validate hypotheses, but must, first of all, elaborate hypothetical reasoning based on what other algorithms have already searched, in order to determine the possible meaning of the missing information in the query. These deep-learning algorithms work by searching for elements of surprise—that is, unthought information—which can only occur if the system is apt to preserve, rather than eliminate as errors, micro-levels of randomness that manifest across volumes of data, and the entropic noise of increasing data volumes. This noise is precisely part of the learning process. Experimental hypothesis-making must preserve indeterminacy so that it can bind information to surprise. 
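A schematic illustration of this kind of query interpretation (emphatically not Google’s actual implementation; the synonym table and crude stemmer below are invented) shows how an indeterminate search term can be resolved into a hypothesis about its meaning via stemming and synonym lists:

```python
# A schematic query interpreter (not Google's implementation): unknown
# terms are normalised by a crude stemmer and mapped through a synonym
# list, yielding a hypothesis about what the user meant. Data invented.

SYNONYMS = {"film": "movie", "picture": "movie", "automobile": "car"}

def stem(word):
    """Strip a few common suffixes; a stand-in for a real stemming list."""
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def interpret(query):
    """Map each term to a canonical form: stem first, then synonyms."""
    return [SYNONYMS.get(stem(t), stem(t)) for t in query.lower().split()]

print(interpret("Films showing nearby"))  # → ['movie', 'show', 'nearby']
```

Even in this toy form, the interpreter does not match stored strings; it conjectures a canonical meaning for terms it has never seen in exactly that shape.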
While one can assume that this inclusion of indeterminacy (or irrationality) in the computational process is yet another manifestation of the ultimate techno-mastery of reality, it’s important to reiterate that randomness is at the core of algorithmic mediation and as such opens up the question of epistemological mastery to the centrality of contingency within the functioning of any rational system. This results in the system’s hyperrational (or sur-rational, to use Bachelard’s term)26 articulation of the real, the unknown, the incomputable, in terms of technical mediations, automated actualisations, and machine becomings of the real in their manifest artificial forms. One could argue that this emphasis on the indeterminacy of technical objects is not dissimilar to Gilbert Simondon’s incisive arguments about how technical aggregates are functional not because they can be optimised but because they can be changed in their very orchestration, where technical operations rub up against the indeterminacies of the orchestrator, the socio-collective use especially.27 While Simondon explained that the standardisation of technical objects is incomplete in the face of the margins of indetermination that are embedded in the design and use of technical objects, indeterminacy in computation concerns the internal mediation that the rules and procedures execute in the
compression of randomness. Instead of establishing general results from particular conceptual associations derived from the frequent use of particular content by humans and machines, the inclusion of indeterminacy in machine learning concerns the parallelism of temporalities of learning and processing, which involves an elaboration of data that sidesteps the primary level of feedback response based on an already known result. We know that RankBrain algorithms are also called ‘signals,’ because they give PageRank algorithms clues about content: they search for words on a page, links to pages, the locations of users, their browsing history, or check the domain registration, the duplication of content, and so forth. These ‘signals’ were developed to support the core PageRank algorithm so that it can index new information content.28 By indexing information, RankBrain aims to interpret user searches by inferring the content of words, phrases, and sentences through the application of synonyms or stemming lists. Here, the channelling of algorithmic searches toward already planned results overlaps with an algorithmic hypothesis that is exposed to the indeterminacy of outputs and the random quantities of information held in the hidden layers of neural networks. For instance, indexing involves information attached to long-tail queries that is used to add more context-specificity to the search content. Instead of matching concepts, RankBrain algorithms rely on the indeterminacy of results. In the age of post-truth politics, indeterminacy within machine learning defines not an external contingency disturbing an otherwise stable governance of information. 
Instead, the correlation of the ‘new brutality’ of fake and alternative news with the contemporary form of automation involves a granular structuring of unknowns, pushing automated cognition beyond knowledge-based systems.29 Indeterminacy is therefore intrinsic to the algorithmic generation of hypotheses, and as such the techno-scientific articulation of truths and facts can no longer be confined to recurring functions and executions of the already known. The correlation between post-truth politics and automated cognition therefore needs to be further explored, contested, and reinvented. Because instruments are already doing politics, a question to ask is how to reorient the brutality of instrumentality away from the senseless stirring of beliefs and desires, and toward a dynamic of reasoning that affords the contingent re-articulation—rather than elimination—of aims. If the antagonism between automation and philosophy is predicated on the instrumental use of thinking, techno-philosophy should instead express not an opposition between philosophy and technology but the emergence of parallel philosophies of machines contributing to alien worlds, truths, and facts that have always existed outside the metaphysics of modern philosophy. As the architecture of New Brutalism proposes, it’s possible for instrumentality to transform the conditions of possibility for socio-technical assemblages that
challenge the opposition between philosophy and automation at the core of modern metaphysics. The a-formal dimension of New Brutalism rather takes know-hows as productive of scenarios for collective thinking that has no pretence to be valued within the ‘cosmogony of Man,’ the recursive epistemology of the human subject equipped with formal reasoning. New Brutalism shows the unintended possibilities that machine thinking brings forth in setting up the a-formal dimensions of truth and fact as these become dependent on the assembly complexities of information processing. What hyperindustrial colonialisms hide beyond what appears to be machines’ cold brutality of value extraction and automated decisionism is rather the over-representation of modern epistemology where the self-determination of the human and of ‘Man’s’ reason demarcate the rules of knowledge, being, and thought. What instrumentality can do for us instead is to open a path away from this cosmogony: the sheer processing of rules is already generating rules of another kind, a brutalism of the excluded from the system of value and yet contiguous with the incomputable infinities that keep on running through and beyond the circuit of truth.

NOTES

1. Reyner Banham, The New Brutalism: Ethic or Aesthetic? (London: Architectural Press, 1966).
2. See Rosi Braidotti, Timotheus Vermeulen, Julieta Aranda, Brian Kuan Wood, Stephen Squibb, and Anton Vidokle, ‘Editorial: The New Brutality,’ e-flux journal 83 (June 2017), https://www.e-flux.com/journal/83/142721/editorial-the-new-brutality/.
3. Michael Truscello, Infrastructural Brutalism: Art and the Necropolitics of Infrastructure (Cambridge, MA: MIT Press, 2020), 4.
4. Torbjørn Tumyr Nilsen, ‘Thoughts on the Planetary: An Interview with Achille Mbembe,’ New Frame (2018), https://www.newframe.com/thoughts-on-the-planetary-an-interview-with-achille-mbembe/.
5. See also Natasha Lushetich’s chapter in this volume, ‘The Given and the Made: Thinking Transversal Plasticity with Duchamp, Brecht and Troika’s Artistic Technologies.’
6. See also Emiddio Vasquez’s chapter in this volume on the histories and varieties of computation: ‘Computation and Material Transformations in Media.’
7. Martin Heidegger, ‘The End of Philosophy and the Task of Thinking,’ translated by J. Stambaugh, in Martin Heidegger, Basic Writings, edited by David Farrell Krell (London: Routledge, 1993), 373–92.
8. Ibid., 376.
9. Ibid.
10. Ibid.
11. Ibid., 392.
12. Ibid.
13. See also Andrej Radman’s chapter in this volume on transduction as another logical alternative: ‘Allagmatics of Architecture: From Generic Structures to Genetic Operations (and Back).’
14. Sylvia Wynter, ‘Human Being as Noun? Or Being Human as Praxis? Towards the Autopoetic Turn/Overturn: A Manifesto,’ (2020), https://www.scribd.com/document/329082323/Human-Being-as-Noun-Or-Being-Human-as-Praxis-Towards-the-Autopoietic-Turn-Overturn-A-Manifesto#from_embed.
15. Gilles Deleuze and Félix Guattari, What Is Philosophy?, translated by Hugh Tomlinson and Graham Burchell (New York: Columbia University Press, 1994), 99.
16. Ibid.
17. Ibid., 10.
18. Gilles Deleuze, ‘On the New Philosophers and a More General Problem,’ Discourse: Journal for Theoretical Studies in Media and Culture 20, no. 3 (1998), https://digitalcommons.wayne.edu/discourse/vol20/iss3/7.
19. See for instance Gilles Deleuze, ‘Thought and Cinema,’ in Cinema II: The Time-Image, chapter 7 (Oxford: Athlone Press, 2000); and François Laruelle, ‘The Transcendental Computer: A Non-Philosophical Utopia,’ translated by Taylor Adkins and Chris Eby, Speculative Heresy (August 26, 2013).
20. N. Katherine Hayles, ‘Cognition Everywhere: The Rise of the Cognitive Nonconscious and the Costs of Consciousness,’ New Literary History 45, no. 2 (2014): 199–220.
21. Ibid.
22. Lorraine Daston, ‘The Rule of Rules,’ lecture, Wissenschaftskolleg Berlin, November 21, 2010.
23. N. Katherine Hayles, How We Think: Digital Media and Contemporary Technogenesis (Chicago: University of Chicago Press, 2012).
24. Claude Shannon and Warren Weaver, The Mathematical Theory of Communication (Urbana: University of Illinois Press, 1949).
25. Charles Sanders Peirce, ‘Abduction and Induction,’ in Philosophical Writings of Peirce, edited by J. Buchler (New York: Dover, 1955), 150–56.
26. Gaston Bachelard, The New Scientific Spirit, translated by Arthur Goldhammer (Boston: Beacon Press, 1984).
27. Gilbert Simondon, On the Mode of Existence of Technical Objects, translated by Ninian Mellamphy (Ontario: University of Western Ontario, 1980).
28. Renée Ridgway, ‘From Page Rank to Rank Brain,’ (2017), https://eprints.lancs.ac.uk/id/eprint/124586/1/machine_research.pdf.
29. See Braidotti et al., ‘The New Brutality.’

Chapter 3

Digital Ontology and Contingency

Aden Evens

The ontology of the digital often gets pushed to the margins of digital studies, treated obliquely, if at all, within more capacious analyses of the digital as cultural phenomenon. A theorist exposing the gender imbalance among professional computer programmers might wonder, sceptically, whether there is something inherently masculine about the logic of command and control associated with digital processes. A sociologist who studies the displacement of human labour by digital robots might examine, as one inquiry among others, the essential limits of possibility for programmable machines. There are likely multiple reasons for this reluctance to examine the digital per se. Increasing disciplinary specialisation produces only a few scholars whose expertise extends over both metaphysics and the details of digital operation. The ‘culture wars’ of the late 1980s, culminating in the Sokal Affair in 1996,1 discouraged many humanist researchers from engaging in technical and scientific discourses, fearing accusations of dilettantism from trained scientists and engineers, and obscurantism from their own colleagues. And some scholars, wary of technological determinism,2 may feel apprehensive about considering the digital independently of historical and cultural context. Until recently the extended examination of the digital in itself had therefore yielded only scattered examples following Hubert Dreyfus’s landmark study from the 1970s, What Computers Can’t Do.3 But with the culture wars a waning influence, the concerted examination of digital ontology over the last decade has coalesced into a sub-field, sometimes called digital philosophy, which is concerned with the nature of algorithms, computation, and the defining principles and possibilities of digital technologies. 
Much work in this nascent subfield could be labelled digital apologia; the popular image of computers as plodding, decidedly inhuman calculating machines—not the only popular image of computers—must be countered by
a more nuanced and better-informed understanding of the complexities of computation. Armed with that greater understanding, digital philosophy aims to look past the stereotype of the robot with the monotone machine voice, to reveal a rich and surprising range of digitally generated behaviour. Luciana Parisi, for instance, proposes that powerful digital processing techniques exhibit new varieties of thought, extending a traditional conception of thinking that has been until now arbitrarily restricted to certain human activities.4 Attacking the problem still more directly, Beatrice Fazi leverages foundational theories about the limits of computation to locate a vital contingency in any (digital) computational process, a contingency or openness that belies the purportedly deterministic and staid inevitability of algorithmic production.5 Fazi thus offers a blunt rebuke to those who regard digital machines as uncreative calculators, not by denying that computers are indeed calculators but rather by discovering a generativity in the act of calculation itself. The present chapter pushes back against this growing body of digital apology, confirming and defending the still widely shared intuition that, for all their shiny newness, there remains something sterile or hollow about digital technologies, a superficiality often felt but rarely examined in digital interaction. Specifically, this chapter motivates the claim that the digital, as a way of seeing the world and as the core mechanism of astoundingly widespread technologies, quashes contingency and erodes the conditions for novelty, creativity, and surprise,6 abetted by a social ideology that naturalises those consequences and that the digital also perpetuates. 
The technical basis of this claim, deriving from the formal properties of bits, is not detailed herein; instead, this chapter stages a contrast between digital ontology and the ontology of the actual, illustrating each through comparative examples of ‘objects in the world’ including sound, fruit, and games. It proposes that, notwithstanding the computational contingency revealed by Fazi’s argument, the digital realm is largely devoid of the kind of radical spontaneity that underpins the generativity of the actual.

THE PHYSICS OF IMPRECISION

The physical determinism of classical Newtonian mechanics has long seemed unassailable: if the energy and matter that constitute our entire universe follow strict rules that govern every aspect of their movement and other changes, then at each instant, the state of the entire universe is a necessary outcome of the previous state (combined with those rules). This deterministic necessity—which promises a kind of perfect order that is a dream come true for those who like perfect order—troubles notions like free will, and leaves no room, it would seem, for contingency. Quantum physicists Flavio Del Santo
and Nicolas Gisin, likely inspired by their research into the (purportedly) less deterministic quantum world, hypothesise that classical determinism may be based on an unproven but almost universally held assumption, the assumption that the properties of the objects of classical physics, particles and such, have in reality infinitely precise values.7 Of course, our measurements of those values—the mass of a molecule, for instance—can only be so refined, achieving a certain degree of precision but offering no further information beyond that degree of precision. This limitation has typically been understood as a shortcoming of our measurement tools and procedures, which can never match the infinite precision of the true value being measured.8 To discard this assumption introduces, or at least makes room for, an indeterminacy that would apply to everything. What if, propose Del Santo and Gisin, real objects have properties with values that are only finitely determinate, and do not have more precise values? What if beyond a certain degree of precision values just aren’t determined in reality, so that reality itself has a limited precision? According to this proposed new paradigm, measuring a given property will necessarily determine it up to the degree of precision of the measurement. Which means that this hypothesis of a world of indeterminate values is just as plausible, just as consistent with the evidence, as the long-assumed determinacy of the world, because both of these metaphysical hypotheses yield exactly the same experimental results. This proposal about the partial indeterminacy of the whole universe makes an interesting complement to chaos theory, wherein the core premise is that some processes are extremely sensitive to tiny variations in initial conditions.9 That is, extremely small differences in a value—chaos theory was literally born out of a small rounding error—can make big differences in the eventual dynamics of a physical system. 
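The sensitivity described here can be made concrete with the logistic map, a standard textbook example of chaotic dynamics: two initial values differing only in the twelfth decimal place yield, after a few dozen iterations, trajectories that bear no resemblance to one another.

```python
# Sensitive dependence on initial conditions: the logistic map
# x -> r * x * (1 - x) in its chaotic regime (r = 4.0).

def trajectory(x, r=4.0, steps=50):
    """Iterate the logistic map from x and return the final value."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a = trajectory(0.2)
b = trajectory(0.2 + 1e-12)  # differs only in the twelfth decimal place
print(abs(a - b))  # the minuscule initial difference has exploded
```

For almost any starting value the divergence is generic, not a property of the particular seed: rounding the initial condition at any finite precision changes which of many wildly different futures unfolds, which is the point of contact with Del Santo and Gisin’s hypothesis.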
So if those tiny differences of value lack precise values and are determined only to a limited precision, this means, in effect, that the universe gets to make an unconstrained choice, to pick a direction among wildly divergent possibilities. In other words, the theory of finite precision introduces into the universe (and the physics that describes it) a pervasive, radical contingency. Whereas the digital exhibits a positivist topology, usually understood as a network of individuated points or nodes which are secondarily tied together by lines or vectors, the contingent actual is a mesh of the finest threads, condensed here and there into knots and whorls that demonstrate a partial stability as a complicated collection of relations. The mesh is this haze of relations, the many ways that things encounter each other or communicate across their distances. Threads in this mesh are heterogeneous, not so much elements as events, acts of absorption and cleavage, invention and extension. Amid this anti-positivism, complexity is the only rule. That is what the network cannot simulate, however numerous, however plentiful its nodes and relations; a

38

Chapter 3

network, even at its densest and most dynamic points, still prioritises individual nodes and their (secondary) relations, but no number of individuals ever achieves the complex heterogeneity of the mesh. This shortfall imposes a tragic limit on digital technologies.

SOUND AND FRUIT

David Dunn is an environmental sound artist and acoustic ecologist who takes the somewhat unusual stance of choosing to interact with the environment that also serves as his recorded object. His art-based research underpins his attempt 'to recontextualize the perception of sound as it pertains to a necessary epistemological shift in the human relationship to our physical environment.'10 Like many sound ecologists before him, most notably John Cage, Dunn draws on the Buddhist concepts of emptiness and unimpeded interpenetration.11 He writes:

In Buddhism, the concept of Sunya (a Sanskrit word translated as 'emptiness') describes the complex chain of connection that forms the world. Each 'thing' is so densely connected to everything else that it resides nowhere. We cannot isolate the thing from all the states of matter or energy that preceded it or to which it will become.12

Dunn affirms contingency, the condition in which everything touches and is imbricated in everything else, and recognises that strict identity becomes unsustainable under that condition. Any audio recording includes all of the (spatial, temporal, material, and immaterial) circumstances of its production, everything that touches the scene.13 He emphasises that sound exposes the imbrication of each thing in each other; vision, by contrast, separates them into distinct entities with relations only after the fact:

When we look at the world, our sense of vision emphasizes the distinct boundaries between phenomena. The forward focus of vision concentrates on the edges of things or on the details of color as they help us to define separate contours in space. . . . The sounds that things make are often not so distinct and, in fact, the experience of listening is often one of perceiving the inseparability of phenomena. Think about the sound of ocean surf or the sound of wind in trees. While we often see something as distinct in its environment, we hear how it relates to other things.14

Fruit offers a fine example, helping to reveal a world that admits both the destabilisation of contingency and the multi-scalar, dynamic order that constitutes the world’s things through their relations, giving things a sense coeval

Digital Ontology and Contingency

39

with their genesis. All the parts of a lemon, for example, respond to the same interwoven set of forces, such that each part—the pulp, the peel, the seeds, the shape, the colour, the flavour, the chemical constituents, its economic situation, its role in cooking, its cultural resonances, its aesthetic possibilities—all of these aspects of the lemon are tied together by its species history, natural history, individual history, the social history of its use by humans, in short, the reasons that make it what it is, determining in a wide-ranging but coordinated evolution its form, content, appearance, and its many behaviours. To answer the question of why a lemon is sour—to make sense of its sourness—is to choose one or more threads from within that contingent but coordinated history, that set of possible reasons or factors that provide the contextually appropriate response. It’s sour because it developed in a certain climate, determined by geography, with certain available nutrients, plant forms, and relations with fauna, or it’s sour because it has a relatively high concentration of an acid with a molecular structure that encounters responsive structures on the human tongue, or it’s sour because citrus fruits tend to be sour. These multiple reasons, a list that can continue to expand indefinitely, manifest the over-determination of reason, a principle of abundant reason, which is how contingency intervenes in this case. With many different reasons that contribute differentially to any particular thing or event, there is always more than one correct explanation, such that everything is multiply determined. No rule decides in advance which reason will be relevant in a given context, and one never knows where a thread will lead, what it might pass through as it marks the relations that determine the lemon in all its sourness. 
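A crude sketch of the digital alternative, a lemon built as a bundle of independently assigned attributes, makes the contrast with this co-determined history concrete. The structure and names below are purely hypothetical, for illustration only, not any real engine's API:

```python
# A toy "digital lemon": each attribute is a separate field, assigned
# independently, with no internal connection binding one to the others.
# (Hypothetical structure for illustration; not any real engine's code.)
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class DigitalLemon:
    shape: str = "ellipsoid"
    colour: str = "yellow"
    flavour: str = "sour"
    acidity_ph: float = 2.3

lemon = DigitalLemon()

# A trivial alteration of a single variable changes the colour...
purple_lemon = replace(lemon, colour="purple")

# ...without thereby affecting any of its other qualities.
assert purple_lemon.flavour == lemon.flavour
assert purple_lemon.acidity_ph == lemon.acidity_ph
print(purple_lemon.colour)  # purple
```

Nothing in this data structure ties sourness to climate, chemistry, or history; any coherence among the fields must be imposed from outside.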
An actual lemon, sitting on one’s countertop, arises as a confluence of a multitude of entangled forces from the microscopic (chemical reactions) to the cosmic (the material composition of the earth). These forces enter into relations with each other, and these contingent encounters of forces—agonistic, sympathetic, allergic, punctual, resonant, parallel, interwoven, delimiting, to name but a few—work themselves out as that specific lemon. The lemon is this problematic encounter of many and diverse threads, it is the partial and passing solution to this billion-body problem. By contrast, the digital operates according to a principle of simplicity, wherein every object and every relation must be algorithmically designated, such that the complexity of output is proportionate to the complexity of input. A digital lemon would have a shape determined by one algorithm, a colour by another, and a set of possible actions (and passions) given by still other algorithms (or data structures), and all of these algorithms and data structures are in principle independent of each other and of the wholeness of the lemon.15 A trivial alteration of a single variable could change the lemon’s colour from yellow to purple, without thereby affecting any of its other qualities. If its parts are coordinated, if the digital lemon demonstrates a coherence, it is only because that coherence has

been deliberately assigned by a programmer or by an artificial intelligence design system.

CYBERPUNK 2077 AND THE RHETORIC OF CONTINGENCY

Consider the much-hyped videogame, Cyberpunk 2077, developed by the Polish design studio CD Projekt Red.16 Cyberpunk 2077 provides an immense and astoundingly rich environment, including narrative elements, graphical intricacy and variety, a large cast of characters, prolific music and audio cues, and a great deal of attention to detail. Each neighbourhood of Night City, the game's setting, has a distinctive architectural style. Every person walking down the sidewalk looks, dresses, and moves differently, and many will engage the player-character in brief conversation, revealing different voices and mannerisms. Though side quests—tasks inessential to the core plot but often thematically connected to it—fall into categories, those categories are very diverse, and each quest has multiple possible outcomes and its own specific in-game locations. Playing through the core missions unfurls a tangled science fictional plot, involving the cybernetic infection of the player-character's mind and body by another character. The player-character (that is, the player's avatar) is named 'V,' and in the game's early sequences, V is forced to implant a chip into her neurocircuitry that contains the 'engram' of a famous, long-dead rock star, Johnny Silverhand. A bullet to V's brain leaves V alive but damages the chip such that it cannot be removed without imperilling V's life. Thereafter, V and Silverhand effectively share the player-character's body, and the plot is driven by V's attempts to have the defective chip neutralised to restore her physical and mental autonomy and to keep Silverhand's engram from taking over her entire nervous system. As exemplified in its central plot, the rhetoric of the game purports an ontological ambiguity, muddying many familiar distinctions.
Life and death, human and machine, self and other, male and female—these binarities may never have been as clear-cut, as binary, as is sometimes imagined (and juridically enforced), but Cyberpunk 2077 disturbs these distinctions as a recurring motif of the game to generate much of the game’s particular significance. One effect of these desegregations is to depict the future as a time of uncertainty, instability, where things refuse easy categorisation, a future where one makes choices about how to act or what to do, but where those choices do not indicate clear outcomes. This de-differentiation of binary category attempts to introduce a rhetorical complexity often absent in digital games,17 which typically draw razor-sharp lines between good guys and bad guys, and Cyberpunk 2077 distributes this

complexity throughout many dimensions of the game. There are conversations between V and various non-player characters (NPCs) that provide essential information and sometimes even advance the plot by giving the player choices among dialogue options: different responses, choose-able by the player. In some cases, those choices are consequential, shaping V’s relationship to the interlocuting character and thereby opening or closing possibilities of plot development at later stages of the game. For example, in a given conversation, a friendly response might win the loyalty of the interlocutor, who is then available later in the game as an ally in battle, whereas an unsympathetic response in the same circumstance might make an enemy of that NPC. At the end of a mission in which V finds a reproduction of Silverhand’s old Porsche, V and Silverhand negotiate their relationship in an extended conversation within V’s mind. Depending on which lines of dialogue the player chooses for V’s side of that conversation, it is possible to trigger a ‘secret’ ending to the game, an ending that would not have been available had the player chosen different dialogue at that point. Though arguably only an appearance of complexity, this sensitive and subtle relationship between dialogue and plot in Cyberpunk 2077 instigates an unusual characteristic of digital games: by establishing dependencies between specific dialogue choices and events in the narrative future of the game, by concealing this relationship behind unremarkable dialogue rather than overtly signalling its import to the player, and by making the required dialogue choices themselves only subtly distinctive, such that they are not readily recognisable without foreknowledge, the game is made to feel like a deeply tangled meshwork of interconnected things and events. 
The mechanism that connects the dialogue choices to the eventual ‘secret ending’ feels narratively complex and precarious, as against the usual experience of the digital (in games and in other digital environments) as a narrowly causal structure in which each choice leads without ambiguity to a predictable and logically sensible consequence. This game thus represents a world that we might understand as contingent, a world in which even distant events are subtly interdependent and in which complexity overwhelms instrumentality. One cannot simply select the secret ending by pressing the ‘secret ending’ button; instead, one must meet the game at its obscure edges, engage not just instrumentally but affectively, follow intuitions rather than rules. The contrast with much other videogaming, including even other elements of Cyberpunk 2077, could not be sharper: so much gaming involves choosing among options with predictable and straightforward outcomes and in-game meanings. Contrasting mightily with the direct causality and stark distinction of much digital gaming, this construction of contingency points to the meaning of the other complexities represented in Cyberpunk 2077. The most ambitious significance of the dissolved binaries in this game is to challenge,

at the level of rhetoric, one of the core limitations of digital gaming and of digital devices more generally, namely their discrete and rigid categories, deriving from their underlying binary operation. Persistently imposing a haze of indistinction around categories that are typically clear-cut, this game asks players to extend this blurred boundary from the domain of rhetoric to that of mechanics, to believe that, just as life and death or self and other are continuous rather than dichotomous, so can a menu selection engage a fuzzy logic to engender unpredictable and spontaneous consequences. If games—and digital machines generally—institute deterministic causal relationships between user action and machine reaction, Cyberpunk 2077 attempts to persuade its players that things are here more complex, that the world in which the player acts comprises a messy collection of interdependent elements so numerous and wildly imbricated as to defy human calculation. But however compelling a player might find this ludic argument, it remains confined to the game’s rhetoric and cannot transcend the underlying limits of the digital machine, its absolute, deterministic fixity. All the data of the game, all of its states, all of its conditions, all of its NPCs, all of its images and sounds, every line of dialogue, every possible action and result, even the secret ending, everything in the game is encoded in bits, and every bit at any given point in time is either a 0 or a 1. There is no in-between, no ambiguous or indeterminate possible value. Any action by the player, such as the choice of a line of dialogue, must be explicitly connected to its consequence in the code that runs the game, revealing a much starker and more rigid causal model underneath the appearance of a contingent world. 
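Whatever form it takes in the shipped game, the gate on such an ending must ultimately reduce to an explicit comparison in code. The sketch below is purely hypothetical (invented scene and option names, not CD Projekt Red's code), but any implementation must have this general shape:

```python
# Hypothetical sketch of how a "secret ending" gate must ultimately
# work: specific recorded dialogue choices compared against a
# prewritten set. (Illustrative only; not Cyberpunk 2077's actual
# code, data, or identifiers.)

REQUIRED_CHOICES = {                 # the prescribed dialogue options
    "rooftop_talk": "wait_in_silence",
    "porsche_scene": "let_johnny_speak",
}

def secret_ending_unlocked(recorded_choices: dict) -> bool:
    # Strict causality: every required choice must match exactly.
    return all(recorded_choices.get(scene) == option
               for scene, option in REQUIRED_CHOICES.items())

print(secret_ending_unlocked({"rooftop_talk": "wait_in_silence",
                              "porsche_scene": "let_johnny_speak"}))  # True
print(secret_ending_unlocked({"rooftop_talk": "snap_back"}))          # False
```

However obscure the required choices feel in play, the gate itself is a deterministic boolean: the prescribed inputs and only those inputs unlock the ending.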
It is thus only a compelling illusion that Night City is a world of incalculable complexity and interconnectedness; in fact, only the selection of the prescribed three or four lines of dialogue will avail the player of the secret ending, a strict causality dressed up as a subtle contingency. There can be no accidental path to that ending, and there can be no ending that is not prewritten into the game's programmed script.

THEORIES OF DIGITAL CONTINGENCY

For Parisi and Yuk Hui, the principle of simplicity that restricts Cyberpunk 2077, among many other examples, to its prescribed possibilities seems to present no significant limitation, for they treat the digital as an incessant evolution, inextricable from a human and worldly context in which every algorithm, software application, or digital object changes all the time.18 Algorithms develop in response to user interaction, through exposure to unexpected data, under the influence of countless other algorithms to which they connect, and guided by the demands of a vast array of actors, individual,

collective, and inanimate. Viewed as a limitless, complex, cybernetic system of culture and technology, the digital engages constant contingencies and so exhibits remarkable creativity and novel developments. But this evolving contextualised digitality does not locate contingency in the operation of any particular piece of digital software; rather, it discovers contingency in its familiar milieu, the domain of history and materiality, a feedback loop between people and machines that exists on par with the world’s lemons and amid the infinite mesh of threads that weaves together our universe. Taken in isolation as a static piece of software, Cyberpunk 2077 strictly rules out contingency, but a version update or even the next game from CD Projekt Red might generate something new in a negotiation with the contingencies surrounding Cyberpunk’s reception or player feedback. Unsatisfied with a contingency located outside of the confines of the computer processor, which would leave digital devices vulnerable to the charge of un-creativity, Fazi confronts this purportedly absent contingency head on, within individual static algorithms rather than an endlessly changing cultural context. In Contingent Computation: Abstraction, Experience, and Indeterminacy in Computational Aesthetics, she demonstrates that digital algorithms include contingency in themselves, arguing that the ultimate outcome of an algorithm is not determinate—and is therefore contingent—until that algorithm actually terminates. It’s a tricky claim, as Fazi is not saying that an algorithm could have generated some other outcome, as though it proceeds by accident rather than by necessity. Rather, she insists that the outcome, though necessary, is not only not-yet-known but in fact not-yet-determined until the algorithm determines it, a kind of contingency at the heart of a necessity. 
This ‘computational contingency’ would imply that Night City really is a (virtual) space of genuine surprise, open-ended possibility. It’s not just that the player doesn’t know what’s going to happen; it’s that nobody can know what will happen in this game, how it will happen, in spatial, temporal, and interactional terms, for the events of the game are not determined until the software is actually running. But for all of its rhetorical challenge to identity and causality Cyberpunk 2077 makes this contingency difficult to locate in experience: once a player has learned how the game works, learned what machine reaction to expect from different player actions, the game always responds exactly as predicted. Fazi’s proof, in crude summary, grounds its claim of contingency in Alan Turing’s famous demonstration that there can be no general computational procedure for determining whether any given programme or algorithm will eventually terminate. Turing shows that algorithmicity, or the space of all algorithms, includes an irreducible indeterminacy or ‘undecidability’—because there are necessarily some algorithms for which it is unknowable whether they terminate—an indeterminacy only resolved for a given algorithm when it actually executes (and terminates).19 Fazi reasons that

therefore the executing algorithm is performing an act of determination out of an essential indeterminacy, which is the very definition of contingency: the outcome of the algorithm is contingent on its actual execution. This brilliant argument equivocates between the general and the particular, and the potential and the actual, starting from Turing’s claim about the space of algorithms generally—no universal computational procedure can decide for every algorithm whether it terminates—and drawing the conclusion that each particular algorithm includes a kind of contingency. Turing’s proof, however, does not make a claim about every algorithm taken individually but about the space of algorithms generally. And indeed, there are numerous mathematical and practical methods to test whether a given algorithm eventually terminates, even if there is no universal method that can test all algorithms. (For example, for an algorithm that accepts denumerable, orderable inputs, one could attempt an inductive proof of termination, demonstrating through direct observation or just by executing the algorithm that it terminates for the first input value, and then proving further that, assuming the algorithm’s termination for an arbitrary input value, the algorithm can also be shown to terminate for an incrementally larger input value.) Fazi’s conclusion thus has a narrower scope than she intends, implying the possibility of computational contingency in very specific circumstances but not casting the modality of contingency across the entirety of computation. 
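The contrast between the general and the particular can be made concrete. For Euclid's algorithm, a simple well-founded measure proves termination for every input; for the Collatz procedure, no such proof is known. Neither case requires, nor could rely on, a universal halting decider:

```python
# Termination of particular algorithms, despite general undecidability.

# Euclid's algorithm terminates for all positive integers: the second
# argument strictly decreases on each step and is bounded below by
# zero (a well-founded measure), so the loop cannot run forever.
def gcd(a: int, b: int) -> int:
    while b != 0:
        a, b = b, a % b   # 0 <= a % b < b, so b strictly decreases
    return a

# By contrast, whether this loop terminates for *every* positive n is
# the open Collatz problem: no decreasing measure is known for it.
def collatz_steps(n: int) -> int:
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

print(gcd(1071, 462))     # 21
print(collatz_steps(27))  # 111
```

Turing's theorem rules out a single decider covering both cases at once; it does not prevent the first termination proof, nor promise that the second will ever be found.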
In affinity with Fazi’s work, Parisi complements her expansive contextualised understanding of the digital with a more focused claim about the ‘internal dynamic [of] computation.’20 Her argument is based largely on the ideas of mathematician Gregory Chaitin and his construction of an irrational number labelled Ω (Omega), defined as the ratio of the number of algorithms that terminate to the total number of all possible algorithms.21 Ω therefore represents the probability that an arbitrarily selected algorithm will terminate, rather than executing forever; it is for Chaitin a way of amplifying the consequences of Turing’s undecidability proof, showing that undecidability is a more prominent dimension of computation (and mathematics) than had formerly been acknowledged. Parisi proposes that the demonstrable properties of Ω, such as the maximal randomness of its digits, fall back on the operation of algorithms in general, such that ‘computation itself has an internal margin of incomputability insofar as rules are always accompanied and infected by randomness.’22 This provocative claim would certainly disrupt the image of digital machines as non-contingent, mechanistic calculators, because randomness is neither mechanistic nor calculable. A rule ‘infected by randomness’ is no rule at all, so that a computational process comprising such random rules might generate any result, the apotheosis of contingency. But Parisi’s technique of imputing properties of Ω to computation in general finds no license in her analysis nor in the writing of Chaitin, whom she cites

frequently. How does the randomness of the digits of Ω somehow 'infect' the rules of computation as they are executing in our digital machines? In what sense does this infection of randomness infuse ordinary computation with a dynamism that undoes its presumptive mechanism? Where or by what means does the incomputability of the overwhelming majority of real numbers bear on the actual operation of computers as they successfully perform computations? Parisi's claims are exciting but their vagueness leaves too many unanswered questions and robs her conclusions of their analytic compulsion.

IDEOLOGY OF THE DIGITAL

If Cyberpunk 2077—where the digital simulates contingency but does not finally include (much of) it—exemplifies a universal principle, how has the digital nevertheless become the default choice for so many human pursuits, including patently creative ones? The strengths of the digital have won out over its deficits primarily under the influence of powerful ideological currents. Prevailing values, especially in the Euro-centric West but increasingly also around the world, have for centuries ascribed at best a minimal importance to contingency, regarding it as an impediment to planning and control, and those same value systems elevate to the detriment of contingency its ontological contraries, namely positivism, rationalism, and instrumentalism. This trident of digital ideology has enjoyed growing pre-eminence at least since the Enlightenment, and thus predates digital technology by some centuries, but in the digital and its associated technologies these values meet their perfect complement, expanding into a hegemony that has all but cancelled other ways of looking at the world.23

Positivism is the ontological commitment to a world of existents; at bottom, it is an investment in the ontological priority of things.24 The world is, on this view, an aggregation of the things in it, each of which is real, individual, and in principle autonomous.
A posit is therefore first and foremost itself, and only secondarily does it enter into relations. An anti-positivist outlook might grasp things as arising out of their relations, indebted in their very being to other things: the social over the individual, the continuous over the discrete, becoming over being, and difference over identity. Positivism is a form of ontological liberalism that serves as a tacit norm. The digital is nearly synonymous with positivism, for the digital by definition is organised around discrete, individuated units—bits—each of which is independent of the others and enters into relations only secondarily and by contrivance. A bit is thus an archetypal posit. But the positivism of the digital does not end with the behind-the-scenes operation of the elemental unit, for whatever is built with bits is also necessarily a posit, with strict boundaries, perfect definition,

and an assertion of positive existence that belies the ephemerality of virtual ontology. A digital operation, whether a whole algorithm or a bit-level command of machine code, is a discrete, well-defined, self-contained event, even if it is also sometimes numerically extensive or very short-lived. A digital object, whether image, sound, or document, among other possibilities, is likewise complete and independent; it is exactly what it is, a perfectly defined ordered sequence of numbers, 0s and 1s, with no ambiguity, nothing still to be determined. A further suggestion, not defended here, is that the digital not only operates as a positivism but thereby also implicitly trains its operators to think and act through posits. The claim of simulation is that important aspects of the simulated real are on offer, but this likeness works both ways, and so we users of the digital learn to see the real as also digital.25 And indeed it is, or at least the real can be understood as (also) digital, for it submits itself to inspection as a world of individual entities with secondary relations. Through the relation of simulation, the computer instructs the user to understand its interface as a version of the non-digital world, and as digital technologies occupy more of our attention and most of our activity, we habituate to the positivist ways of the digital and so come to expect the world to exhibit a similar positivism, an expectation to which the world typically acquiesces. Rationalism is the conviction that the world makes sense, an ascription of reason to all things. Its most prominent precept is the principle of sufficient reason, which holds that there is a reason for every event and every thing— indeed, it is no coincidence that G.W. 
Leibniz, a chief proponent of sufficient reason, is also an inventor of the binary as a universal code.26 From one perspective, this is an immensely optimistic outlook, for the world appears to be full of arbitrariness and accident, making it seem unlikely at first glance that there is always a sense to be made. If there is a reason for everything, if everything that happens happens for a reason, then it is a short step from rationalism to universal necessity. That is, if everything happens for a reason, then those reasons explain in aggregate why things are the way they are, and this would seem to weigh against things being some other way, hence a world of necessity. This ideological pillar is patently and trivially true within the digital machine. No operation takes place by accident but only according to strict rules that are consistent, predictable, and wholly effective, rules that determine for any 0 or 1 whether it will stay the same or switch values in the machine’s next state. Each successive state of the machine follows from the previous state. Though inputs to the digital system cannot be anticipated, and so constitute a kind of local contingency, the consequence of any input is also entirely bound by rules that preexist those inputs and in fact determine which inputs are possible. Anything at all that happens in a digital machine

discovers its reason in those rules and those inputs, such that everything digital has its formal reason by default. Rationalism, too, persuades users of the digital to carry its guarantee of sense from the hermetic perfection of the computing device to the chaos of the external world. As the machine responds to each command according to a predictable and straitened logic, so one comes to expect that the world also always has its reason, that everything must make sense. This conviction motivates the explosive growth of data science, which acknowledges that some reasons may be extremely complicated, but simultaneously avers that by collecting enough data and processing it sufficiently, even the most subtle and entangled reasons can be seized and confirmed. Instrumentalism treats the world in terms of means and ends. An instrumental outlook takes things and people as instruments by which to bring about some desired state of affairs. It is thus frequently lamented as a dehumanising defect of our techno-capitalist era, which encourages the mercenary treatment of everyone and everything. Instrumentalism leverages commitments to positivism and rationalism, as those outlooks bolster a world of preconceived ends; posits can be grasped and the order of reasons can be calculated, such that to achieve a given goal is a matter of placing the right things in the right relations.27 Agnostic or inert, the digital makes itself wholly available for instrumental appropriation. The elemental and a-signifying bit plays a key role in the digital’s maximal availability: even to call the two possible values of a bit 0 and 1 is already to impart to them more meaning than they really have, for those values are just pure logical opposites without even an arithmetic significance. And with no inherent meaning, the universal element of the digital is available to be assigned any meaning whatsoever, which it does without resistance, without complaint, without prejudice. 
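This indifference of the bit pattern to its assigned meaning is easy to exhibit with Python's standard struct module: the very same four bytes can be read as text, as an integer, or as a floating-point number, with nothing in the bits themselves preferring one reading.

```python
# One and the same four bytes, read under three different assigned
# meanings; the bits carry none of these interpretations in themselves.
import struct

bits = bytes([0x41, 0x42, 0x43, 0x44])

as_text    = bits.decode("ascii")          # 'ABCD'
as_integer = struct.unpack("<I", bits)[0]  # 1145258561
as_float   = struct.unpack("<f", bits)[0]  # an IEEE-754 single, ~781.03

print(as_text, as_integer, as_float)
```

The interpretation lives entirely in the unpacking rule applied to the bytes, never in the bytes, which serve each reading with equal disinterest.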
The fervent hunt for algorithmic bias that fills the pages of academic research28 in the digital may be an essential corrective but it is nevertheless misnamed, for algorithms always do exactly what they are programmed to do; the digital has no desire, and so it serves silently, mechanistically, and with an absolute disinterest. As with positivism and rationalism, instrumentalism meets its apotheosis in the unresistant digital but is likewise amplified there, as the digital reinforces the instrumental attitude and spreads it well beyond the environment of the machine. Interacting with simulated entities that need only the right codes to order themselves according to the will of the user, that user learns thus to see the world, now harbouring the conviction that the correct strategy, the appropriate move, will bring to her the object of her desire.

CONCLUSION

Growing in influence for centuries, these three isms established a fertile soil for the cultivation of digital technology, which both benefits from these ideological factors and reproduces them, thus extending their reach. As the engine of a technology, the digital appears to answer to some of our dearest dreams, offering in microcosm a version of the world precisely as we wish to have it. Positivism, instrumentalism, and rationalism together form a knot of Enlightenment thought, yielding practices of reasoning and legitimation criteria for knowledge and action. As ideology, these comportments do not broadcast their foundational status but operate as unexamined defaults, frames with no outside, allowing no question to arise. This deeply embedded social-material ideology explains how digital technology can achieve such far-reaching success even as it minimises contingency. Where positivism rules, the contingent instability of things has no purchase. Rationalism, likewise, does not admit any contingency. No reason is guaranteed in a world of contingency, for contingency is what threatens every reason, the ubiquitous potential for the unexpected. And instrumentalism surely stumbles over contingency, which might throw a wrench into any plan, interpose a surprise between cause and anticipated effect. Were Hui, Parisi, and Fazi correct about digital contingency, then we might expect that the ongoing expansion of digital technology, broader and deeper into human life, would reverse rather than reinforce this ideological current. Ideology is of course difficult to pinpoint as one of its main operations is normalisation.
However, in my view, the evidence of experience suggests that the degree of legitimacy of knowledge increasingly tracks its resemblance to the positivist, rationalist, and instrumentalised data that computation deals with so effectively, whereas the kinds of knowing and acting that pose a challenge for this operation, in the sphere of the digital proper and elsewhere, simply don't count for much anymore.

NOTES

1. In 1996, physicist Alan Sokal submitted a hoax article entitled 'Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity' to Social Text to see whether a nonsensical article, proposing that quantum theory was a social construct, would be accepted. The article was published in the 'Science Wars' issue. See Social Text, no. 46/47, Spring/Summer 1996.
2. The long history of technological determinism in its various socio-cultural and media-theoretical iterations includes Karl Marx's The Poverty of Philosophy, translated by The Institute of Marxism-Leninism (Delhi: Progress Publishers, 1955), originally published in 1847; Jacques Ellul's The Technological Society, translated

Digital Ontology and Contingency

49

by John Wilkinson (New York: Vintage Books [1954] 1964), originally published in 1954; and Ray Kurzweil’s The Singularity is Near: When Humans Transcend Biology (London: Penguin Books, 2006), among others. However, new technologies, such as blockchain, discussed in this volume in Alesha Serada’s ‘Blockchain Owns You: From Cypherpunk to Self-Sovereign Identity,’ generate new forms of technological determinism.
3. Hubert L. Dreyfus, What Computers Can’t Do: The Limits of Artificial Intelligence, revised edition, seventh printing (New York: Harper & Row, 1986).
4. Luciana Parisi, ‘Critical Computation: Digital Automata and General Artificial Thinking,’ Theory, Culture & Society 36, no. 2 (2019): 89–121. See also Parisi’s chapter in this volume, ‘Transcendental Instrumentality and Incomputable Thinking.’
5. M. Beatrice Fazi, Contingent Computation: Abstraction, Experience, and Indeterminacy in Computational Aesthetics (Lanham: Rowman & Littlefield, 2018).
6. The notion of contingency employed in this chapter responds to a number of philosophical antecedents, especially Quentin Meillassoux’s After Finitude: An Essay on the Necessity of Contingency (London: Bloomsbury Publishing, 2010), but it combines and revises those previous invocations of the term to generate a unique and novel meaning.
7. Flavio Del Santo and Nicolas Gisin, ‘Physics without Determinism: Alternative Interpretations of Classical Physics,’ Physical Review A 100, no. 6 (2019): 062107.
8. There are also cross-disciplinary theories, based on the work of quantum theorist Niels Bohr, such as those of Karen Barad, according to which the observer, the observing apparatus, and the observed are entangled. This view, which bridges the natural-social science divide, undermines the authority of measurement. See Karen Barad, What is the Measure of Nothingness? Infinity, Virtuality, Justice. 100 Notes—100 Thoughts, dOCUMENTA (13) (Hatje Cantz, 2012).
9.
James Gleick, Chaos: Making a New Science (London: Heinemann, 1988).
10. David Dunn, ‘Nature, Sound Art and the Sacred’ (2002), http://www.davidddunn.com/~david/writings/terrnova.pdf, 3.
11. John Cage was fascinated by the relationship of formlessness to form, and unimpededness to interpenetration. See John Cage, Silence (Middletown: Wesleyan University Press, 1961). See also Iain Campbell’s chapter in this volume, ‘How The Performer Came to be Prepared: Three Moments in Music’s Encounter with Everyday Technologies.’
12. Dunn, ‘Nature, Sound Art and the Sacred,’ 4.
13. See also Pauline Oliveros’s work on sonic complexity: ‘Quantum Listening: From Practice to Theory (To Practice Practice),’ Sound Art Archive, https://s3.amazonaws.com/arena-attachments/736945/19af465bc3fcf3c8d5249713cd586b28.pdf.
14. Dunn, ‘Nature, Sound Art and the Sacred,’ 1.
15. The principle of simplicity that restricts digital objects and relations to only what the programmer has explicitly coded applies most readily to traditional linear programming. Neural networks, by contrast, develop a demonstrable sense of objects and relations autochthonously, through training that reinforces the likelihood of
certain outputs and depresses the likelihood of others in response to a given stimulus. Because neural networks generate outputs through weighted pseudorandom calculations, those outputs are not strictly determinate, but neither are they responsive to the unlimited and expansive context that is invoked in this chapter as the pervasive effect of contingency. Some artists, such as Memo Akten, explore the degree to which neural networks (and other digital methods) effectively simulate an unbridled indeterminacy. See https://www.memo.tv/works/.
16. Cyberpunk 2077, CD Projekt Red, Sony PlayStation 4 edition (2020).
17. Every digital game involves some measure of ambiguity, and many games deliberately incorporate that ambiguity into the game’s rhetoric. Drawing on the fecundity of non-digital games like Dungeons and Dragons, Blast Theory’s Day of the Figurines, https://www.blasttheory.co.uk/projects/day-of-the-figurines/, expresses a particularly acute frustration with the tension between the playfulness of games and the fixity of computational methods, generating action in the game through creative human decisions rather than preestablished digital calculations.
18. Luciana Parisi, Contagious Architecture: Computation, Aesthetics, and Space (Cambridge, MA: MIT Press, 2013); Yuk Hui, On the Existence of Digital Objects (Minneapolis: University of Minnesota Press, 2016); Yuk Hui, Recursivity and Contingency (Lanham: Rowman & Littlefield, 2019).
19. Turing was influenced by Kurt Gödel’s 1931 theories of undecidability and engaged with David Hilbert’s 1928 Entscheidungsproblem. See Alan Turing, On Computable Numbers, with an Application to the Entscheidungsproblem (New York: Dover Publications, 1937).
20. Luciana Parisi, ‘Instrumental Reason, Algorithmic Capitalism, and the Incomputable,’ in Alleys of Your Mind: Augmented Intelligence and Its Traumas, edited by Matteo Pasquinelli (Lüneburg: Meson Press, 2015), 133.
21.
Because there is no restriction in principle on how long an algorithm can be, there are in principle an infinite number of possible algorithms. Ω is therefore defined as a limit, as algorithms grow in size, of the ratio of (the number of) terminating algorithms to all algorithms.
22. Parisi, ‘Instrumental Reason,’ 133–34.
23. The conception of ideology employed herein cleaves most closely to Slavoj Žižek’s sense of the term as a sedimentation of past and present values embedded in physical and social objects. For Žižek, the design of everyday objects—such as toilets—cannot be purely utilitarian but is always indicative of socio-political values and attitudes. In a typical German toilet, argues Žižek, the hole into which excrement disappears is at the front so that the excrement can be inspected for traces of illness. In a typical French toilet, the hole is at the back so that the excrement may disappear as quickly as possible. The English toilet presents a synthesis of the two: the toilet basin fills with water, so that the excrement floats in it and is visible, but not for long enough to be inspected. These differences in design reveal, according to Žižek, three different existential and socio-political attitudes: reflective thoroughness (German), revolutionary hastiness (French), and utilitarian pragmatism (English). See Slavoj Žižek, The Plague of Fantasies (London: Verso, 2009). Whether this particular example is compelling or not, Žižek’s point is that ideologies are socio-material processes,
observable in everyday object design, rather than versions of reality imposed by a political or religious authority. The digital is, likewise, inseparable from a tripartite ideology embedded in its unexamined defaults, though, notably, those defaults are inherent in the technical principles of digital operation and cannot be altered through any design modification that preserves the digitalness of the device in question.
24. Positivism traditionally refers to an epistemological commitment, especially in the philosophy of science, that restricts legitimate knowledge only to those claims that are statements of basic empirical fact or that can be immediately derived from such empirical facts. My use of the term withdraws from its relationship to the empirical to treat it as a principle of ontology: instead of a criterion for legitimate belief, positivism herein refers to the insistence that reality is fundamentally constituted of self-asserting, individual elements, or posits.
25. See, for example, Robert Romanyshyn’s Technology as Symptom and Dream (London: Routledge, 1989), in which he argues that technologies are perceptually assimilated as reality, and no longer seen as instruments or devices: Leon Battista Alberti’s linear perspective grid, which was initially a metaphor—look at the world through this grid, and it looks like a geometrical pattern—soon became a map.
26. Together with Leibniz, his younger contemporary Christian Wolff also represents the pinnacle of Enlightenment rationality. Both philosophers were committed to the idea that reason underlies everything and that everything therefore has its reason if only one employs the correct method to discern it.
27. In ‘The Question Concerning Technology’ (in W.
Lovitt [translator], The Question Concerning Technology and Other Essays [New York: Harper & Row, 1977], 3–35), Martin Heidegger leverages the common view of technology as means to an end, or as an instrument, as a first step on the way to a more essential understanding of technology. Though he does not write in that essay about instrumentalism, my critique of instrumentalism as ideology is continuous with and inspired by Heidegger’s critical examination of technology.
28. See, for example, Cathy O’Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (New York: Crown Publishing, 2016); and Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor (New York: St. Martin’s Press, 2018 [2016]).
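The limit construction of Ω in note 21 can be rendered schematically as follows. This is a gloss on the note’s simplified, ratio-based description, not the author’s own notation; Chaitin’s canonical halting probability is instead the length-weighted sum over halting self-delimiting programs, Ω = Σ_{p halts} 2^{−|p|}.

```latex
% Among the finitely many algorithms p of length at most n, take the
% fraction that terminate, and let n grow without bound:
\[
  \Omega \;=\; \lim_{n \to \infty}
  \frac{\#\{\, p : |p| \le n,\ p \text{ terminates} \,\}}
       {\#\{\, p : |p| \le n \,\}}
\]
```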

Chapter 4

Blockchain Owns You
From Cypherpunk to Self-Sovereign Identity
Alesha Serada

The technological term ‘blockchain’ became a common buzzword in the late 2010s. As of the 2020s, it is impossible to ignore the rise of blockchain technologies in emerging projects of digital identity and governance. In this chapter, I argue that these projects represent a reaction to invasive surveillance and algorithmic control in online environments. Moreover, this reaction is so acute that it shifts to another extreme, regressing to unambiguous personal identification and direct control in dystopian projects characterised by what I will call ‘blockchain governmentality.’ The repressive nature of such projects is often concealed behind the façade of democratic decision-making, but the voting is too easily manipulated, and the options are too few. Negotiable outcomes are replaced with the finality of preprogrammed decisions set in self-executable code. This mode of digital determinism is further reinforced by the immutability of historical records, the un-deletability of personal data that is a feature, not a bug, of blockchain-based systems.

The core promise of blockchain is decentralisation. From the perspective of its early adopters, decentralisation meant personal sovereignty and horizontal participation in decision-making; new forms of governance were expected to evolve on this platform. At its most basic level, blockchain is a distributed electronic database that consists of chronologically organised, cryptographically protected immutable records—the ledger records of transactions between different parties, such as exchanges of cryptocurrencies or other tokens like non-fungible tokens. The same ledger is simultaneously kept in many nodes of the network and periodically updated, usually painstakingly
slowly, in some form of collective consensus between the multiple nodes that have the power of decision. The decentralisation of blockchain governance means that no single entity can control other nodes, only the ones to which it holds the keys.1

Blockchain decentralisation affords multiplicity. In theory, it also affords direct democracy, if every node represents a person with the right to vote. There are two basic mechanisms of verification that support decentralised decision-making, although both are open to technological and social exploits. Proof-of-work requires algorithmic ‘mining’ performed by large, factory-like, time- and energy-consuming private ‘mining farms.’2 Proof-of-stake, by contrast, gives the power of decision to the select few, typically those who hold big stakes in the appropriate cryptocurrency.3 Both in fact defy the democratic promise of decentralisation4; however, an elaborate discussion of this problem is beyond the scope of this chapter.

While specific architectures of blockchain software solutions may be different, they have all inherited at least some principles from the first popular blockchain software—Bitcoin.5 The next generation of blockchain platforms is best represented by the most popular, although prohibitively expensive to use, Ethereum platform.6 Ethereum has an added functionality in the form of ‘smart contracts,’ or self-executing programmes hosted in the nodes of the network. ‘Smart contracts’ are not, in fact, smart—and they are not even contracts7—but we will return to this later. Bitcoin and Ethereum are the most typical blockchains that possess many archetypical qualities8: network decentralisation, the immutability of past records, cryptographic security, transparency of some or all transactional data, and other, more conceptual than technological, features.
One such feature is pseudonymity: most blockchain-based accounts have arbitrary names, and their users do not have to disclose their real-life identity in public. Initially, Bitcoin wallets were anonymous by design, but now personal identification is possible, and in fact easy, by forensic means. Almost every active blockchain user can be de-anonymised by scraping data traces that connect the wallet to their named accounts. Besides, personal identification is required in most exchanges to convert cryptocurrencies into real-world money.9 Even before the major cryptocurrency exchanges adopted Know Your Customer policies, one of the early blockchain studies by Bill Maurer and colleagues already noted that “[i]ts protocols offer not anonymity, but ‘pseudo-anonymity’.”10 Later studies of crypto markets confirmed the performative nature of ‘pseudonymity’ and identity on blockchain, too.11 The reversal of anonymity has been crucial in blockchain-based projects that verify and manage identities of natural and legal persons. To understand the problematic assumptions behind these projects, we need to pay closer attention to such (more conceptual than technological) features of blockchains as disintermediation and ‘trustlessness.’
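The ledger structure described above, chronologically ordered records chained by cryptographic hashes so that past entries cannot be silently altered, can be illustrated with a short, purely didactic Python sketch. The record strings and the three-zero ‘difficulty’ prefix are invented for illustration; no real blockchain works at this toy scale.

```python
import hashlib
import json

def block_hash(body: dict) -> str:
    # Deterministically hash a block's contents, including the previous
    # block's hash, chaining every record to the full history before it.
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append(chain: list, record: str) -> None:
    body = {
        "index": len(chain),
        "record": record,
        "prev": chain[-1]["hash"] if chain else "0" * 64,
    }
    chain.append({**body, "hash": block_hash(body)})

def verify(chain: list) -> bool:
    # One altered record breaks its own hash and every link after it.
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

ledger: list = []
append(ledger, "wallet A sends 5 tokens to wallet B")
append(ledger, "wallet B sends 2 tokens to wallet C")
assert verify(ledger)

# Tampering with an early record is immediately detectable:
ledger[0]["record"] = "wallet A sends 500 tokens to wallet B"
assert not verify(ledger)

# Proof-of-work in miniature: finding a nonce whose hash has a given
# prefix is laborious; checking the answer afterwards is instant.
nonce = 0
while not block_hash({"data": "block", "nonce": nonce}).startswith("000"):
    nonce += 1
```

The point of the sketch is structural: verification is cheap, tampering is evident, and the ‘mining’ loop is laborious by design, which is precisely the asymmetry that the proof-of-work mechanism mentioned above exploits at industrial scale.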


Disintermediation means removing intermediaries such as banks and corporations from the interaction between parties that exchange money for services and goods.12 Its liberating potential was discussed even before cryptocurrencies, in the virtual game items trade13—a highly speculative grey market with a high margin of risk. In a later development by Satoshi Nakamoto, disintermediation was a measure to protect merchants from dissatisfied customers, or at least to optimise the costs of settling their claims.14 This was Nakamoto’s reasoning behind making transactions irreversible, and their records immutable: neither party could abuse the payment system if they did not trust each other. Ideally, this meant the removal of the so-called unproductive beneficiaries; however, it also meant the removal of customer protection: everyone is their own bank with full financial responsibility for all their errors.

‘Trustlessness’ is a mode of operation in a decentralised and disintermediated trader network. It characterises transactions between parties who do not trust each other personally but need to interact nonetheless. The question of whether this is an absence of trust or a new kind of trust15—blind faith, to be more precise—is a discursive paradox specific to the community of blockchain users. To a blockchain enthusiast,16 ‘trustlessness’ refers to trust in the blockchain technology, rather than in people.
This trust is often unconditional or, at least, unshakeable by daily occurrences such as when this novel (and easily abused) technology fails.17 However, as contemporary society slides into anomie in a manner similar to the beginning of the twentieth century,18 blockchain may be the answer to this sorry spectacle.19

Blockchain technologies represent powerful imaginaries in the international community of tech entrepreneurs.20 They have inspired many utopian projects of blockchain governance based on the ‘code is law’ principle,21 which originated in the techno-utopianism of early digital networks. In the words of Lawrence Lessig, ‘[t]his code, or architecture, sets the terms on which life in cyberspace is experienced.’22 In her later critique, Wendy Hui Kyong Chun highlights the repressive character of such conceptualisations by rephrasing it as ‘code as law is code as police.’23 In what follows, I examine representative projects of this kind through the lens of Foucauldian governmentality.

The first section of the remainder of this chapter explains how digital society’s ‘algorithmic governmentality’ contributed to the emergence of the so-called self-sovereign identity (SSI)—the project of the ‘digital self’ now habitually imagined on blockchain. In the second section, I examine what I term ‘blockchain governmentality’ using examples of ‘decentralised autonomous organisations’ (DAOs). ‘Blockchain governmentality’ is a way to imagine, design, and, possibly, implement new forms of power relations in society by prioritising decentralised software architecture and self-executable machine code over the factual complexity of social relations. In the last section, I explain why blockchain governance should be considered
deterministic: because it leads to less freedom—or to worse unfreedom—in terms of power and control.24

THE ORIGINS OF BLOCKCHAIN GOVERNMENTALITY

The concept of governmentality,25 introduced by Michel Foucault, refers to a specifically European approach to government normalised in modernity. Premodern projects of governance, exemplified by the Machiavellian Prince, were focussed on taking control over a certain territory and maintaining it by ‘the right to kill’26 (which is still the case with the Russian war against Ukraine in 2022). In contrast, in modern European governmentality, state power governs the population through indirect forms of governing (rather than through claiming territories and de-populating them with the aid of military force). This form of power is based on knowledge, not on force. As Johanna Oksala has observed, ‘[g]overnmentality implies the emergence of a particular, circular relationship between power and knowledge, or government and science.’27 Governmentality is rational: it is a way of thinking about society as an object of governance.

As modern society moved into its postindustrial, or informational, phase, the practices of governance incorporated more ‘scientific’ means such as big data processing and predictive statistics. Conveniently, major informational corporations today obtain the so-called objective truths about the population from data voluntarily submitted by internet users, but also from the digital traces they leave, often unknowingly, in virtual environments,28 in the form of cookies and browsing histories.
This way of observing and modelling social reality can be described as ‘algorithmic governmentality.’29 Drawing on Gilles Deleuze and Félix Guattari, Antoinette Rouvroy and Thomas Berns refer to ‘a certain type of (a)normative or (a)political rationality founded on the automated collection, aggregation and analysis of big data so as to model, anticipate and pre-emptively affect possible behaviours.’30 While the previous, Panoptical mode of surveillance implied human watchers, the data collected online is processed by machines capable of making decisions about individuals, such as whether they are allowed to cross the border31 or, in an authoritarian country, to travel at all.32 The result is a regime of ‘dataveillance’33—a rather traditional model of the Panopticon, now recreated by means of digital surveillance and automation.

In a broader sense, governmentality is the rationalisation of governance through a variety of tactics: for example, applying statistics to make decisions about the well-being of a population. Blockchain governmentality is a socio-technological imaginary that seeks to rationalise fluid and often implicit relations of power and trust in society, and to codify and stabilise them with immutable ledgers and self-executing code. The practices of algorithmic and
blockchain governmentality can be studied as a variation on the more general #datapolitik, defined as ‘a form of realpolitik of and by non-human agents.’34 These non-human agents are represented by algorithms, in the first case, and by the so-called smart contracts on blockchain, in the second.

SELF-SOVEREIGN IDENTITY: THE ONE-PERSON PANOPTICON

SSI is a socio-technological imaginary that refers to the autonomous data subject, unambiguously attached to a natural person who shares only the data they wish to share about themselves. The most frequently cited SSI manifesto was authored by the software developer Christopher Allen.35 Allen’s manifesto is a vision, not a technological concept, so paradoxes inevitably arise: appealing to the Cartesian ‘I,’ the manifesto describes singular, durable, and portable digital identities where users fully control their data and online representation. These identities can exchange information across multiple systems and are issued independently of a centralised authority. The users’ rights are protected in conflict situations, too. This appealing, if utopian, imaginary still inspires many blockchain projects.

It is only natural to grow weary of ‘algorithmic governance’ and its statistical ‘regime of truth’ if one spends most of one’s time online. From this perspective, SSI symbolises resistance to the ‘algorithmic governmentality’ performed by major online platforms. Compared to the latter, decentralised blockchain platforms can indeed provide adequate cryptographic tools to limit access to personal data. However, their other affordances may not be as suitable. The core vision of SSI is that the user remains in full control of their own data—a worthy and timely goal under the condition of dataveillance. However, this is not technically possible: this paradox comes from the nature of digital data, which is duplicated when shared.
In the practice of SSI, some data are shared consensually upon request—for example, to log into another party’s system. To make this work, the other party is trusted to erase the received data from its own system after use. In practice, and especially in blockchain environments, this goes against the condition of ‘trustlessness,’ where parties are expected to act against each other’s best interests (otherwise they would not use blockchain).

The imaginary of a sovereign blockchain-based identity does not come from the technology itself, but from the philosophy of the crypto-anarchic ‘cypherpunk’ subculture that engendered Bitcoin,36 only without the extraordinary digital literacy required to be a cypherpunk. The most publicly active organisation in the SSI space is, at this point, the non-profit blockchain alliance Sovrin Foundation.37 Notably, the vision of this organisation is formulated as ‘Identity for All,’38 which implies that one does not have a singular and
unambiguous identity until they receive it from an institution such as Sovrin. This is as if ‘everyone on the planet wants and needs an identity in the form prescribed by the authorities, and that all that is required is a system to provide it.’39 Many would argue that human beings have the (basic human) right to ‘life, liberty and security of person,’40 regardless of their digital presence ‘in the system.’

Sovrin’s agenda inspired the much-cited blockchain enthusiasts Michael Casey and Paul Vigna: from their perspective, SSIs are good because they ‘don’t depend on a government or a company to assert a person’s ID.’41 Instead, such identities are expected to be verified by blockchains, which does not make practical sense: blockchain as a technology can only verify digital information that is native to blockchain.42 Moreover, to forge a ‘native’ connection between a natural person and their digital identity, biometrics are commonly proposed,43 which takes us back to even more invasive regimes of biopower, in Foucauldian terms. While it is technologically possible to build a privacy-friendly configuration of biometric identification where no data is shared with external parties,44 it still remains to be seen whether blockchain adds value to such a configuration in terms of identity management. It could be that this technology’s affordances will steer blockchain in an even more dangerous direction, as we will see with another of its features: immutability.

IMMUTABILITY AS UNFREEDOM

Immutability means that it is impossible to delete, forge, or correct a record that has already been submitted to the block that secures the data; there are workarounds to fix it, but why would one, in that case, want it to be on blockchain in the first place?
To satisfy the wish to permanently register all citizens ‘on the system,’ blockchain-based identity systems lend themselves to various exercises in instrumental governmentality by actual human governments.45 Such projects are already in development in a number of European countries, including the Netherlands,46 Belgium,47 Austria,48 and Malta,49 although still with some respect for the personal freedom ensured by the General Data Protection Regulation. Some of the state pilot projects in the United States include voting on blockchain.50 Consequently, the real-life implementation of blockchain imaginaries engenders a very different manifestation of disintermediation: effectively, this means removing any mediator that could stand between the self and the state.

As soon as blockchain technology is used for the needs of centralised state control, projects of immutable, verifiable ‘permanent and portable digital identities’51 bring back the most oppressive strategies of governmentality. The first impulse is always to test the creative solutionism of blockchain
governmentality on refugees, who are seen as tokens for the exercises of technology businesses in ‘social good.’52 As the Massachusetts Institute of Technology scholars Cara Lapointe and Lara Fishbane suggest in their ethical framework for blockchain developers:

blockchain applications could provide the means to establish identities for individuals without identification papers, improve access to finance and banking services for underserved populations, and distribute aid to refugees in a more transparent and efficient manner.53

It is important to note that none of these pilot projects has so far generated notable value for the people on whom they were tested.54 The same holds true for other projects for social good, such as sustainability. In their review of technology-led experiments in sustainability governance, Nick Bernards and colleagues conclude that experiments in blockchain-based democracy ‘are being developed and applied in ways that reinforce existing patterns of governance and relations of power.’55 Blockchain tends to reproduce, or even amplify, the existing asymmetries in owning and managing personal data. As Bernards and colleagues show, using the example of blockchain applications in Africa, “technological solutions to sustainability issues often boil down to attempts to render complex and geographically dispersed informal spheres of activity ‘legible’ and traceable.”56 This serves the needs of the controlling entities, such as investors and governments, rather than of those in need of help, which is similar to the previously mentioned critique voiced by Btihaj Ajana.57

Curiously, in the eyes of Lapointe and Fishbane, blockchain addresses ‘issues for underserved or marginalized people in ways that were previously unimaginable.’58 We should read this ‘unimaginable’ as a lack of imagination, if not something worse. One such unimaginable application is tracking refugees by scanning their irises to confirm their right to free food, a feature presented as good and useful by the developers of this disturbing pilot project, Building Blocks. In a much-cited paper by Fennie Wang and Primavera De Filippi, ‘refugees only need to scan their irises at the point-of-sale to receive food assistance.’59 My question is how little this ‘only’ actually means, and whether sharing surplus food with the most disadvantaged should subject these people to invasive biometric and algorithmic control.
Ajana characterises this technocentric form of governmentality as ‘biometric humanitarianism’60; this is also how data-driven and biometric technologies are de facto used for ‘bordering’61 and for keeping displaced people outside of the digital state even when they are physically present on the state’s actual territory. Such measures are reminiscent of the most authoritarian disciplinary practices and do not meet the actual needs of the displaced, the homeless, and the
vulnerable. We tend to forget about workable solutions such as the Nansen passport—a temporary identity document issued by the League of Nations to refugees from 1922 to 1942, granting them the right to work in its member states62—which, surprisingly, did not require blockchain. Finally, the immutability of blockchain contradicts the fluidity and flexibility of social identities as well as their multiplicity: being a refugee is not an identity; it is the most vulnerable social status that should not be commemorated with an immutable record. In the end, Lapointe and Fishbane acknowledge that ‘immutability of information on Blockchain removes the ability to be forgotten,’63 which also limits its applicability, at least in the EU projects of identity management. My guess would be that to leave it all behind is exactly what the most displaced, homeless, and, more generally, marginalised subjects might desire.

‘SMART CONTRACTS’ AS THE PILLARS OF TRUSTLESSNESS

‘Smart contracts’ are lightweight software programmes hosted on a blockchain platform, such as Ethereum. Typically, they are self-executable when triggered by a message or a certain state of the system, and almost impossible to change after they have been deployed. A combination of smart contracts constitutes a decentralised application. Andrea Pinna and colleagues describe a thriving community of coders in the programming language Solidity, used to create smart contracts on Ethereum; the most common applications are decentralised finance apps and blockchain-based games.64

The term ‘smart contracts’ causes a lot of confusion in the community of blockchain users, who imagine them as the legal manifestation of the ‘code is law’ principle.65 Indeed, Pinna and colleagues describe a particular, although significantly less used, category of ‘smart contracts.’ These are ‘notary contracts’ that codify agreements between parties and turn them into self-executable programmes.
Again, smart contracts are not legal contracts—they are self-executing programmes that do what their programmers tell them to do. The output of these programmes is only legal if accompanied by an actual contract in a legally accepted form.66 This remains the case even when countries, such as Malta, partially integrate ‘smart contracts’ into their legal framework.67

Blockchain-based governance is often imagined in the form of DAOs—communities that manage themselves and the projects they collectively develop by technological means afforded by blockchain. Some aspects of governance, such as managing the collective property of the project, can be automated and inscribed in smart contracts.68 Blockchain-based tokens can be used for voting in certain forms of direct democracy.69 DAO as a form of collective governance was introduced in the Ethereum white paper, which
described ‘long-term smart contracts that contain the assets and encode the bylaws of an entire organization.’70 However, the Ethereum DAO that was organised following these principles also became the first example of a major-scale fraud on the cryptocurrency market.71 While many blockchain projects today label themselves as DAOs, successful examples of long-term self-governance are yet to be seen, and it may be the right time to ask why.

There is no shortage of democratic governance projects on blockchain today. A typical blueprint used in many such projects can be found, for example, in the paper by Vijay Mohan, who sees the main reason for academic misconduct in the increased competition for publications and funding, and yet suggests solving the problem with ‘tokenomics’ and reputation tokens (which, upon second thought, would further intensify the competition and create a black market for such tokens).72 To name a few of the least self-contradictory examples: in his exploration of ‘fully-automated liberalism,’ Bernhard Reinsberg presents a utopian project of climate governance on blockchain in the form of a DAO73; Artyom Kosmarski explores how science and academic research can be governed in a decentralised way74; Morshed Mannan envisions a workers’ cooperative in the form of a DAO.75

However, even the most productive discussions of DAOs—as the future form of self-organisation—miss a crucial point. In all these cases, transparency, immutability, and automation via smart contracts are trusted above trusting the actual people and organisations that, by themselves, should constitute social changes. Blockchains are used to create non-negotiable binding agreements in place of relationships based on good will and shared ideals in the real world. This constitutes distrust, not trustlessness.
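The gap between headcount democracy and token-weighted ‘governance’ can be made concrete with a deliberately simple Python sketch. All names and balances here are hypothetical, and real DAOs add quorums, delegation, and on-chain execution, but the weighting principle is the same.

```python
# A toy token-weighted ballot of the kind used in DAO governance.
# Balances and participant names are invented for illustration.
balances = {"founder": 600, "fund": 250, "alice": 90, "bob": 40, "carol": 20}

def tally(votes: dict) -> bool:
    # 'One token, one vote': each address counts by its stake,
    # not 'one person, one vote'.
    yes = sum(balances[a] for a, v in votes.items() if v)
    no = sum(balances[a] for a, v in votes.items() if not v)
    return yes > no

# Three of the five participants vote against the proposal...
votes = {"founder": True, "fund": True,
         "alice": False, "bob": False, "carol": False}
# ...but the two largest stakes carry it regardless of headcount.
assert tally(votes)
```

Reversing the example, a unanimous vote of the three smallest holders could never outvote the two largest, which is the sense in which such mechanisms ‘defy the democratic promise of decentralisation’ noted earlier: the decision procedure is mechanical, and the options and weights are fixed in advance by code.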
Finally, governance with smart contracts brings back the same problem of #datapolitik that has already been discussed in the context of ‘algorithmic governmentality’76: the difference being that, before, the rule of algorithms was indirect and implicit, and could be negotiated by the most ‘algorithmically aware’ data subjects.77 Being ‘smart’ is no longer about anticipating the needs of internet users. Today, it is about the acceptable tyranny of programmes over people. With ‘smart contracts’ on blockchain, ‘non-human, sovereign agents’78 are officially assigned as managers and judges of human behaviour in automated systems.79 Furthermore, they are entitled to govern reality itself, without any need to check back with said reality.

BLOCKCHAIN GOVERNANCE AS NEW FEUDALISM

I have already mentioned the confusion between smart contracts and legal contracts. There is one more confusion to untangle: when blockchain enthusiasts speak of DAOs, they imagine smart contracts as a manifestation of the social contract. Their imaginaries of decentralised government usually repeat

the principles of direct democracy described by Jean-Jacques Rousseau in 1762; in Rousseau’s words, ‘this act of association creates a moral and collective body, composed of as many members as the assembly contains votes, and receiving from this act its unity, its common identity, its life and its will.’80 However, unlike the voluntary and implicit social contract of Rousseau’s idealistic project, smart contracts are externally deployed, technologically unbreakable, and imposed by the developers of the software system. Even in the most perfect and reasonable implementation (yet to be observed ‘in the wild’ of blockchain communities), all power is delegated to the machines. Only now these machines do not even collect a digital dossier on their subjects: they reward and punish behaviour based on predefined rules, applying them to ‘flocks of humans,’ like flocks of sheep, without wasting any resources on recognising their identity (subjectivity), a recognition that constituted the most basic and archaic form of ‘pastoral power’ preceding modern governmentality.81 In this way, blockchain governance projects use the most reactionary model of the Panopticon: Vigna and Casey even unironically praise ‘the God’s-eye view’ that blockchain provides.82 They may encourage communities to actively evaluate the worth of each member and reward or punish them by public voting on blockchain—as in the now-reorganised social network Steem,83 one of the most active projects of collective blockchain governance. Compare this to Foucault’s Panopticon in its most radical and archaic form: ‘[y]ou have an apparatus of total and circulating mistrust, because there is no absolute point. The perfected form of surveillance consists in a summation of malveillance.’84 These communities may promise direct democracy of the Rousseauian kind; however, such democracy is predicated on panoptical surveillance.
Foucault would call it an illusion:

It is the illusion of almost all of the eighteenth-century reformers who credited opinion with considerable potential force. Since opinion could only be good, being the immediate consciousness of the whole social body, they thought people would become virtuous by the simple fact of being observed.85

However, while blockchain adopters are aware that every transaction they make remains on blockchain forever, this does not make them virtuous. Everything is permitted in the environment of trustlessness, as long as it is technically possible, because all trust is now delegated to code,86 and ‘code is law.’87 Again, regardless of blockchain, Chun characterises the same scenario as ‘a decay of the decay that is democracy.’88

CONCLUSION: BLOCKCHAIN GOVERNMENTALITY

Algorithmic and blockchain governmentality represent two different forms of #realpolitik, whose ‘aim is no longer to exclude anything that does not fit the average but to avoid the unpredictable, to make sure that everybody is truly themselves.’89 Algorithmic governmentality sees this truth as the product of auto-learning algorithms that devour real-life data in real time. Blockchain governmentality offers a different, very literal and transparent, regime of ‘digital truth.’ Every record on blockchain is cryptographically secured and cannot be changed, unless a new version (‘fork’) of blockchain is created, or the voting majority (typically 51 per cent) is controlled by a single entity (the former scenario is much more common than the latter). This conceptual (even if not always factual) immutability stands for ‘truth’ in Vigna and Casey’s widely used metaphor of blockchain as the ‘truth machine.’90 In light of this anxiety, self-sovereign identities are a barely masked attempt to identify and discipline others, not oneself.

In addition, algorithmic and blockchain governance implement a variety of means to achieve the shared goal of de-subjectifying the individual. Apart from removing possibly malicious intermediaries (whose danger and involvement may be overestimated), disintermediation also removes the intricate relation between the natural human, their data subject or data subjects, and the apparatus of control. The removal of intermediaries simultaneously establishes the blockchain network as the only unquestionable authority, one that cannot be fooled by managing multiple personas or performing ‘pretend’ searches to trick the algorithm. The most disturbing effect of #datapolitik, potentially amplified in blockchain solutions, is that when non-human agents take control and make decisions about human subjects, human subjects lose their agency and even subjectivity.
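The ‘conceptual immutability’ invoked above rests on a simple technical device: each block commits to the hash of its predecessor, so altering any past record invalidates every later block unless a new chain (a ‘fork’) is rebuilt from that point on. A toy sketch, in hypothetical Python (real blockchains add distributed consensus, signatures, and Merkle trees on top of this):

```python
# Minimal hash-chain sketch of blockchain immutability: each block stores
# the hash of the previous block, so tampering with any record breaks
# every subsequent link in the chain.

import hashlib

def block_hash(prev_hash, record):
    # a block's hash covers both its record and its predecessor's hash
    return hashlib.sha256((prev_hash + record).encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "genesis"
    for r in records:
        h = block_hash(prev, r)
        chain.append({"record": r, "prev": prev, "hash": h})
        prev = h
    return chain

def is_valid(chain):
    # re-derive every hash; any mismatch means the history was rewritten
    prev = "genesis"
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(prev, block["record"]):
            return False
        prev = block["hash"]
    return True

chain = build_chain(["id:alice", "payment:alice->bob", "vote:alice=yes"])
assert is_valid(chain)

# tampering with one past record invalidates the rest of the chain
chain[1]["record"] = "payment:alice->carol"
assert not is_valid(chain)
```

The sketch also shows why immutability is a property of the data structure, not a guarantee about the world: the chain secures whatever record, true or false, was written into it.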
The subject is here rigidly, or even immutably, fixed at its designated location in the node network, and eventually becomes the property of the network, ruled by non-negotiable self-executing programmes. Blockchain inventors are well aware of the technology’s flaws. This, however, does not quell their urge to register homeless people on blockchain so they can get free food, likely because of the typical impulse of every inventor: ‘let’s try it (on others) and see what happens.’ This urge is what makes it ‘interesting,’ alongside a lack of empathy with the people to whom future blockchain solutions will be applied. By equating the individual with their data subject, blockchain governmentality downgrades to the previous regime of direct, unmediated discipline executed on one’s physical body. In blockchain governmentality, there is no relation between the individual and the state. Verified by a supposedly indestructible record, one becomes one’s own profile, and this, I would argue, is where blockchain governmentality

wants us. Governing is outsourced to the ‘code is law’ principle; and, in fact, as of 2021, 51 per cent of European and 31 per cent of British respondents supported replacing parliamentary seats with robots in a survey by IE University.91 Despite claims to decentralisation and democratisation, blockchain affordances support a very particular power fantasy, and a very individualist one. Even before and beyond blockchain, ‘code resuscitates fantasies of sovereign, or executive, structures of power,’ in Chun’s interpretation of Judith Butler’s work.92 When a technology innovator or a researcher proposes another project of governance, they adopt the perspective of those who govern, rather than those who are governed (understandably so, as such inventors are typically comfortable with their own privileges of prestigious education and career). This is particularly striking in utopian projects of democratic governance on blockchain, such as DAOs. When external governance becomes visible and palpable, projects of personal autonomy are proposed instead, as we can see in the example of SSI. Both modes of blockchain governmentality ignore the multilateral direction and inherent asymmetry of governance, which always includes relations of governing and being governed, driven by forces and impulses that come from outside and from within.

Initially, blockchain technologies promised decentralisation, anonymity, transparency, verifiability, the removal of unproductive intermediaries, security, and trust. All these promises have long been broken or twisted, but the last crucial feature, immutability, remains, well . . . immutable. Its most inviting application is to produce undisputed historical truth about the past and the present of an object, a human or a non-human entity. A blockchain-based subject writes their future with every deed, and this future, as well as the past, cannot be rewritten.
When used for data integration across business and social networks, blockchain technology becomes the ultimate ‘de-Terminator,’ posing a direct threat to the project of the self by limiting its future to options immutably secured on blockchain and prescribed in self-executing rules of governance. Furthermore, when collectively materialised in the form of DAOs or similar ‘social bodies,’ blockchain governance creates a new hegemony of stakeholders: those who were the first to adopt a certain solution, those who own most of its tokens (which gives them voting and decision rights), or, most typically, both at once. On a theoretical level, this aligns the new governmentality with the technocentric ‘Californian ideology’93 that privileges wealthy ‘First-world’ citizens. To showcase the ‘rags to riches’ principle, this ideology may also employ the lucky few token ‘self-made people’ who serve as the ‘model workers’ of the new blockchain economy.94 This technocentric meritocracy, however, is famously exclusive or ignorant. In the majority of cases, immutability, transparency, and irreversibility may radically limit what Foucault called the

‘work carried out by ourselves upon ourselves as free beings’95 that characterises the (post)modern individual.

NOTES

1. See, for example, Joshua Ellul et al., ‘Regulating Blockchain, DLT and Smart Contracts: A Technology Regulator’s Perspective,’ ERA Forum 21 (June 30, 2020): 209–20.
2. As in ‘MASSIVE Crypto Mining Farm Tour | Bitcoin, Dash, and GPU Mining!,’ Deeper in the Mines (VoskCoin, February 10, 2020), https://www.youtube.com/watch?v=4ekOcDG2D8E.
3. ‘Ethereum Proof of Stake,’ EthHub, https://docs.ethhub.io/ethereum-roadmap/ethereum-2.0/proof-of-stake/.
4. Irni Eliana Khairuddin and Corina Sas, ‘An Exploration of Bitcoin Mining Practices: Miners’ Trust Challenges and Motivations,’ Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems – CHI ’19 (Glasgow: ACM Press, 2019), 1–13; Sarah Azouvi, Mary Maller, and Sarah Meiklejohn, ‘Egalitarian Society or Benevolent Dictatorship: The State of Cryptocurrency Governance,’ Financial Cryptography Workshops (2018).
5. Satoshi Nakamoto, ‘Bitcoin: A Peer-to-Peer Electronic Cash System,’ 2008.
6. Vitalik Buterin, ‘Ethereum White Paper. A Next Generation Smart Contract & Decentralized Application Platform,’ 2013, https://ethereum.org/en/whitepaper/.
7. Kelvin F. K. Low and Eliza Mik, ‘Pause the Blockchain Legal Revolution,’ International and Comparative Law Quarterly 69, no. 1 (January 2020): 135–75.
8. See, for example, Cara Lapointe and Lara Fishbane, ‘The Blockchain Ethical Design Framework,’ Innovations: Technology, Governance, Globalization 12, no. 3–4 (January 2019): 50–71.
9. The European Union has banned anonymous crypto wallets and transactions altogether: Huw Jones and Tom Wilson, ‘EU Lawmakers Back New Tracing Rule for Crypto Transfers,’ Reuters, March 31, 2022, https://www.reuters.com/world/europe/eu-lawmakers-back-new-tracing-rule-crypto-transfers-2022-03-31/.
10. Bill Maurer, Taylor C. Nelms, and Lana Swartz, “‘When Perhaps the Real Problem Is Money Itself!’: The Practical Materiality of Bitcoin,” Social Semiotics 23, no. 2 (April 2013): 261–77.
11. Angus Bancroft and Peter Scott Reid, ‘Challenging the Techno-Politics of Anonymity: The Case of Cryptomarket Users,’ Information, Communication & Society 20, no. 4 (April 3, 2017): 497–512.
12. Paul Vigna and Michael J. Casey, The Truth Machine: The Blockchain and the Future of Everything (New York: St. Martin’s Press, 2018); Lapointe and Fishbane, ‘The Blockchain Ethical Design Framework.’
13. Dan Hunter and F. Gregory Lastowka, ‘Amateur-to-Amateur,’ William and Mary Law Review 46, no. 3 (2004): 951–1030.
14. Nakamoto, ‘Bitcoin: A Peer-to-Peer Electronic Cash System.’

15. Angela Walch, ‘In Code(Rs) We Trust: Software Developers as Fiduciaries in Public Blockchains,’ in The Blockchain Revolution: Legal and Policy Challenges (Oxford: Oxford University Press, 2018).
16. See Melanie Swan, Blockchain: Blueprint for a New Economy (Sebastopol, CA: O’Reilly Media, 2015); Vigna and Casey, The Truth Machine; Primavera de Filippi and Aaron Wright, Blockchain and the Law: The Rule of Code (Cambridge, MA: Harvard University Press, 2018).
17. See, for example, Gili Vidan and Vili Lehdonvirta, ‘Mine the Gap: Bitcoin and the Maintenance of Trustlessness,’ New Media & Society 21, no. 1 (January 2019): 42–59.
18. ‘It is to this state of anomie that, as we shall show, must be attributed the continually recurring conflicts and disorders of every kind of which the economic world affords such a sorry spectacle,’ Emile Durkheim, The Division of Labor in Society (New York: Simon and Schuster, 2014).
19. See also Franco ‘Bifo’ Berardi’s chapter in this volume, ‘The Double Spiral of Chaos and Automation.’
20. Arnoud Lagendijk et al., “Blockchain Innovation and Framing in the Netherlands: How a Technological Object Turns into a ‘Hyperobject’,” Technology in Society 59 (November 1, 2019): 1–10; Yong Ming Kow and Caitlin Lustig, ‘Imaginaries and Crystallization Processes in Bitcoin Infrastructuring,’ Computer Supported Cooperative Work (CSCW) 27, no. 2 (April 2018): 209–32; Moritz Becker, ‘Blockchain and the Promise(s) of Decentralisation: A Sociological Investigation of the Sociotechnical Imaginaries of Blockchain,’ Proceedings of the STS Conference Graz, 2019, 6–30.
21. Walch, ‘In Code(Rs) We Trust.’
22. Lawrence Lessig, ‘Code Is Law: On Liberty in Cyberspace,’ Harvard Magazine, 2000, https://www.harvardmagazine.com/2000/01/code-is-law-html.
23. Wendy Hui Kyong Chun, Updating to Remain the Same: Habitual New Media (Cambridge, MA: MIT Press, 2017), 82.
24. The technical exposition of this chapter can be compared to Ashley Woodward’s metaphysical exposition in this volume. See ‘Information and Alterity: From Probability to Plasticity.’
25. Michel Foucault, Security, Territory, Population: Lectures at the Collège de France, 1977–78 (London, UK: Palgrave Macmillan, 2007).
26. Johanna Oksala, ‘From Biopower to Governmentality,’ in A Companion to Foucault, edited by Christopher Falzon, Timothy O’Leary, and Jana Sawicki (Chichester, UK: John Wiley & Sons, Ltd, 2013), 320–36.
27. Oksala, ‘From Biopower to Governmentality,’ 326.
28. Antoinette Rouvroy and Thomas Berns, ‘Algorithmic Governmentality and Prospects of Emancipation,’ Réseaux 177, no. 1 (October 14, 2013): 163–96.
29. Ibid.
30. Ibid., 173.
31. Btihaj Ajana, ‘Biometric Datafication in Governmental and Personal Spheres,’ in Big Data—A New Medium?, edited by Natasha Lushetich (London and New York: Routledge, 2020), 63–79.

32. Chuncheng Liu, ‘Multiple Social Credit Systems in China,’ Economic Sociology: The European Electronic Newsletter 21, no. 1 (2019): 22–32.
33. Rouvroy and Berns, ‘Algorithmic Governmentality and Prospects of Emancipation,’ 169.
34. Çağlar Köseoğlu and Davide Panagia, ‘#datapolitik: An Interview with Davide Panagia,’ Contriver’s Review, November 14, 2017.
35. Christopher Allen, ‘The Path to Self-Sovereign Identity,’ http://www.lifewithalacrity.com/2016/04/the-path-to-self-soverereign-identity.html.
36. Henrik Karlstrøm, ‘Do Libertarians Dream of Electric Coins? The Material Embeddedness of Bitcoin,’ Distinktion: Journal of Social Theory 15, no. 1 (January 2, 2014): 23–36; Enrico Beltramini, ‘The Cryptoanarchist Character of Bitcoin’s Digital Governance,’ Anarchist Studies 29, no. 2 (2021): 75–99.
37. Sovrin Foundation, ‘Sovrin: A Protocol and Token for Self-Sovereign Identity and Decentralized Trust,’ January 2018, https://sovrin.org/wp-content/uploads/2018/03/Sovrin-Protocol-and-Token-White-Paper.pdf.
38. Sovrin Foundation.
39. Ajana, ‘Biometric Datafication in Governmental and Personal Spheres.’
40. UN General Assembly, Universal Declaration of Human Rights: Proclaimed by the United Nations General Assembly, Paris, December 1948 (Oxford, UK: Bodleian Library, 2021).
41. Vigna and Casey, The Truth Machine, 8.
42. Laura Gonzalez, ‘Blockchain, Herding and Trust in Peer-to-Peer Lending,’ Managerial Finance 46, no. 6 (January 1, 2019): 815–31; Low and Mik, ‘Pause the Blockchain Legal Revolution.’
43. Applied Recognition and Sovrin Foundation, ‘Applied Recognition and Sovrin Foundation Announce Cooperation On . . . ,’ Toronto: PRWeb, April 19, 2021, https://www.prweb.com/releases/2021/04/prweb17871399.htm; J. S. Hammudoglu et al., ‘Portable Trust: Biometric-Based Authentication and Blockchain Storage for Self-Sovereign Identity Systems’ (2017), https://arxiv.org/pdf/1706.03744; Margie Cheesman, ‘Self-Sovereignty for Refugees? The Contested Horizons of Digital Identity,’ Geopolitics (October 4, 2020): 1–26; Fennie Wang and Primavera De Filippi, ‘Self-Sovereign Identity in a Globalized World: Credentials-Based Identity Systems as a Driver for Economic Inclusion,’ Frontiers in Blockchain 2 (2020).
44. See, for example, Chen Tai Pang et al., ‘Biometric System-on-Card,’ in Encyclopedia of Biometrics, edited by Stan Z. Li and Anil K. Jain (Boston, MA: Springer US, 2009), 1–6.
45. Dutch Blockchain Coalition, ‘Co-Write Self-Sovereign Identity’s Next Chapter,’ Dutch Blockchain Coalition (blog), 2020, https://dutchblockchaincoalition.org/en/news/co-write-self-sovereign-identitys-next-chapter.
46. Dutch Blockchain Coalition.
47. Microsoft, ‘How a Decentralized Identity and Verifiable Credentials Can Streamline Both Public and Private Processes,’ March 17, 2021, https://customers.microsoft.com/en-us/story/1351115614634143059-flanders-government-of-belgium-government-azure-active-directory.

48. Ralf-Roman Schmidt, Roman Geyer, and Pauline Lucas, ‘Discussion Paper. The Barriers to Waste Heat Recovery and How to Overcome Them?’ (Austrian Institute of Technology, June 2020), https://www.euroheat.org/wp-content/uploads/2020/06/Discussion.pdf.
49. Ellul et al., ‘Regulating Blockchain, DLT and Smart Contracts.’
50. Sunoo Park et al., ‘Going from Bad to Worse: From Internet Voting to Blockchain Voting,’ Journal of Cybersecurity 7, no. 1 (February 16, 2021): 1–15.
51. Lapointe and Fishbane, ‘The Blockchain Ethical Design Framework.’
52. See also Ajana, ‘Biometric Datafication in Governmental and Personal Spheres.’
53. Lapointe and Fishbane, ‘The Blockchain Ethical Design Framework.’
54. Cheesman, ‘Self-Sovereignty for Refugees?’
55. Nick Bernards et al., ‘Interrogating Technology-Led Experiments in Sustainability Governance,’ Global Policy (2020).
56. Bernards et al.
57. Ajana, ‘Biometric Datafication.’
58. Lapointe and Fishbane, ‘The Blockchain Ethical Design Framework.’
59. Wang and De Filippi, ‘Self-Sovereign Identity in a Globalized World’; see also Vigna and Casey, The Truth Machine, who start their book with an enthusiastic description of registering refugees on blockchain.
60. Ajana, ‘Biometric Datafication.’
61. Ajana. Anecdotally, the author of this chapter had one of the most terrifying experiences of their life when they almost failed fingerprint recognition in the EU VIS system that controls migration, due to intensive guitar practice having altered their fingerprints.
62. Otto Hieronymi, ‘The Nansen Passport: A Tool of Freedom of Movement and of Protection,’ Refugee Survey Quarterly 22, no. 1 (2003): 36–47.
63. Lapointe and Fishbane, ‘The Blockchain Ethical Design Framework.’
64. Andrea Pinna et al., ‘A Massive Analysis of Ethereum Smart Contracts: Empirical Study and Code Metrics,’ IEEE Access 7 (2019): 78194–213.
65. de Filippi and Wright, Blockchain and the Law, 72–88.
66. Low and Mik, ‘Pause the Blockchain Legal Revolution.’
67. Laura Camilleri, ‘Blockchain-Based Smart Contracts’ Legal Enforceability in Malta and the UK: A Square Peg in a Round Hole?’ (LLM in International Corporate and Commercial Law, University of York, 2019); Ellul et al., ‘Regulating Blockchain, DLT and Smart Contracts.’
68. See, for example, ArkDev, ‘Gray Boys Whitepaper,’ 2021.
69. See, for example, Barbara Guidi, Andrea Michienzi, and Laura Ricci, ‘Analysis of Witnesses in the Steem Blockchain,’ Mobile Networks and Applications (April 14, 2021).
70. Buterin, ‘Ethereum White Paper.’
71. Quinn DuPont, “Experiments in Algorithmic Governance: An Ethnography of ‘The DAO,’ a Failed Decentralized Autonomous Organization,” in Bitcoin and Beyond: The Challenges and Opportunities of Blockchains for Global Governance,

edited by Malcolm Campbell-Verduyn (New York: Routledge, 2018), 157–77; Walch, ‘In Code(Rs) We Trust.’
72. Vijay Mohan, ‘On the Use of Blockchain-Based Mechanisms to Tackle Academic Misconduct,’ Research Policy 48, no. 9 (November 1, 2019).
73. Bernhard Reinsberg, ‘Fully-Automated Liberalism? Blockchain Technology and International Cooperation in an Anarchic World,’ International Theory (2020).
74. Artyom Kosmarski, ‘Blockchain Adoption in Academia: Promises and Challenges,’ Journal of Open Innovation: Technology, Market, and Complexity 6, no. 4 (December 2020): 117.
75. Morshed Mannan, ‘Fostering Worker Cooperatives with Blockchain Technology: Lessons from the Colony Project,’ Erasmus Law Review 11, no. 3 (2018): 190–203.
76. Köseoğlu and Panagia, ‘#datapolitik.’
77. Erin Klawitter and Eszter Hargittai, “‘It’s Like Learning a Whole Other Language’: The Role of Algorithmic Skills in the Curation of Creative Goods,” International Journal of Communication 12 (September 13, 2018): 21; Shagun Jhaver, Yoni Karpfen, and Judd Antin, ‘Algorithmic Anxiety and Coping Strategies of Airbnb Hosts,’ Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems – CHI ’18 (Montreal, QC, Canada: ACM Press, 2018), 1–12.
78. Köseoğlu and Panagia, ‘#datapolitik.’
79. See, for example, Melanie Swan, ‘Transhuman Crypto Cloudminds,’ in The Transhumanism Handbook, edited by Newton Lee (Cham: Springer International Publishing, 2019), 513–27.
80. Jean-Jacques Rousseau, On the Social Contract (South Bend, IN: St. Augustine Press, 2018).
81. Oksala, ‘From Biopower to Governmentality.’
82. Vigna and Casey, The Truth Machine, 20.
83. Raffaele Fabio Ciriello, Roman Beck, and Jason Thatcher, ‘The Paradoxical Effects of Blockchain Technology on Social Networking Practices’ (Thirty-Ninth International Conference on Information Systems, San Francisco, USA, 2018).
84. Michel Foucault, Power/Knowledge: Selected Interviews and Other Writings, 1972–1977 (New York: Pantheon Books, 1980), 158.
85. Foucault, Power/Knowledge, 161.
86. Vidan and Lehdonvirta, ‘Mine the Gap: Bitcoin and the Maintenance of Trustlessness.’
87. Lessig, ‘Code Is Law: On Liberty in Cyberspace’; see also de Filippi and Wright, Blockchain and the Law: The Rule of Code.
88. Chun, Updating to Remain the Same, 83.
89. Rouvroy and Berns, ‘Algorithmic Governmentality,’ 172.
90. Vigna and Casey, The Truth Machine.
91. Center for the Governance of Change, ‘IE University Research on Replacing MPs with Robots,’ IE University (blog), https://www.ie.edu/university/news-events/news/ie-university-research-reveals-1-2-europeans-want-replace-national-mps-robots/.

92. Chun, Updating to Remain the Same, 83.
93. Richard Barbrook and Andy Cameron, ‘The Californian Ideology,’ Science as Culture 6, no. 1 (January 1, 1996): 44–72.
94. Ciriello, Beck, and Thatcher, ‘The Paradoxical Effects of Blockchain Technology on Social Networking Practices.’
95. Michel Foucault, ‘What Is Enlightenment?,’ in The Foucault Reader (New York: Pantheon Books, 1984), 32–50.

Chapter 5

The Double Spiral of Chaos and Automation

Franco ‘Bifo’ Berardi

At a certain point in their cultural history, usually referred to as modernity, (Global North) humans came to understand that God did not exist, or perhaps He did exist in some shape, form, and some place, but was not interested in human affairs. God was busy with more important things. From this point onwards, and as a result of divine indifference, human affairs were no longer determined by God’s will. This was also the point at which humans decided that the absence of divine determination should be called ‘freedom,’ and an entire civilisation was built on the assumption of the ontological freedom of human actions. Despite this assumption we—the ‘moderns’—have built our empires on slavery, deportation, and the extermination of those who are not ‘modern’ like us. From an ontological point of view, free will can be understood as an effect of the indeterminacy of the surrounding universe, the world, and particularly the social sphere.

In our current epoch, marked by the (interminable) fading of modernity, two opposite trends are visible in the fabric of social life. The first is the (apparently unstoppable) proliferation of automated interconnections asymptotically approaching the global cognitive Automaton. Huge amounts of data, captured from the daily life of the masses, are increasingly intertwined with intelligent devices, so that new events are (proportionately) increasingly reduced to the deterministic programmes installed inside the social brain. Life is datafied—transformed into a process of data manipulation—and social life is caught in the logico-mathematical trap of semio-capitalism. The cognitive Automaton is the final avatar of the modern process of rationalisation: automation doesn’t merely replace human acts with machinic operations—more importantly, it submits cognitive activity to logico-technological chains. Power is nothing else but the insertion of

automated selections into the social vibration, since automation is (at some stage at least) programmed by the human mind according to visions, ideologies and preconceptions. The techno-semiotic machine of datafication records human behaviour and translates it into consequences: real events activate mathematical functions inscribed in the machine as logical necessities, provoking a pre-emption of life.

The second trend is the proliferation of chaotic agents triggering ungovernable processes in the fields of geopolitics, economy, and the psycho-sphere, which makes it difficult to predict whether the future-scape will be marked by the rigid determinism of digital automatisms or by the chaotic succession of various forms of social-order disintegration. The most likely future-scape, however, will be the entwinement of the Automaton and Chaos: determinist chains of interconnected automation shot through with flows of ungovernability and instability. Both the order fabricated (during modernity) by political power and the order of technological automation are destabilised by the growing complexity of cultural flows, social conflicts, and unpredictable events such as the COVID-19 pandemic and the Russian invasion of Ukraine—events that have radically changed the symbolic attribution of meaning and value. In what follows I will sketch a potted history of this complex condition via a brief discussion of humanism, neoliberalism and the arbitrarisation of value, complexification and disruption, and the double valence of the general intellect, in order to explain why, in the post-pandemic world, marked by the return of biology and a re-concretisation of values, we need a new methodological approach to the double spiral of indeterminacy and re-determination.

HUMANISM AND INDETERMINACY

The Old Testament saw nature as a passive, emotionless flow of time without temporalisation and without change.
God had created humans without feeling their suffering; the divine Creation was a game devoid of empathy: neither God nor Satan had any empathy with the pawns of their chess game. The essential innovation of the New Testament was that God became a man and came to Earth to feel the pains and passions of human beings. The passionless time of God was at this point interrupted and entwined with human time, which is why the modern humanist revolution evolved within the affective and cultural space of Christianity. Both Christianity and humanism view history as temporalised, not as a sphere of the ‘eternal Truth’ or passive nature. The foundation of the historical dimension lies in the suffering of humans since ‘[i]t is not truth that makes man great, but man who makes great the truth.’1 The foundations of modernity also reside in the unravelling of a new

sphere—the sphere of human history and sociality—which does not follow the eternal rules of God or those of the universe. In science, we have discovered mechanical laws that govern the planets, the sky, mountains, rivers, and stones, but these laws and their determinism are not sufficient to establish the rules of human ethics. Human ethics are based on compassion, the etymology of the word ‘compassion’ being cum-patiri—to suffer together. The inaugural point of modernity is thus a conscious separation of historical temporality (ruled by the will of the Prince) from so-called natural time, ruled by the unchangeable laws of physics. This is the essential novelty of humanist culture: the human sphere does not follow, nor should it follow, the laws of the universe, but those of compassion: mutual understanding and solidarity. Pico della Mirandola is explicit on this point: God created man in a way that is different from the rest of the universe. In Oratio de hominis dignitate, he writes:

We have given to thee, Adam, no fixed seat, no form of thy very own, no gift peculiarly thine, that thou mayest feel as thine own, have as thine own, possess as thine own the seat, the form, the gifts which thou thyself shalt desire. A limited nature in other creatures is confined within the laws written down by Us. In conformity with thy free judgment, in whose hands I have placed thee, thou art confined by no bounds; and thou wilt fix limits of nature for thyself. I have placed thee at the center of the world, that from there thou mayest more conveniently look around and see whatsoever is in the world. Neither heavenly nor earthly, neither mortal nor immortal have We made thee. Thou, like a judge appointed for being honorable, art the molder and maker of thyself; thou mayest sculpt thyself into whatever shape thou dost prefer. Thou canst grow downward into the lower natures which are brutes.
Thou canst again grow upward from thy soul’s reason into the higher natures which are divine.2

Modern history evolved in this space of indeterminacy and freedom. However, this freedom is not lawless; rather, human laws are a human construction, not the reflection of God’s natural laws. Humanism is thus a space of indeterminacy where human reason creates its own rules.

THE DETERMINISM OF THE INVISIBLE HAND

Modern science, too, replaced God, the calculating brain of the universe, the world and the human mind, with the mechanical God of physical determination. When Pierre-Simon Laplace proclaimed that we could predict any future change once we knew everything about the universe, he established the clockwork-like God of determinism.3 This kingdom of the mechanical God did not go unchallenged; many scientists advocated alternative, indeterminate views

74

Chapter 5

of the universe. For example, Werner Heisenberg argued that the observer cannot simultaneously know the position and the momentum of a particle,4 which in turn meant that it was impossible to predict the future state/s of a(ny) system. The question, however, is whether this indeterminacy is epistemic or ontological: is the impossibility of prediction the result of human ignorance, or is it the result of the ontological indeterminacy of the universe itself? Is it a problem of insufficient knowledge or intrinsic undecidability? We don’t have a definitive answer to this question, but we have tried to reduce reality to our knowledge and to that which can be predicted. According to Charles Darwin, the evolution of all species is based on the general principle of natural selection: the organism that is best equipped to adapt to the environment will survive and prevail.5 But it is not clear whether Darwin is referring to human as well as non-human history; in other words, whether human history is consistent with the general laws of determination. In the second half of the twentieth century, socio-biology emerged as the theoretical backdrop of neoliberal ideology and the accompanying political agenda that irreversibly transformed social life. 
The theory of complex systems came into prominence and has since evolved into an essential tool of political and economic governance: stochastic predictability and technological determination subjected the ‘kingdom of uncertainty’ to the rule of technological automatisms.6 The determinist principles that socio-biology inscribed in the domain of the economy (and social life more generally) are reminiscent of Adam Smith’s invisible hand: the market, as a new human-made determinist God, is here seen as a self-regulating mechanism based on the internal dynamics of techno-financial automatism.7 The neoliberal revolution, whose purpose was to liberate the economic sphere from (any kind of) regulation, has subjected social life to the mathematical rule of governance. The effects of this ‘liberation’ are now clear to see: public goods have been privatised, labour time has been ‘precarised,’ wages have fallen drastically in many post-industrial countries, and there is a resurgence of slavery in many areas of the world.

ECONOMIC DETERMINISM AND ARBITRARINESS: FLOATING VALUES

In Symbolic Exchange and Death, which first appeared in French in 1976, Jean Baudrillard argued that since Richard Nixon’s 1971 declaration of the independence of the American dollar from the fixed regime of intermonetary exchange, the economy had turned into a regime of floating values.8 Nixon’s move made the socio-economic system essentially indeterminate. A process of post-industrial transformation began to unfold, epitomised by recombinant

The Double Spiral of Chaos and Automation


digital networks that gave rise to the construction of technological automata ruled by inner determinism, and to the acceleration—and ensuing complexification—of cultural flows and social behaviour. The American decision to break out of the post-war socio-economic universe of Bretton Woods was an act of freedom in the ontological sense of the word: it was an arbitrary act that broke the established order of monetary exchange. The superpower status of the United States made this decision possible, unleashing a long-lasting period of financial deregulation: the symbolic system of finance cut across the limits of (and undermined) the so-called real economy—the production and circulation of physical and semiotic goods. The ensuing American dominance was based on the violation of the connection between economic flows, financial value and evaluation, including the violation of the connection between the material reality of labour and consumption, and their symbolic appreciation. After Nixon’s decision, measurement no longer existed. The standard no longer existed. The possibility of deciding how much time was needed, on average, to produce a good no longer existed. This is why Baudrillard speaks of floating values and of the indeterminacy of the entire system of values. Effectively, Nixon’s decision replaced measurement with violence. In aleatory conditions, the only thing that can decide the value of something is the arbitrary use of force. It should also be said that the relation between financial capitalism and violence is not a casual extemporaneous conjuncture—it is absolutely structural. There is no financial capitalism without violence. When there is no standard and the economic environment is aleatory, violence is clearly the only way to decide the price of a good or service. But who is the final decider? In an aleatory environment, it is arbitrariness itself. I call aleatory any environment that cannot be predicted, fixed, or determined in any way. 
And I call arbitrary the act of decision that does not refer to any pre-established rule or foundation. In Latin, the word ratio is used to describe the fixed relationship between things and processes; ratio is both the standard and the measure. In philosophical parlance, ratio (reason) is the universal standard for understanding things. Arbitrariness is thus simultaneously the negation of reason and its (re-)establishment in a different guise. This is why we should understand the neoliberal revolution as both de-regulatory (in the sense of the elimination of the rational laws established by political will) and foundational of a different, automated rationality.

SEMIOCAPITALISM

There is a thread of continuity from Nixon’s 1971 decision to Greenspan in 2001, when the irrational exuberance of the stock exchange simultaneously


created the conditions for the explosion of technological production, the conditions for the 2000 dot-com crash, and the conditions for the financial collapse of 2008. Financial capitalism is based on the inscription of mathematical determination into unruly daily activities. Under the rule of financial capitalism, money ceases to be a tool for calculating the value of things (commodities and services) and becomes a form of mathematical regulation of social relations. In other words, money adopts a determinist function. At this point, both conscious political will and political action lose their potency, much like the system of values—concrete, actual, symbolic, and virtual values—falls into indeterminacy, since financial capitalism is not bound to or by a reference, measure, or ratio, but is based on self-validation. This is why financial capitalism can create its own reality where mathematical determinism rules. Abolishing determinism (the standard) and inscribing deterministic rules in daily life activities are two sides of the same (counter-)revolution: both break the relation between time and value. And here lies the essential contradiction of semiocapitalism, a recombinant machine that deracinates habit and floods the nervous system with an information deluge: when indeterminacy takes the place of the fixed relation between labour time and value, the system of exchange gives way to the aleatorics of floating values. But this does not mean liberation from determinism; on the contrary, it means that determinism is inscribed in the semiotic activity itself—the production of signs. In the first pages of Das Kapital, Karl Marx explains that value is (the accumulation of) time, the average social time that is needed for the production of a certain amount of goods.9 Time is thus ‘objectified’ both in physical goods and in abstract value. Semioproductivity, by contrast, destabilises the relation of time to labour and value. 
Labour is here no longer the physical muscular labour of industrial production; the goal of labour activity is to produce exchangeable signs. In industrial capitalism it was easy to calculate how much time was needed to produce a material object. That is not the case with semiocapitalist cognitive labour: it’s much more difficult to calculate the time needed to produce an idea, a style, or an innovation. As the process of production turns increasingly semiotic, the fixed relationship between labour time and value melts into thin air, which means that under the condition of semiocapitalism, the mechanical determination of value becomes useless, too.

THE BIOLOGICAL CONCRETE AS A FACTOR OF INDETERMINATION

After decades of digital and monetary de-materialisation of multiple production processes, and after decades of abstracting social activities, we have, in the last two years, witnessed the return of concreteness in the form of the


viral outbreak of COVID-19 (and its many concrete consequences). The physical body has awoken from its decades-long digital oblivion like a zombie after a long torpor. Materials like iron and wood—also flesh, air, nuclear power plants, and the collective psyche—all have their own dynamics, and these dynamics are not dependent on abstractions. And it is precisely these non-abstract dynamics that COVID-19—an infinitely small, invisible agent of chaos—has unleashed. For the first time in decades, finance is powerless in the face of concrete human needs; the virus has brought biology back into the domain that was ruled by abstraction. But the return of biological matter has also unleashed waves of chaos in the domain of abstraction, simultaneously propelling new modes and new levels of abstraction. The spiral of Chaos and the Automaton is the spiral of indeterminacy and re-determination at the heart of the social fabric. The ‘revenge’ of concreteness chaotically interrupted most aspects of daily life; it also shattered numerous supply chains of raw materials and semi-finished products. These disruptions have continued to accumulate even if the immediate danger to human health seems contained by the programme of mass vaccination (temporarily at least). Worse still, these disruptions have fuelled a resurgence of instability and conflict. The Russian war on Ukraine can be seen as an explosion of the ‘rear fronts’ that are, in part at least, the result of the chaotic effects of the virus. The biological concrete has chaotised financial abstraction, but the pandemic has also forced us to spend increasing amounts of social time in cyberspace, extending and expanding the incorporeal element of communication, work, and education. The transference of sociality to the domain of the digital is thus concurrent with the re-concretisation of material needs: the need for medicine and food in a population besieged by an invisible yet very concrete agent. 
Financial accumulation did not suffer from the pandemic; on the contrary, stock markets went through an upward trend, due, in part, to the stunning profits of the pharmaceutical and digital industries. Despite this, the hold of financial abstraction on the concreteness of planetary life has weakened considerably, as money cannot cure illness or stop the spread of fear. At present, money doesn’t seem to be able to stop the Russian-Ukrainian war either, or feed the populations starving because of the war.

DISRUPTIONS

Indeed, the pandemic has taught us many things about the meaning of chaos and the destabilisation of the global integration of technological determinism and economy. I define chaos as an excess in the complexity of an environment, resulting in the inability of the receiving system (whether the brain, reason, or government) to extract the relevant information from that environment.


Chaos is both the measure of indeterminacy and the measure of the imbalance between the speed of info-stimulation coming from digital transmitters and the processing speed of the receiving system. The virus is, in this sense, an indeterministic bio-agent with the power to disrupt the existing social order and the operation of the global economy. The disruption of the production, distribution, and consumption chains has also signalled a possibility of the disintegration of capitalism, as the virus has ruptured the continuity of the accumulation and exchange of goods and services, and de-signified the promise of the omnipotence of money and financial capitalism. During the lockdowns that followed the initial outbreak of COVID-19, it rapidly became clear that money could not buy non-existent vaccines, magically multiply intensive care units, or populate them with trained doctors and nurses. This is why I argue that, after a long reign of abstraction, the pandemic marks the return of concreteness. It also makes transparent the duplicity of determination and indeterminacy that is inscribed in (cognitive) labour processes. Unsurprisingly, a phenomenon known as the ‘Great Resignation’ has spread over the Global North. American newspapers, among them The New York Times, reported a figure of 4,500,000 workers who refused to return to work; the return of death to the social scene has clearly cued the refusal of exploitative labour. Paul Krugman writes:

Why are we experiencing what many are calling the Great Resignation, with so many workers either quitting or demanding . . . better working conditions to stay? . . . What seems to be happening is that the pandemic led many U.S. workers to rethink their lives and ask whether it was worth staying in the lousy jobs too many of them had.10

Elsewhere, Krugman adds: ‘it seems quite possible that the pandemic, by upending many Americans’ lives, also caused some of them to reconsider their life choices.’11 Krugman is right: the experience of the pandemic, the effect of an unpredictable and indeterministic factor suspending the determinist chain of the financial capitalist economy, was a psychological game changer. The Great Resignation is not only an American or Global North phenomenon; in China, too, an increasing number of young workers have decided to give up their grand economic expectations and make a life for themselves outside the economic framework. The Japanese can be seen as the avant-garde of the Great Resignation; since the beginning of the twenty-first century, a phenomenon known as hikikomori—a form of radical social withdrawal of (mostly) young males who lock themselves up in their rooms and refuse to leave them—has spread all over Japan.12 More than one million young people have chosen complete isolation for months, some for years. The determinist chain of precarious work, ruled by the networked automaton, and the subjection of


individuals to automated precarity is at the heart of this phenomenon. Though some of these chains have been broken by the virus, many new ones have resurged: think, for instance, of the recent proliferation of precarious food delivery jobs at companies such as Deliveroo. Despite this, we have witnessed a definitive psychological change in the last two years: resignation in its double meaning—as the acceptance of something painful but inevitable, and as a decision to leave both precarious and regular situations. But resignation is open to a third interpretation: re-signification as the attribution of a different meaning to events and activities. The current wave of resignation is of course motivated by the return of death to the scene of daily life; indeed, death has installed itself at the centre of social culture and imagination, as we can see, for instance, from Anéantir, Michel Houellebecq’s 2022 book about illness, senescence, death, and the decline of the white race.13 The return of war in Europe, and the reappearance of the nationalist discourse in the public sphere (rather than in the subcultural margins), is a symptom of the inability of the Global North, and also of the white race (which includes Russia), to deal with its own exhaustion. It is also the inability of the Global North mind to adapt to the indeterminacy of future-scapes. Obviously, the indeterminacy of the future is not a novelty of our times. However, in the past, we were able to reconcile our future angst with faith in the power of techno-science and/or human will. The faith in the power of political will has faded considerably due to the hyperpotency of finance, and the faith in the power of techno-science has been seriously undermined by the outbreak of COVID-19. 
The Russian-Ukrainian war, whose outcome cannot even be imagined as I write these lines, will certainly have (and is already having) a chaotic impact on supply chains and capital accumulation, ultimately spreading instability and unpredictability in the global economic order despite the war’s (at this point in time) localised character.

THE DOUBLE FACE OF THE GENERAL INTELLECT

Indeterminacy is not a new notion in social theory. Social theorists began to study it as soon as the new brand of labour, centred on information rather than on muscular energy and the physical manipulation of matter, emerged. The Italian neo-Marxist school of Operaismo, notably Paolo Virno, Romano Alquati, and Franco Piperno, has stressed the double feature of intellectual labour, which is both the creator of intelligent machines and the object of the intelligent machine’s domination. In the well-known Fragment on Machines (unsurprisingly popularised by the Italian operaisti), Marx emphasises the double character of cognitive labour:


The productive process has ceased to be a work process in the sense that the work transcends it and understands it as the unity that dominates it. Instead, the work appears only as a conscious organ at various points in the mechanical system in the form of individual living workers; dispersed, subsumed under the overall process of machinery, itself only a member, a ring of the system, whose unity does not exist in the living workers, but in the living machinery (active), which appears in front of the worker as a powerful organism to his single and insignificant activity.14

The relation between the intellectualisation of labour and indeterminacy is not linear, however. In fact, it is eminently contradictory: on the one hand, the machine (especially the digital machine) technically subsumes the workers’ activity; on the other hand, workers, even if subsumed by automated technology, are nevertheless living, sensing, suffering, and thinking beings. So, despite the automated condition of their labour subsumed by the machine (the product of previous intelligent labour), workers are conscious organisms. As such they can (or cannot) organise, rebel, refuse, and/or transform their lives and the very process of production. Although digital information can be combined and recombined under the fractal shape and the mathematical rule of generative determination, the general intellect is a living subject, a collective body, and a recombinant association of different sentient-thinking organisms that cooperate, conflict, rebel against, and/or resign from the network to which they belong. The spiral dynamics of automated determination and indeterminist chaos is at the heart of semiocapitalism, its driving force being networked cognitive activity. The general intellect is simultaneously a function of the automated process of production, and a living organism, a conscious community of people who produce goods and services under the rule of capitalist profit, but may at any point dissociate themselves from their objectified function and re-appropriate the role of autonomous subjects. The dynamics of the general intellect consists of two parallel operations: on the one hand, there is the deterministic repetition of the inscribed (and prescribed) code. 
In The Vital Illusion, Baudrillard anticipates the future connection between data and automation, and the entanglement of will and action in a chain of predictive concatenations inscribed in and prescribed by the code—the generator of networked reality.15 On the other hand, as a living organism, the general intellect is ontogenetic. The element that both connects and separates these two features is consciousness, understood as self-reflection in the process of becoming-other.


CONSCIOUSNESS AND NEUROSCIENCE

Neuroscience is a scientific domain that shares with psychology the task of understanding the functioning of the human mind. However, neuroscience often defines cognition in terms of determinist processes that enable intelligent behaviour and social interaction. This neuro-physical reduction cannot explain the innermost features of consciousness, because consciousness is exactly this: the act of escaping deterministic chains. Consciousness cannot be reduced to the dimension of rational thinking, as it also includes the emotional, embodied, embedded, and enactive dimensions of cognition.16 It further includes the ‘intimate foreign land’ or Inner Ausland, which, for Sigmund Freud, is the unconscious.17 The unconscious is the blind spot of neuroscience, and this blind spot feeds the indeterministic evolution of the mind, and thus also the aleatory formation of consciousness. When we shift from the sphere of mechanics to the sphere of social complexity, we can no longer predict future configurations of the world, nor can we extrapolate the future from the present configuration of society. And so we are faced with a theoretical dilemma: is predicting future configurations impossible because of the infinite complexity of inner physical determinism, which cannot be processed by our limited powers of comprehension, diagnosis, and prediction, or is it impossible because energy-matter itself evolves in a non-deterministic way? This dilemma, which, as I have mentioned earlier, runs through modern philosophy from Laplace’s determinism to quantum indeterminacy,18 takes on crucial importance in the present relation of neuroscience to psychoanalysis and the future evolution of the mind. 
We may speculate that neurological matter (the machinery of neurones, synapses, and neurotransmitters) acts in a deterministic way on our psychological and cognitive behaviour, while the computing capacity of our brain remains insufficient to process data flows generated by that same machinery. In this scenario, we simply need more computation power, a technical enhancement of the computing capacity of the networked brain. But we may equally speculate that the relation between the neurological brain and the conscious mind is essentially indeterministic, and that only a quantum leap can explain the specificity of mental activity: a leap which marks the difference between conscious activity and neurological dynamics. For Freud, the unconscious was that leap. If we take the unconscious seriously, we realise the concept of ‘freedom’ needs reframing: freedom is the non-deterministic elaboration of possibilities entangled in the determinism of physical and neurological matter. However, a gap between neuroscience and psychoanalysis, between the physical determinism of neurology and the in-(de)terminability of desire remains, even if this gap is absolutely singular in its genesis and manifestation and cannot be


reduced to the scientific exactitude of determination. Further, it is not a gap that manifests in a single human organism but in all forms of social behaviour. The experience of the pandemic and the ongoing trauma of the Russian war on Ukraine (which risks evolving into a global economic and potentially military catastrophe) cannot be fully explained if we disregard the psychotic effects of those traumas, and the overall mental collapse of society, after four decades of acceleration of the info-neural net. The evolution of the collective unconscious, particularly of the social imagination and the psycho-sphere, has, in recent years, fluctuated between panic and depression, melancholia and aggression. The social and geopolitical collapse is, in my opinion, rooted in the mental collapse of the Global North. The return of war in Europe exposes the double spiral of the Automaton and Chaos in a new (dark) light: the dynamics of war is based on a chain of automatisms (military, geopolitical, and psychological) but the horizon of war is indeterminate. In Leo Tolstoy’s War and Peace, this double nature of war is expressed in the following terms:

They do say war is a bit like playing chess. . . . Yes it is (said Prince Andrey) but there’s one little difference. In chess you can take as long as you want over each move. You’re beyond the limits of time. Oh, there is this one other difference: a knight is always stronger than a pawn, and two pawns are always stronger than one, whereas in war a battalion can sometimes be stronger than a division, and sometimes weaker than a company. You can never be sure of the relative strength of different forces. . . . Success never has depended, never will depend, on dispositions or armaments, not even numbers, and position least of all. . . . [it depends] on the gut feeling inside me and him, and every soldier.19

However, the quintessential indeterminacy of war, like the indeterminacy of all disruptions and catastrophes, does not always lead to more uncertainty; it can also lead to the consolidation of existing relations.

COLLAPSE AND CONSOLIDATION

Social morphogenesis is the creation of new shapes in social systems. In political theory, the collapse of a system was once considered an opportunity for radical change. Today, the concept of revolution is unusable, because it is based on the illusion of control of a relevant share of social reality by rational will and a project of transformation. In the past, revolutions gave birth to violent and totalitarian systems, but they were nevertheless effective. Revolutions rarely fulfilled their utopian project; however, they did turn social collapses into radical systemic changes, including the shift in political power and the creation of new forms of social and economic life. Neoliberalism was the last


effective revolution in the history of the Global North. It turned the social turbulence of the 1970s and the technological evolution of the 1980s into a sort of market-driven automated game of overall competition, creating along the way a totalitarian system, abolishing modern bourgeois democracy, and replacing it with the automation of corporate (financial) reason. However, it has also sped up productivity to the point of oversaturation, fragmenting and precarising labour, and causing a mutation in and of the social body. Neoliberalism has destroyed democracy and the effectiveness of political action; what we are witnessing today are disruptions without revolution. The more complex a system grows, the more sensitive it becomes to internal and external disruptions. At the same time, the more complex a system grows, the less it can be voluntarily controlled, that is, consciously or intentionally changed. Given that a disruption (or catastrophe) cannot trigger a revolution, it often engenders consolidation; indeed, repeated (large-scale) disruptions do not lead to morphogenesis but to morphostasis. Complexity is a relation between time and information; a system is complex when the density of the infosphere saturates the receptivity of the psychosphere, and the speed of informational circulation surpasses the human ability to interpret or elaborate signs in time. When the infosphere becomes too dense and too fast for human consciousness, society needs automatic complexity reducers. A disruption is the result of the eruption of an unpredictable event that interrupts and disturbs the existing flow or chain of events and actions. The reason why disruptions proliferate in the sphere of informational connectivity in particular is that the excesses and overloads that the infosphere produces foreclose the possibility of human government of the systemic complexity of social and technological structures. 
Disruptions occur for various reasons: because of the unpredictable interference of nature in the techno-sphere, such as the Icelandic volcano cloud that blocked European air traffic in April 2010; or because of the limits of technological control, like Chernobyl in 1986 and Fukushima in 2011. However, they also occur because the social psyche interferes in the field of automatic information flow (like the effect of panic in the financial circuit, or the role of human interpretation of disruption or disaster indicators).20 Social morphogenesis is simultaneously the cause and the effect of consciousness: morphogenesis makes possible the creation of a new epistemic and/or sensible paradigm, but a morphogenetic process is only possible in the sphere of human culture if we can consciously shift to a new epistemic and political paradigm. In (past) epochs of slow informational circulation and political governability, disruptions were considered triggers of social morphogenesis. In conditions of low complexity (situations where the speed of informational flows in the social circuit is commensurate with conscious human government), political reason was able to change the existing social organisation in such


a way that a new pattern could emerge. In the present condition, however, disruptions and catastrophes are morphostatic and tend to reinforce the pattern that produced the disruption in the first place. Why do systems become more resilient when their complexity grows? One of the possible answers comes from political analyst Ross Douthat. Commenting on the 2008 to 2010 disruptions (the global financial collapse, the Icelandic ash cloud, and the gigantic oil spill in the Gulf of Mexico), Douthat writes:

The economic crisis is producing the entrenchment of authority rather than its diffusion, and the concentration of power in the hands of the same elite that presided over the disaster in the first place. . . . The panic of 2008 happened, in part, because the public interest had become too intertwined with private interests for the latter to be allowed to fail. But everything we did to halt the panic, and all the legislation we’ve passed, has only strengthened the symbiosis. . . . This is the perverse logic of meritocracy. Once a system grows sufficiently complex, it doesn’t matter how badly our best and brightest foul things up. Every crisis increases their authority, because they seem to be the only ones who understand the system well enough to fix it. But their fixes tend to make the system even more complex and centralized, and more vulnerable to the next national-security surprise, the next natural disaster, the next economic crisis.21

Though it is pristinely clear that neoliberal politics was the cause of the financial collapse and of the ensuing social misery, the financial collapse has consolidated the neoliberal dictatorship and the cultural dogmatism of the political elite. This also means that the hypercomplexity of the system makes it next to impossible not only to implement but also to imagine a process of transformation. The double spiral of Chaos and Automation is thus the general shape of our time: in order to decipher and navigate this complex condition we need two new methodologies. The first should be based on the techno-political (rather than techno-mathematical) determination of governance; the second on tuning into the rhythm, amplitudes, and repercussions of chaos, cognitively and sensorially.

NOTES

1. Thinn Thinn Lei, ‘The Concept of Man in Confucius’ Philosophy,’ Hinthada University Research Journal 2, no. 1 (2010).
2. Pico della Mirandola, Oratio de dignitate homini (Torino: Einaudi, 2021), translation mine.
3. Pierre-Simon Laplace, A Philosophical Essay on Probabilities, translated by F.W. Truscott and F.L. Emory (London: Chapman and Hall, 1902).


4. Werner Heisenberg, Physics and Philosophy: The Revolution in Modern Science (New York: Harper Collins, 2007 [1958]).
5. Charles Darwin, On the Origin of Species (London: John Murray, 1859).
6. See also the prologue to this volume.
7. Adam Smith, The Invisible Hand (London: Penguin, 2008 [1759]).
8. Jean Baudrillard, Symbolic Exchange and Death, translated by Iain Hamilton Grant (London: Sage Publications Ltd., 2016 [1976]).
9. Karl Marx, Capital: A Critique of Political Economy, volume 1, translated by Ben Fowkes (London: Penguin Books, 1990 [1976]).
10. Paul Krugman, ‘How Is the U.S. Economy Doing?,’ The New York Times, December 9, 2021, https://www.nytimes.com/2021/12/09/opinion/economy-inflation-spending-jobs.html.
11. Paul Krugman, ‘The Revolt of the American Worker,’ IPS Journal (October 21, 2021), https://www.ips-journal.eu/work-and-digitalisation/the-revolt-of-the-american-worker-5500/.
12. See, for example, Alan Robert Teo and Albert C. Gaw, ‘Hikikomori, A Japanese Culture-Bound Syndrome of Social Withdrawal? A Proposal for DSM-V,’ Journal of Nervous and Mental Disease 198, no. 6 (June 2010): 444–49.
13. Michel Houellebecq, Anéantir (Paris: Gallimard, 2022).
14. Marx, Capital, 439–40.
15. Jean Baudrillard, The Vital Illusion, translated and edited by Julia Witwer (New York: Columbia University Press, 2000), 37.
16. See, for example, Andy Clark, Supersizing the Mind: Embodiment, Action, Extension (Oxford: Oxford University Press, 2010).
17. Sigmund Freud, The Unconscious (London: Penguin Classics, 2005 [1915]).
18. See also Niels Bohr’s work, for example, On the Quantum Theory of Line-Spectra (New York: Dover Publications, 2005 [1918]).
19. Leo Tolstoy, War and Peace, translated by Anthony Briggs (London: Penguin Classics, 2005), 858.
20. See also the discussion of crisis in relation to observation and interpretation in the prologue to this volume.
21. Ross Douthat, ‘Consolidation,’ International Herald Tribune, May 2010, 8.

PART II

Spatial, Temporal, Aural, and Visual Technologies


Chapter 6

Allagmatics of Architecture
From Generic Structures to Genetic Operations (and Back)

Andrej Radman

From a scientific point of view, the modus operandi of architecture is often (mis)perceived as vague or indeterminate. However, this ‘weakness’ could turn out to be architecture’s greatest strength, a result of the symbiosis between its Beaux-Arts and Polytechnic traditions. The chapter will argue that it is possible to be both inexact and rigorous. What distinguishes architecture from other disciplines and makes it the material-discursive practice par excellence is the interplay between abstract means and concrete ends. Unsurprisingly, architecture has often been described as the first art1: not a mere epiphenomenon of culture, but, in Benjamin Bratton’s terms, a mechanism of terraforming here on Earth.2 Consequently, architectural activity will come to play a significant role in noetic processes. One does not decide to think differently. To think differently, one has to feel differently. As a matter of fact, ‘thinking differently’ is a tautology. Thinking occurs only on the condition that it does not conform to a pre-established structure (sameness). To think is to be primed not for re-cognition, but for a genuine encounter with that which forces us to think (difference).3 It is this margin of indeterminacy that engenders sensitivity to outside information.4 From this perspective, the purpose of design is to ‘rewire our brains.’ Framed in such a way, architecture qualifies as a psychotropic practice. In the words of Daniel Smail:

The mood-altering practices, behaviors, and institutions generated by human culture are . . . psychotropic mechanisms. . . . these mechanisms have neuro-chemical effects that are not all that dissimilar from those produced by the drugs.5

The animate has always been utterly dependent on the inanimate.6 The aim of the chapter is to reclaim the importance of architectural technicity as the sine qua non of anthropogenesis.7 It is exemplary of what Bernard Stiegler calls ‘evolution by means other than life.’8 As Claire Colebrook aptly stated in her ‘Sex and the (Anthropocene) City,’ ‘life is essentially intertwined with non-life.’9 This revelation is a welcome antidote to the prevailing ‘carbon chauvinism.’10 The chapter’s ambition is to reframe architecture in terms of non-organic memory and set it free from the parochial tradition of supplanting operation by structure. The outsourcing of memory from the organic will change the conditions for further phylogenetic becomings, since evolutions continue to be extrinsically organised by the technicised associated milieus. It is here that Gilbert Simondon’s concept of allagmatics becomes indispensable. As we shall see, operation and structure (difference and repetition) have always been co-constitutive.11

A GENERAL THEORY OF QUASICAUSALITY

Preliminaries aside, what is allagmatics? This, of course, is a badly posed question. We should rather ask: what does allagmatics do? The straightforward answer is that it modifies and transforms structures. Put differently, it is the operation that makes structures appear. Architectural typologies—the theatre, school, hospital, and similar reified generalities—are exemplary of the essentialist condition that subsumes all-too-many outcomes and thus precludes any genuine novum. To paraphrase Spinoza, we do not know (in advance) what architecture can do.12 Isabelle Stengers’s kindred caution remains as timely as ever: ‘in order to address practices, we have to accept the critical test of abstaining from the powerful drug of Truth.’13 Every practising architect knows that the way forward is through trial and error. Zigzagging between voluntarism and fatalism, they wish for a lucky accident to occur. A fortuitous breakthrough demands a temporary suspension of interpretation.14 The adoption of an experimental attitude turns architecture into an art of dosages.15 To put heuristics before hermeneutics is to replace the universal concept with the plastic singular one that is tethered to the variables determining its very mutation. The generic typology (one-size-fits-all) gives way to the anexact-yet-rigorous genetic topology.16 The term ‘allagmatics’ appears several times in Simondon’s primary doctoral thesis Individuation in Light of Notions of Form and Information.17 This is one of the more succinct formulations:


The name allagmatic could be given to such a genetic method that seeks to grasp individuated beings as the development of a singularity that unifies . . . the overall energetic and material conditions; in fact, we should note that this method does not involve a pure causal determinism. . . . In fact, the being extends in time the meeting of the two groups of conditions that it expresses; it is not just the result but also the agent, both the milieu of this meeting and the extension of this realized compatibility.18

In Deleuzian parlance, it is both the actual cause and virtual quasi-cause. According to the Stoic immanent spermatikos logos, a state of affairs produces an incorporeal effect, which in turn operates as a quasi-cause.19 By introducing a margin of indetermination, quasi-causality discredits pure causal determinism, a.k.a. mechanicism; it introduces the ‘elaboration’ and ‘creation and choice’ that Bergson credits to time, a notion of time that ‘hinders everything from being given at once’ and may be, Bergson proposes, ‘indetermination itself.’20 The virtual and actual are incommensurable or disparate. What relates them is the process of transduction, an operation that fuels both the ontogenesis of knowledge and that of being. As an activity, transduction propagates by successive phases in the physical, biological, mental, and social domains, and, as it does so, it structures and transforms itself and its structuration.21 The impact of the logic of transduction on allagmatic epistemology was not lost on Simondon.22 According to him, the discovery of field theories was the historical condition that yielded the conceptual tools for studying individuation qua transduction. Before thermodynamics and wave mechanics, we were stuck with deductive truths and inductive facts.23 While deduction remains tautological and induction all-too-extrapolating, transduction is not reflective or representational but, in Karen Barad’s terms, diffractive.24 Transduction is not a matter of communication, for how could it be, given that it is about progressive differentiation or falling out of phase with itself? It is about non-entailment and impredicativity.25 In other words, what is defined participates in its definition. Michael Turvey provides a graphic example by reference to a macroscopic object such as a knee-high stone. Its height is a primary predicative property. By contrast, its ‘jump-on-ability’ is the relational impredicative property for a cat, but not for an ant.
For an ant, the stone is ‘crawl-up-able.’ Height and affordance are thus two different and complementary conceptions of a single macroscopic object sustaining two different and complementary conceptions of causality.26 The stone’s potential properties are actualised only when specific spatio-temporal relations hold, and the stone is not equivalent to any of them. In the context of the cat and ant, the stone is in a superposition of two states and manifests two affordances simultaneously. Consequently, this everyday object could be said to have no definite state and as such qualifies for a quantum-compatible conception.27 By analogy, plasticity and contingency as they pertain to architectural technicity are inconceivable without the condition that is neither stable nor unstable. This is how Simondon describes the concept of metastability:

The interior past and the exterior future confront each other at the level of the polarized milieu: this confrontation in the operation of selective assimilation is the present of the living being, which is formed by this polarity of passage and obstruction between past substances and substances to come that are present to one another via the operation of individuation; the present is this metastability of the [bond] between interior and exterior, past and future; the exterior is exterior and the interior is interior relative to this mutual allagmatic activity of presence.28

The mutual allagmatic activity of presence is what Erin Manning more recently revamped as ‘immediation.’29 By contrast to mediation, immediation neither makes a priori assumptions about what can make a difference, nor does it assume the role of the arbiter in the supposed ‘interaction’ between two existing limit-points. As Simondon underscores,

topology and chronology fully coincide in the vital individuation, and it is only later on and according to psychical and collective individuations that the coincidence can be broken. Topology and chronology are not a priori forms of sensibility, but the very dimensionality of the living being undergoing individuation.30

Clearly, Simondon does not shy away from the critique of Kantianism. For him, space and time are not a priori. Is this not an affordance theory avant la lettre? Insofar as it deals with transformative dynamisms that cannot be designated by an objective domain, allagmatics belongs to what Erich Hörl calls a ‘generalised ecology.’31 But how does an ecology decide? This is the question that Rodrigo Nunes addresses in his book Neither Vertical nor Horizontal.32 According to him, an ecology is less than an organism and more than an organisation. ‘Less than an organism’ means that the emergence and functioning of its components is not determined in advance by some unifying principle. ‘More than an organisation’ means that an ecology is not intentional (not constituted by an act of will) and does not have agreed-upon boundaries. Nunes’s venturing below and beyond the bounds of the individual conforms to Simondon’s pre-individual and transindividual milieus, or what Deleuze and Guattari designated as epistrata and parastrata.33 In the subsequent section, I will provide a cartography of such ecological exchanges and modifications of states akin to the Stengerian ‘relaying’ operations:


The question can no longer be, then, one of commentary, rendering explicit what would have remained implicit, clarifying or elucidating. . . . Rather it is about ‘consolidating’ just a little more—always a little more—which is to say, forming relays. . . . As it happens, in regard to thinking about life, it will be a question of forming relays in the manner that A Thousand Plateaus struggles against the nearly irresistible slope that would transform the ‘voyage’ of thought into the destination, into the position of its final definition, and simultaneously assign an end, in the double sense of the term, to thought.34

It is, of course, just as difficult to define an operation as it is to define a structure, other than by example. The example we will draw on comes from Félix Guattari’s Schizoanalytic Cartographies.35 I am referring to his diagram of the Fourfold that is not a structuralist synoptic model, but a machinic synaptic meta-model, which surveys the singularities at play.36 This is no longer the ontic world of simple mereological (part-to-whole) relations, but the synaptic or mereotopological (parts-and-boundaries) world of transformations, events, and occurrences.37 Guattari’s four ‘unconsciousnesses’ are existential Territories (T), incorporeal universes of Value (U), energetic-semiotic Flows (F), and abstract Machinic Phyla (Φ). T and U belong to the virtual half of the diagram, while F and Φ occupy the actual half.38

The Fourfold will help us grasp the dynamic unity of an open system without identity, as operation and structure at the same time. The ‘whole’ here is not of the parts but alongside them. It comes with an important caveat: architectural technicity is, from the beginning, pharmacological.39 The event that is existence can be understood as a nurturing continuous temporal permutation linking and transforming the four functors. Any attempt to freeze the cycle into a structure of fixed relations, or guide it along a predetermined and repetitive path, will lead to the toxic condition of domination.40

Thinking ecologically implies a shift away from conceiving an assemblage solely in terms of a zero-sum game. An assemblage encompasses plasticity, or a variation of the means of bringing about an end; protention, or coordinating ongoing modulation with emerging states of affairs; and retention, or coordinating ongoing modulation with prior states of affairs. The fourfold is ‘axiontological’ as it takes on both a normative and metaphysical sense. This is the Simondonian way of saying that it grasps the reciprocity of the axiological and ontological.
In his review from 1966, Deleuze describes Simondon’s work as a profoundly original theory of individuation: physical, vital, and psycho-social. Traditionally, the principle of individuation relates to an already made, fully constituted individual. For Simondon, individuation comes first and ‘ceases to be co-extensive with being.’41 The preliminary condition of individuation is the existence of a metastable system. The state of such a pre-individual being is singular without being individual. Individuation is the organisation of a solution, or, better, resolution for an objectively problematic system marked by disparation of at least two orders of magnitude. So what is the difference between physical and vital individuation? The physical individual is content to receive information once. The prime example is the crystal, which reiterates an initial singularity, grows outwardly on the surface, and makes no use of interiority. By contrast, a living being successively receives many batches of information and adds up many singularities. The increasingly complex levels of individuation bring us closer to our concern, namely architecture’s role in transindividuation. A properly psychical individuation arises precisely when the vital functions are not sufficient to resolve the problems posed to the living. This threshold may be said to mark the birth of immunology.42 It is not implausible that the emergence of an immune system is owed to the incorporated expectation of injury or risk of potential harm. It mobilises anticipation, or protention, whereby a new charge of pre-individual reality is mobilised within a new problematic, calling for a new process of resolution.43 As William James put it in his Essays in Radical Empiricism, if experience were perfectly harmless or smooth, there would be no need to isolate its terms, no need for concepts.44 Finally, transindividuation is triggered by what Simondon’s mentor Georges Canguilhem calls ‘inconsistency of the environment.’45 This ‘infidelity of environment,’ as Stiegler put it, calls the Darwinian ‘perfect fit’ into question.46 In the words of Ronald Bogue:

The motor of such a universe is not survival of the fittest but creation, an experimental assemblage of heterogeneous forms for no other reason than that they are possible. Those forms display varying levels of autonomy in their organization of time and space and in their differentiation of functions, a general process.47

As Deleuze put it in his book on a thinker who was also highly appreciated by Simondon: ‘Nietzsche criticises Darwin for interpreting evolution and chance within evolution in an entirely reactive way. He admires Lamarck because Lamarck foretold the existence of a truly active plastic force, primary in relation to adaptations: a force of metamorphosis.’48 In her book On Habit, Clare Carlisle explains the importance of the distinction between plasticity and flexibility, fluidity and amorphousness:

If we were simply receptive to change, without limit, then we would be incapable of habit. Each new action or experience would transform us, so that we would have no character or integrity to call our own. We would be empty, entirely subject to circumstance, blown hither and thither by the winds of change. Neither absolute resistance nor absolute receptivity allow a thing to have a nature.49


The theory of niche construction proposes that an organism does not only passively submit to the pressures of a pre-existing environment, but that it actively alters it (genetically, epigenetically, behaviourally, and symbolically).50 The neo-Lamarckian approach does not conflate the map with the territory and resists overfitting for the sake of optimisation, the latter being the wet dream of Parametricists. To echo Antoinette Rouvroy, who draws on Derrida’s critique of the metaphysics of presence, we have to stop reducing the future to an optimisation of the present.51 In the same vein, Wendy Chun reveals how the new epistemology of correlation—which supersedes causation and is at the core of contemporary data sciences—seeks to close off the future by operationalising probabilities.52 Such reductions inevitably lead to a loss of knowledge, or what Stiegler called the ‘proletarianisation of the sensibility’ caused by the culture industries’ canalisation and reproduction of perception.53 According to Guattari, ‘black holes’ as catatonic states appear precisely when an organism completely coincides with its environment, thus preventing the psychic from opening onto a ‘transindividual collective.’54

The problem can be recast in terms of the degree of mnemonic detachability. In addition to the primary memory as genetic information expressed in DNA, and secondary memory acquired epigenetically through a complex nervous system, there is also tertiary memory.55 This epiphylogenetic memory is an accumulation and retention of historical epigenetic differentiations within the spatio-temporal organisation of material environments.56 It is beyond dispute that the built environment—as a differential exo-organisation—holds the capacity to catalyse noesis. The built environment becomes an exteriorised artificial organ, and the line ‘we build our cities and in return they build us’ takes on a literal meaning. Thus we must debunk the genesis of thinking in thought itself.
In contrast to the phenomenological fallacy that presupposes a fully constituted subject with a point of view, allagmatics of architecture fosters a ‘pedagogy of the senses’ where points of view engender their subjects. This reversal of the (illegitimately) reversed ontology constitutes ‘radical perspectivism’: ‘What makes me = me is a point of view on the world.’57 Simondon insists that being can only be grasped in reciprocity with becoming, through which the ontogenetic process transduces or propels itself. Knowledge of individuation = the individuation of knowledge.58 One may refer here to the Spinozian ‘third kind of knowledge’ (the third ethics), not of effect or cause (common notion), but an intuitive (quasi-causal) mode of knowledge that is our segue to the Fourfold.59


Figure 6.1 Axes of reference and consistency based on Guattari’s Schizoanalytic Cartographies. Copyright Andrej Radman.

THE FOURFOLD

Guattari’s horizontal axis of reference maps onto Simondon’s ontological real difference from his neologism of the axiontological, whereas his vertical axis of consistency maps onto Simondon’s normative modal difference. God is a lobster (Figure 6.1).60 This double articulation that relates forms and substances of both content and expression challenges our ingrained notion of causality, hence the advocacy of quasi-causality. This chapter refrains from exemplifying theoretical concepts and offering recipes for practice. It also tries to avoid reification, hence the insistence on the relational concept of affordance. What an architect designs are first and foremost capacities (enabling constraints), and not properties. The existential niche thus conceived—as a set of affordances—is irreducible to the manifest architectural properties. The allagmatics of architecture challenges the alleged primacy of the ‘physical’ world. What we engage with is the world considered as an environment and not an aggregate of objects.61 Insofar as affordances are activity-specific ‘meanings,’ ‘there can be no relevant ecology without a correlate ethology.’62 There is no causal sequence proceeding from environment to stimulation to organs to brain. Rather, there is a non-totalising unity achieved at the interface between the perceiver and the environment by means of resonance to information.63 The emphasis is on the encounter, where experience is seen as an emergence that returns the body to a process field of exteriority.


Consider the case of the irreducible multiplicity known as the Metropolis based on Rem Koolhaas’s influential book Delirious New York (1978).64 His retroactive manifesto maps the genealogy of Manhattan from 1850, when it became a mythical laboratory for the invention and testing of a revolutionary mode of life. It was arguably Otis’s contingent invention of the mechanical vertical flow, a.k.a. elevator (F), that triggered the ‘speciation’ of the skyscraper (Φ). The operation of stacking floors nourished the Culture of Congestion (U), which in turn affected the process of subjectification (T). The effect-cum-cause is the contingently obligatory ‘bachelor machine’ whereby the subject is produced as a remainder, an appendage to the big machine to which it is connected. Deleuze and Guattari characterised the existential apprehension as a machine of consumption under the third (conjunctive) synthesis of time, a gratification (self-enjoyment) that could be seen as auto-erotic or auto-affective:

Let us borrow the term ‘celibate machine’ to designate this machine that succeeds the [first] paranoiac machine and the [second] miraculating machine, forming a new alliance between the desiring-machines and the body without organs so as to give birth to a new humanity or a glorious organism. This is tantamount to saying that the subject is produced as a mere residuum alongside the desiring-machines, or that he confuses himself with this third machine and with the residual reconciliation that it brings about: a conjunctive synthesis of consummation in the form of a wonderstruck ‘So that’s what it was!’65

Let us flesh out the Guattarian fourfold by reference to Koolhaas’s ‘synopsis’ of The Retroactive Manifesto.66 The elevator (F) is the ultimate self-fulfilling prophecy. The further up it travels, the more undesirable the circumstances it leaves behind. Any given site could now be multiplied ad infinitum. The Skyscraper (Φ) is one of the rare truly revolutionary twentieth-century buildings, or evolutionary by means other than life. It offers a full inventory of technical and psychological modifications quasi-caused by Metropolitan life. The psychoactive architectural technicity engenders a previously unthinkable spectrum of experiences. Remember, something forces us to think. It is not a matter of volition; rather, mind emerges from matter.67 The Metropolitan Condition (U) engenders an architecture with its own laws, methods, and achievements that has so far remained largely outside the field of official architecture and critique. Finally, the Metropolitan Subject (T) steps out of the elevator on the N-th floor to find itself in a vestibule leading to a locker room, where it undresses and puts on gloves to enter an adjoining space equipped for boxing and wrestling. The locker room is also served by an oyster bar. ‘Eating oysters with boxing gloves naked on the N-th floor, that’s the 20th century in action,’ concludes Koolhaas.68 What his DNY qua Guattari’s F-Φ-U-T expounds is in fact anticipated by Simondon in his complementary doctoral thesis On the Mode of Existence of Technical Objects:

Adaptation-concretization is a process that conditions the birth of a milieu rather than being conditioned by an already given milieu; it is conditioned by a milieu that only exists virtually before invention; there is invention because there is a leap that takes place and is justified by means of the relation that it brings about within the milieu that it creates: the condition of possibility of this turbo-generator couple is its realization.69

In his posthumously published Technics and Time 4, Stiegler concludes that what is at stake is a ‘conditioning of the present by the future, by that which is not yet.’70 This concerns the top half of the fourfold diagram—(Φ-U) far from equilibrium—where resingularisation occurs, a ‘horizon of finalities’ both forming and formed by what after Husserl is called protentions. According to Stiegler, such a horizon is possible only because the present and the future are presented and projected on the basis of tertiary retentions. Exo-somatisation constitutes these protentions precisely as an associated milieu that is both trans- and pre-individual, conforming to the left and right halves of the fourfold respectively. In the words of Simondon:

The technical object insofar as it has been invented [F], thought [Φ] and willed [U], and taken up by a human subject [T], becomes the medium and symbol of this relationship, which we would like to name transindividual.71

According to Stiegler, this is a question of analysing the transductive co-individuation through the exo-somatic co-genesis of functions that gets metastabilised and materialised.72 This means that the ‘architectural enunciation,’ to use Guattari’s term from Schizoanalytic Cartographies, is always already collective or constituted by the technical transindividual that precedes it.73 It also means that it is impossible to isolate the posthuman subject from the preceding exo-somatic productions that are carriers of information defined as a difference that makes a difference. It is safe to read the term ‘posthuman’ as a near synonym of ‘non-anthropocentric.’ Simondon sensed before many of his contemporaries that something was missing from the quantitative crypto-structuralist approaches to information.74 His transindividual remains outside of any interindividual relation.

CONCLUSION

This chapter launches an eco-logical ‘perspectivist’ assault on ego-logical representational thinking. As David Lapoujade put it, “the error lies in believing that the perspectives are added from the outside to a preexisting world, ‘on’ which they have a point of view. Once again, they are not external to the world; on the contrary, the world is internal to the perspectives.”75 Where classical humanism based the representational unity of space and time upon the formal unity of consciousness, the posthuman difference fractures consciousness into multiple states not predicable of a single subject. To speak of allagmatics of architecture is to break with the differentiation of an undifferentiated world in favour of the homogenisation of a milieu. Lapoujade offers a helpful distinction between Étienne Souriau’s world-making ‘instauration’ and the fundamentalism of grounding:

The grounding preexists, by right, the act that nonetheless posits it; it is external or superior to what it grounds, while instauration is immanent to what it instaurs. Instauration has no support other than its own gesture, nothing preexists it. . . . In other words, to ground is to make preexist, while to instaur is to make exist, though to make exist in a certain manner—(re)invented each time.76

In the words of Souriau himself: ‘To exist is always to exist in some manner. To have discovered a manner of existing, a special, singular, new, and original manner of existing, is to exist in your own manner.’77 It is, he stresses, an ‘accomplishment’ of transforming the virtual into the concrete: incommensurable realms, still, but ‘in the experience of making, I grasp the gradual metamorphosis of the one into the other, I see how that virtual existence is transformed, little by little, into a concrete existence.’78 In its zest to engage the influential and unavoidable ontic domain of the code, much of contemporary architectural theory often dismisses the co-constitutive pathic domain of the 4EA (embodied, embedded, enactive, extended, and affective) axiology as it pertains to the process of (re)territorialisation. Once again, the map is not the territory and, consequently, the value-free ‘presentism’ fostered by the algorithmic approach is problematic. As Achille Mbembe put it:

The integration of algorithms and big data analysis in the biological sphere is not only bringing with it a greater belief in techno-positivism, in modes of statistical thought, it’s also paving the way for regimes of assessment of the natural world, modes of prediction and analysis that are treating life itself as a computable object.79

The built environment ought to be recast as the collective apparatus of subjectification critical of forms of digital mysticism. The ‘pure presence’ of data remains amnesic of everything that would refer it to the singularities of the material world and ethology.80 While the digital is per definition concerned with the effectuated, quasi-causality remains irreducible to the state of affairs.81 Only the given is calculable. By contrast, allagmatics gives access to non-local causality, which is real albeit in the impredicative sense. Given the framework of this volume, I will conclude on a pedagogical note by ventriloquising Stiegler.82 It is the ability to constantly enrich and renew knowledge that finds itself exhausted and sterilised today. Privileging the efficiency of applied over speculative research, purely and simply based on computational criteria, discredits academia. Furthermore, the epistemology of correlation makes Dataism incapable of rearticulating the four quasi-causes, incapable of re-evaluating values, and incapable of escaping the lingering structuralism in digital guise that remains trapped in the shackles of efficient causality. We can do better. Let us start by placing architectural technicity squarely within the domain of ecology.

NOTES

1. Elisabeth Grosz, ‘Chaos, Cosmos, Territory, Architecture,’ in Chaos, Territory, Art: Deleuze and the Framing of the Earth (New York: Columbia University Press, 2008), 1–24.
2. Benjamin H. Bratton, The Terraforming (Moscow: Strelka Press, 2019), https://s3.eu-west-1.amazonaws.com/strelka.storage/2020/4/8100070b-5651-4409-bc4c-cac813e51124/the_terraforming_fin.epub.
3. Gilles Deleuze, Difference and Repetition, translated by Paul R. Patton (New York: Columbia University Press, 1994 [1968]), 139.
4. Gilbert Simondon, On the Mode of Existence of Technical Objects, translated by Cécile Malaspina and John Rogove (Minneapolis: Univocal Publishing, 2017), 17–18.
5. Daniel Smail, On Deep History and the Brain (Berkeley: University of California Press, 2008), 161.
6. Andrej Radman and Stavros Kousoulas, eds., Architectures of Life and Death: The Eco-Aesthetics of the Built Environment (London: Rowman & Littlefield International, 2021).
7. An abbreviated version of this chapter was presented in a talk on December 12, 2021, at DigitalFUTURES Doctoral Consortium: What can architects learn from philosophy?; Session on Simondon, under the title ‘Architecture’s Allagmatics: A General Theory of Quasicausality.’
8. Bernard Stiegler, Technics and Time, 1: The Fault of Epimetheus, translated by Richard Beardsworth and George Collins (Stanford, CA: Stanford University Press, 1998), 135.
9. Claire Mary Colebrook, ‘Sex and the (Anthropocene) City,’ Theory, Culture & Society, 34, nos. 2–3 (2017): 59.
10. Monika Bakke, ‘Art and Metabolic Force in Deep Time Environments,’ Environmental Philosophy, 14, no. 1 (2017): 46–47.


11. The chapter can be read in conjunction with Stavros Kousoulas’s ‘Ananke’s Sway: Architectures of Synaptic Passages’ in this volume. The two chapters cut things together and apart (Karen Barad, Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning [Durham: Duke University Press, 2007], 389). On Simondon, also see in particular Woodward, ‘Information and Alterity: From Probability to Plasticity’ in this volume.
12. Drawing upon Spinoza, Deleuze contrasts Morality to Ethics as a topology of immanent modes of existence. By contrast, Morality always refers existence to transcendent values. See Gilles Deleuze, Spinoza, Practical Philosophy, translated by Robert Hurley (San Francisco: City Lights Books, 1988 [1970]), 23.
13. Isabelle Stengers, ‘Introductory Notes on an Ecology of Practices,’ Cultural Studies Review, 11, no. 1 (2005): 188.
14. Anne Sauvagnargues, Artmachines: Deleuze, Guattari, Simondon, translated by Suzanne Verderber and Eugene W. Holland (Edinburgh: Edinburgh University Press, 2016).
15. Gilles Deleuze and Félix Guattari, A Thousand Plateaus, translated by Brian Massumi (Minneapolis: University of Minnesota Press, 1987 [1980]), 160.
16. Andrej Radman, ‘Space Always Comes After: It Is Good When It Comes After; It Is Good Only When It Comes After,’ in Speculative Art Histories: Analysis at the Limits, edited by Sjoerd van Tuinen (Edinburgh: Edinburgh University Press, 2017), 185–201.
17. There is also a whole chapter with the eponymous heading under ‘Supplements.’ See Gilbert Simondon, Individuation in Light of Notions of Form and Information, translated by Taylor Adkins (Minneapolis: University of Minnesota Press, 2020).
18. Ibid., 74.
19. John Sellars, ‘The Point of View of the Cosmos: Deleuze, Romanticism, Stoicism,’ Pli, 8 (1999): 1–24.
20. Henri Bergson, ‘The Possible and the Real,’ in The Creative Mind: An Introduction to Metaphysics (Mineola, NY: Dover Publications, 2007 [1946]), 98.
21. Gilbert Simondon, ‘The Position of the Problem of Ontogenesis,’ translated by Gregory Flanders, Parrhesia, 7 (2009): 11.
22. Simondon, Individuation in Light of Notions of Form and Information, 111.
23. ‘We are at an impasse: unable to return to the deductive model of ideal truths, but equally unable to rely on the inductive method or simple fact-checking to verify truth.’ See Luciana Parisi, ‘Reprogramming Decisionism,’ e-flux journal, 85 (October 2017): 3; and Parisi’s chapter, ‘Transcendental Instrumentality and Incomputable Thinking,’ in this volume.
24. ‘The methodology of reflexivity mirrors the geometrical optics of reflection, and that for all of the recent emphasis on reflexivity as a critical method of self-positioning it remains caught up in geometries of sameness; by contrast, diffractions are attuned to differences—differences that our knowledge-making practices make and the effects they have on the world.’ (Barad, Meeting the Universe Halfway, 71–72.)


Chapter 6

25. Marc Boumeester and Andrej Radman, 'The Impredicative City: or What Can a Boston Square Do?,' in Deleuze and the City, edited by Hélène Frichot, Catharina Gabrielsson, and Jonathan Metzger (Edinburgh: Edinburgh University Press, 2016), 46–63.
26. In the words of Gibson: 'An affordance is neither an objective property nor a subjective property; or it is both if you like. An affordance cuts across the dichotomy of subjective-objective and helps us to understand its inadequacy. It is equally a fact of the environment and a fact of behavior. It is both physical and psychical, yet neither. An affordance points both ways, to the environment and to the observer.' James Jerome Gibson, The Ecological Approach to Visual Perception (New Jersey: Lawrence Erlbaum Associates, 1986 [1979]), 129. Resisting the temptation to conceive of the affordance as an objective property, we might follow Turvey's proposal: 'Ask not what an affordance is; ask rather what it can do for you.' Michael T. Turvey, 'Quantum-Like Issues at Nature's Ecological Scale (The Scale of Organisms and Their Environments),' Mind and Matter 13, no. 1 (2015): 44.
27. Turvey, 'Quantum-Like Issues,' 35.
28. Simondon, Individuation in Light of Notions of Form and Information, 245.
29. Erin Manning, 'Toward a Politics of Immediation,' Frontiers in Sociology 3, no. 42 (2019): 6.
30. Simondon, Individuation in Light of Notions of Form and Information, 245.
31. Erich Hörl, 'The Ecologization of Thinking,' in General Ecology: The New Ecological Paradigm, translated by N.F. Schott, edited by Erich Hörl (London: Bloomsbury Academic, 2017), 1–74.
32. Rodrigo Nunes, Neither Vertical nor Horizontal: A Theory of Political Organisation (London: Verso, 2021), 152.
33. Deleuze and Guattari, A Thousand Plateaus, 50–52. Cf. Andrej Radman, 'Involutionary Architecture: Unyoking Coherence from Congruence,' in Posthuman Ecologies: Complexity and Process after Deleuze, edited by Rosi Braidotti and Simone Bignall (London: Rowman & Littlefield International, 2019), 61–86.
34. Isabelle Stengers, 'Thinking Life: The Problem has Changed,' in Posthumous Life: Theorizing Beyond the Posthuman, edited by Jami Weinstein and Claire Colebrook (New York: Columbia University Press, 2017), 329.
35. Félix Guattari, Schizoanalytic Cartographies, translated by Andrew Goffey (London: Bloomsbury, 2013 [1989]).
36. The concept of 'survey' is appropriated from Ruyer. Life is always in touch with its mnemonic potential, always 'surveying' themes for action, improvising on learned instincts, manifesting inherited abilities, growing, changing, and evading a reduction to purely physical function—that is, to actuality.
The living being 'is never "fully assembled"; it can never confine itself to functioning, it incessantly "forms itself".' See Raymond Ruyer, Neofinalism, translated by Alyosha Edlebi (Minneapolis: University of Minnesota Press, 2016 [1952]), 147. The concept of 'machinism' comes from Deleuze and Guattari: 'What we term machinic is precisely this synthesis of heterogeneities as such.' Deleuze and Guattari, A Thousand Plateaus, 330.
37. David Lapoujade, The Lesser Existences: Étienne Souriau, an Aesthetics for the Virtual, translated by Erik Beranek (Minneapolis: University of Minnesota Press, 2021), 38. Cf. Barry Smith, 'Mereotopology: A Theory of Parts and Boundaries,' Data and Knowledge Engineering 20 (1996): 287–303.
38. While the term 'deterritorialisation' is sufficiently intuitive, decoding may be confused with deciphering the secret of a code. When Deleuze and Guattari use the term, they mean 'undoing a code.' See Daniel W. Smith, 'Flow, Code, and Stock: A Note on Deleuze's Political Philosophy,' in Essays on Deleuze (Edinburgh: Edinburgh University Press, 2012), 171.
39. Bernard Stiegler, What Makes Life Worth Living: On Pharmacology, translated by Daniel Ross (Cambridge: Polity Press, 2013).
40. Brian Holmes, 'Guattari's Schizoanalytic Cartographies: or, the Pathic Core at the Heart of Cybernetics,' in Continental Drift (2009), https://brianholmes.wordpress.com/2009/02/27/guattaris-schizoanalytic-cartographies/.
41. Gilles Deleuze, 'Review of Gilbert Simondon's L'individu et sa genèse physico-biologique (1966),' translated by Ivan Ramirez, Pli 12 (2001): 43.
42. Inge Mutsaers, Immunological Discourse in Political Philosophy: Immunisation and its Discontents (London: Routledge, 2016).
43. Retention leads into and feeds protention which, in turn, rests and draws upon retention.
44. William James, Essays in Radical Empiricism (London: Longmans, Green, and Co., 1912), 96–97.
45. Georges Canguilhem, The Normal and the Pathological, translated by Carolyn R. Fawcett (New York: Zone Books, 1991 [1966]). Another mentor of Simondon's was Maurice Merleau-Ponty.
46. In the words of Patricia Reed: 'As a movement of non-adaptation, sociodiagnostics is the collective labour in making an incomplete, non-total picture of a particular world configuration intelligible and/or available to sensation, in a procedure that disproves the auto-reinforcing naturalization of necessity that stabilizes a social condition as complete and given (i.e., as unalterable)' (my emphasis). Patricia Reed, 'The Valuation of Necessity,' in Blockchains & Cultural Padlocks | Towards a Digitally Cooperative Culture: Recommoning Land, Data and Objects, edited by Jesse McKee and Rosemary Heather (Vancouver: 221a, 2021), 130.

47. Ronald Bogue, 'Natura Musicans: Territory and the Refrain,' in Deleuze on Music, Painting and Arts (New York: Routledge, 2003), 58.
48. Gilles Deleuze, Nietzsche and Philosophy, translated by Hugh Tomlinson (New York: Columbia University Press, 2006 [1962]), 42.
49. Clare Carlisle, 'The Concept of Habit,' in On Habit (London: Routledge, 2014), 21 (emphasis in the original).


50. Eva Jablonka and Marion J. Lamb, Evolution in Four Dimensions: Genetic, Epigenetic, Behavioral, and Symbolic Variation in the History of Life (Cambridge, MA: MIT Press, 2005).
51. Antoinette Rouvroy, 'Adopt AI, Think Later: The Coué Method to the Rescue of Artificial Intelligence' (2020).
52. Wendy Hui Kyong Chun, Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition (Cambridge, MA: The MIT Press, 2021).
53. Bernard Stiegler, 'The Proletarianization of Sensibility,' Lana Turner Journal 4 (2012).
54. Félix Guattari, 'Drive, Black Hole' (2 October 1980), translated by Taylor Adkins, https://fractalontology.wordpress.com/2020/07/22/new-translation-of-guattaris-seminar-drive-black-hole/.
55. Bernard Stiegler, Technics and Time, 1.
56. Robert A. Gorny and Andrej Radman, 'From Epiphylogenesis to Generalised Organology,' in The Epiphylogenetic Turn and Architecture: In (Tertiary) Memory of Bernard Stiegler, Footprint 16/1, no. 30 (Delft: Architecture Theory Chair in partnership with Jap Sam Books, 2022), 3–19.
57. Gilles Deleuze, 'Leibniz' (1980), Cours Vincennes, translated by Charles J. Stivale, https://www.webdeleuze.com/textes/50.
58. Simondon, Individuation in Light of Notions of Form and Information, 17. According to Jean-Hugues Barthélémy, the thesis expounds a way of overcoming the opposition between subject and object, which is the definitive ground of all the classical oppositions such as empiricism and innateness, idealism and realism, and dogmatism and scepticism. See Jean-Hugues Barthélémy, 'Individuation and Knowledge: The "Refutation of Idealism" in Simondon's Heritage in France,' SubStance 41, no. 3 (2012): 60–75.
59. Gilles Deleuze, 'Spinoza and the Three "Ethics",' in Essays Critical and Clinical, translated by Daniel W. Smith and Michael A. Greco (Minneapolis: University of Minnesota Press, 1997 [1993]), 151.
60. Deleuze and Guattari, A Thousand Plateaus, 44.
61. A similar attitude is reflected in the title of Harry Francis Mallgrave's book From Object to Experience: The New Culture of Architectural Design (London: Bloomsbury Visual Arts, 2018).
62. Stengers, 'Introductory Notes on an Ecology of Practices,' 187.
63. Thomas J. Lombardo, The Reciprocity of Perceiver and Environment: The Evolution of James J. Gibson's Ecological Psychology (Hillsdale: Erlbaum, 1987), 297.
64. Rem Koolhaas, Delirious New York: A Retroactive Manifesto for Manhattan (New York: Oxford University Press, 1978).
65. Gilles Deleuze and Félix Guattari, Anti-Oedipus, translated by Robert Hurley, Mark Seem, and Helen R. Lane (New York, NY: Penguin, 2008 [1972]), 17–18.
66. Rem Koolhaas, "'Life in the Metropolis' or 'The Culture of Congestion'," Architectural Design 5 (1977): 319–25.
67. To borrow the expression from Terrence Deacon's book subtitle. See Terrence W. Deacon, Incomplete Nature: How Mind Emerged from Matter (New York and London: W.W. Norton & Company, 2012).


68. The Architectural Association School of Architecture's website hosts a video, dating back to March 14, 1976, of the young Koolhaas pitching the idea at its event. Rem Koolhaas and Elia Zenghelis—OMA, https://youtu.be/ZXtyrp340gY.
69. Simondon, On the Mode of Existence of Technical Objects, 58.
70. Bernard Stiegler, Technics and Time 4: Faculties and Functions of Noesis in the Post-Truth Age, translated by Daniel Ross (2021), 130–131, https://www.academia.edu/58785373/Stiegler_Technics_and_Time_4_Faculties_and_Functions_of_Noesis_in_the_Post_Truth_Age.
71. Simondon, On the Mode of Existence of Technical Objects, 252.
72. Stiegler, Technics and Time 4, 131.
73. Félix Guattari, 'Architectural Enunciation,' in Schizoanalytic Cartographies, 231–39.
74. As Reed put it: 'Sensitivity to information—as a catalyst for transductive processes—entails the sensitivity for reconfiguring frameworks of thought and making new commitments, not submitting thought to normatively encoded probabilistic ends that merely rehearse the game's consensual axioms, but the capacity to invent new rules (frames of reference) that reconfigure the governing codes of the game-space as such.' Reed, 'The Valuation of Necessity,' 159.
75. Lapoujade, The Lesser Existences, 35.
76. Ibid., 56 (emphasis in the original).
77. Étienne Souriau, L'Instauration philosophique (Librairie Félix Alcan, 1939), 366.
78. Étienne Souriau, 'On the Work to Be Made,' in The Different Modes of Existence, translated by Erik Beranek and Tim Howles (Minneapolis: University of Minnesota Press, 2021 [1943]), 225.
79. Achille Mbembe and Nils Gilman, 'How to Develop a Planetary Consciousness,' Noema Magazine (2022), https://www.noemamag.com/how-to-develop-a-planetary-consciousness/.
80. Rouvroy, 'Adopt AI, Think Later.'
81. Parisi identifies a new imperative that she calls 'technological decisionism.' It values making a clear decision quickly more than making the correct one. For decisionism, what is most decisive is what is most correct. See Parisi, 'Reprogramming Decisionism,' 1.
82. Stiegler, Technics and Time 4, 277–78.

Chapter 7

Computation and Material Transformations Dematerialisation, Rematerialisation, and Immaterialisation in Time-Based Media Oswaldo Emiddio Vasquez Hadjilyra

Continental philosophy, cultural studies, and media theory have been prone to multiple 'turns' and 'returns' in recent decades, in what now feels like a theory vertigo. This chapter aims to reorient, recalibrate, and contextualise the 'computational turn' along with the 'turn to matter.' It first investigates a prehistory of computation and its relationship to numbering, timekeeping, and physical matter, and then speculates on alternative modes of computation found in different media such as sound, photography, video, and media art. I find in these cases exemplifications of the material transformations that media undergo in our affective, and their effective, encounters with the world, and argue for media's instrumental capacity to measure and compute the world. These accounts ultimately aim at reassessing the viability of new materialism in the face of an impending ubiquitous computation that digitally mediates our experience of the world, and at renegotiating those mediations away from essentialist divisions and into a more active role in transforming, synthesising, and even performing matter, digital or not.

Drawing on a Heideggerian account of the history of philosophy and science, in a paper presented at the 2010 conference The Computational Turn, held at Swansea University,1 Yuk Hui offered an early attempt at deriving an image of the computational, understanding it as networks formed by 'patterns and repetitions of digital matter,'2 but without explaining what the properties


of such matter would be. Instead, reflecting on Heidegger’s essay ‘The Age of the World Picture,’ Hui invites us to approach the computational ‘from the question of Weltbild, Heidegger’s world picture,’ which is, according to Hui, Heidegger’s ‘fundamental, intuitive and immediate understanding of the world’ through which we ‘reduce the unseen to graspable entities.’3 Following Heidegger, one way to understand modernity is through the way that technology turned humans into beings that enframe (Gestell) and represent the world to themselves and for their use. Hence, culture’s periodisation becomes one of confrontation between world pictures or worldviews. By focusing on the world picture of the computational and anticipating his own proposition for the network as its image, Hui omits Heidegger’s elaboration on modern physics and its role in grounding the world picture of modern science. Considering physics’ own relationship to the ‘mathematical’—which for Heidegger is not the same as the numerical but nonetheless is reserved for it, since numbers represent the most striking cases of the ‘always-already-known’4—and given the title and nature of the conference, this omission is worth reflecting upon. Heidegger’s world picture, as an enframing that originates within European culture and carries its own inevitable destiny, must also on some level carry its premodern conditions. According to Hui, since Heidegger does not provide the details of how the world picture expresses ‘a differentiated stage of development of human knowledge’ that would, for example, explain ‘the shift from the ancient to the medieval,’ the overall inconsistency in Heidegger’s theory of the world picture becomes problematic. Hence, as a complement to Heidegger’s omission, Hui contrasts the narrative of mechanisation in Eduard Jan Dijksterhuis’s work The Mechanization of the World Picture. 
Rather than merely assuming that the world picture is a consequence of modern science and technology, and therefore embedded in the metaphysics of one's time—another error that Heidegger commits—Hui wants us to conceive, in a temporal unity, the redoubling of the image of the mechanisation of the world that takes place. The image of the world as machine and the machine itself exist in a co-constitutive relationship: 'the world picture determines the thought behind science and development' and 'science and technology development reinforce and crystallize the world pictures.'5 Hence, the world picture, in its relational redoubling, is repeatedly undermined by scientific enquiry and can only be disrupted by a radically new world picture that replaces it. The new world picture's strength depends on its own 'capacity for change'6 without eroding. According to Hui, quantum mechanics provided the conditions for such a rupture, since the distinctness found in the laws of classical mechanics was being diluted by new, indeterminate laws. In 'Quantum Theory as Technology,' Taylor Carman provides a more extensive depiction of Heidegger's general position and the impact that modern physics had on his theory of enframing. In his essay, Carman argues that:


Heidegger’s engagement with Heisenberg and his reflection on quantum theory . . . led him to refine his account of the distinctive way in which science and technology order entities by rendering them knowable, measurable, and ultimately susceptible to total organization and manipulation. What is essential to both science and technology, on his later view, is not the positing of objects over against subjects, but the total mathematical ordering of everything—including ourselves—into optimally integrated standing reserve.7

Hence, quantum theory reframes the subject-object dichotomy ‘as an effect or an expression of something more fundamental, namely, the less differentiated, more deeply interconnected “objectness” of all things,’8 and for Heidegger that relationality is a culmination of modern rationalist philosophy and science, since, in Carman’s words, ‘what it aspires to is a purely calculative representation of nature as mathematized objective reality.’9 All of these points raise the question: can we conceive of a mode of computation, even under the image of the network, without falling for a pre-set ‘relationality,’ a relationality which would appear at once in line with Heisenberg’s position that quantum mechanics renders redundant the subject-object relationship and Heidegger’s own position that quantum mechanics, in all its contingency, further reifies its originary calculative ambition in that relationality itself? If computation and the digital further accentuate said relationality, an argument that informs Hui’s later work on relational materialism, it is Hui’s turning toward the future of the mechanisation of the world picture for a new image of computation that sets the backdrop for the main argument of this chapter. That is, if relationality precedes its accentuation in digital conditions, then rethinking what computation has been historically in its relationship to numbers and physical matter would reframe the ongoing discourse around new materialism. 
This rethinking would offer an expanded notion of computation's operation—beyond number and out of time—that one can best attest to not necessarily in machines and calculators, but in time-based media, their instrumental capacity, their techniques, and the material transformations that they undergo as they carry their own, albeit differently understood, computations.10

COMPUTATION BEYOND NUMBER AND OUT OF TIME

For an expanded concept of computation with an alternative to the lineage of the mechanised image of the world, we can enquire into the conditions that made possible the level of abstraction and generality that could, at least operationally, instantiate everything material into numerical relationships within a mechanistic edifice. In terms of numerical computation, as a science


and an art both distinct from geometry, one can trace an alternative history in which numbering and its conceptual decoupling from the material world in symbols explain the transitions from antiquity, through medieval times, and into modernity and its culmination in modern physics. In fact, such a history would be told by one of Heidegger's own students, Jacob Klein, in his Greek Mathematical Thought and the Origin of Algebra.11 As Leo Strauss, one of Klein's closest friends, states in a prologue dedicated to Klein's birthday: 'Klein was the first to understand the possibility which Heidegger had opened without intending it: the possibility of a genuine return to classical philosophy . . . with open eyes and in full clarity about the infinite difficulties which it entails.'12 By turning toward the numbering techniques of ancient Greece, Klein would take heed of the Greek concept of arithmos (αριθμός), understood as a discrete counted assemblage composed of two or more elements of 'ones.' In general, when counting engaged pure units of thought it was deemed the science of arithmetic, and when calculating things that presupposed an established relationality among units, it fell under the operations of the art of logistic. The eventual transformation of arithmos into 'number'—a symbol designed to designate magnitude in general and to operate in equations—has been assumed as a fact by modern science. For Klein, this conceptual rupture needs to be examined 'from the view of its own presuppositions,'13 not its modern understanding, and so he demonstrates how critical moments in the transformation of counting would also contribute to the foundations of symbolic abstraction in the form of algebra and its application in mathematical physics.
In resonance with Heidegger’s views, but also with Husserl’s, who was another of his teachers, Klein illustrates that at the core of modern mathematical physics we find an inseparable symbolic order which, in its representation of nature, becomes identical to nature itself. Following Husserl’s technique of desedimentation,14 a form of philosophical archaeology, Klein delves into a dense and intricate analysis of the numbering practices that formed the basis for the Diophantine technique, one of the first such techniques to introduce shorthand notation with variables in its operations, and which was reinterpreted and redistributed by Franciscus Vieta in the Renaissance.15 These shifts and transformations allowed for the algebraisation of the world into a universal science and for the ‘symbolic unreality,’ as Klein described it, that constitutes modern science. Joseph K. Cosgrove’s commentary on Klein’s work exemplifies this ‘unreality’ through Hermann Minkowski’s formulation of the ‘space-time interval,’ which is made possible through the symbolic coagulation of heterogeneous measurements, which it achieves by treating them, in its calculations, as dimensionless, and then, in its outcome, accounting them as real. Considering the exchange between the real and the


symbolic, the heterogeneous and the homogeneous, Cosgrove, citing physicist David Bohm on the conceptualisation of the postulate of an 'absolute world' that Minkowski associates with relativity, explains that 'the analysis of the world into constituent objects has been replaced by its analysis in terms of events and processes, organized, ordered and structured so as to correspond to the characteristics of the material system that is being studied.'16 To that extent, Cosgrove notes:

Corporeal beings themselves . . . are to be replaced by 'events' defined in terms of 'order' within a symbolic calculus, this symbolic representation 'corresponding to' rather than directly representing the physical world. In this systematic interpretation of space-time, we have the exact counterpart in mathematical physics of the symbolic representation of number inaugurated by Vieta. Rendered systematically, the real entity becomes a nodal point or terminus, as it were, in a nexus of relations determined by the method of representation.17

Could the articulation of real entities as 'nodal points' explain the prefiguration of computation as a network and the dissolution of subjects and objects, with computation being a systematic ordering and a 'method of representation' corresponding not to the physical world but to its logico-mathematical symbolic representation, which accelerates even further in its electronic automatisation? And if so, what other modes of representation are there that would challenge and provide new images of both computation and the world?

As we saw earlier, the fact that Klein still prescribed a return to ancient science for a renewed understanding of its methods,18 even though the worldview of quantum mechanics had disrupted that of classical mechanics, is indicative. In his enquiry into a history of computation, The Ordering of Time: From the Ancient Computus to the Modern Computer,19 Arno Borst also investigates how medieval Europe reckoned with time, what it inherited from antiquity, and how much it contributed to the modern period. In a similar vein to Klein's research, Borst engages in a lexico-historical investigation of the word 'computus,' its various social and ideological interpretations, and the instruments used for its measurement. From the early Roman practice of the 'computus' to, at least, the first mechanical clock, one finds a recursively self-correcting procedure of astrological documentation in service of calculating, estimating, and projecting the future. Computation, as the concretisation of time, was often the result of real economic and political forces, through which the very machines devised to improve those calculations ultimately replaced their own measured observations, becoming themselves time-producing instruments.20 For all the various meanings that computation had assumed from ancient Rome to the early Renaissance, ranging from computistical lookup tables to


mnemonic psalms, from time-reckoning to time-measuring devices, and all the conflations of scales of magnitudes sieved through political and economic needs, for our purposes what matters most is Borst's observation that the impact of the first mechanical clock (c.1300–1350) is often overstated by scholars. That is, even when there was a mechanical way to calculate time, timekeeping and its computation were still under the control of the church and had to undergo a prolonged period of moral 'symbolisation' in order to alter the time consciousness of the world. Henryk Grossman similarly reflects on the mechanical clock and further attests that its contemporaries admired not so much its timekeeping capabilities as its 'automatism': the ability of the machine to carry out the 'result of [all] the preliminary detailed computations made by their designers,'21 from digital running gears transmitting their counting to an analogue dial that continuously scans the disk. Therefore, 'in the scholastic world of the irrational,' those computations would come to signify 'the birth of rationality' as a 'mechanism logically worked out to the last detail, binding all the dead component parts of the machine and all the partial motions transmitted from one part to another into a significant, intelligible and, so to speak, living whole.'22 The clock, then, returning to Borst, became 'a symbol for a measured way of life in the midst of chaotic circumstances.'23 In other words, along with the mechanical clock came a form of edification, and with the improvement of measurements a new form of constancy—a universal constant—attested in the 'newly discovered regularity of the planetary movements in God's machina mundi.'24 Hence, 'absolute time,' as it is later understood in modern science, would appear less as an immediate result of time's mechanisation and more as a process of moral symbolisation paired with a fascination for time's computational automation.
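The 'computus' that gives Borst's history its name was, concretely, the church's procedure for reckoning calendar dates, above all the date of Easter, from interlocking solar and lunar cycles. Purely as an illustrative aside (a sketch of the anonymous Gregorian algorithm standardised centuries later, not of any medieval method Borst describes), that reckoning survives today in algorithmic form:

```python
# Sketch of the anonymous Gregorian Easter computus (Meeus/Jones/Butcher form).
# Illustrative only: a late standardisation of the reckoning, not a medieval method.

def computus(year: int) -> tuple[int, int]:
    """Return (month, day) of Gregorian Easter Sunday for the given year."""
    a = year % 19                          # position in the 19-year lunar (Metonic) cycle
    b, c = divmod(year, 100)               # century and year-within-century
    d, e = divmod(b, 4)                    # leap-day corrections for the solar calendar
    f = (b + 8) // 25
    g = (b - f + 1) // 3                   # correction keeping the lunar count in step
    h = (19 * a + b - d - g + 15) % 30     # locates the ecclesiastical full moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7   # days from that full moon to the next Sunday
    m = (a + 11 * h + 22 * l) // 451       # rare correction for late full moons
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

# e.g. computus(2024) -> (3, 31): Easter Sunday fell on 31 March 2024
```

The sketch only underlines the point of the passage above: the computus is a procedure for reconciling heterogeneous cycles in order to project future dates, long before any mechanical calculator existed.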
From the assumed regularity of motion, its stable equilibrium and its mechanical translation, a similar argument is further pursued by Boris Hessen, who reads Newton's Principia as a work that distilled the economic and technological conditions of its time, while also accommodating Newton's theological beliefs.25 Hessen argues that, since for Newton motion became a mode and not an attribute of matter, the regularity and regulation of motion implied that matter had to be inert. But for an absolutely inert matter a new frame of reference would have to be devised, since the forces acting upon objects to alter or retain their movement now resided in an empty container. That empty container would find its articulation in the newly conceived notion of an 'absolute space.' That view of space would also problematically necessitate the introduction of an external motive force, and for Newton, Hessen argues, that force is God, and 'absolute space,' God's own 'sensorium.' Klein's interrogation of the assumptions underlying modern physics, Borst's demonstration of computation as a history steeped in a politics of time, and Hessen's materialist reading of Newton's Principia all call attention


to the many contemporary reformulations of materialism and computation, especially as many of these seem oddly similar to their predecessors.26 What a pre-modern, expanded history of computation suggests is that, because of the symbolic abstraction and symbolisation present even in our inherited modes of investigation, the way we differentiate science from the humanities, social sciences, and ecological thinking seems at times to lack the nuance necessary to indicate an awareness of one's own predispositions.

In his commentary on 'vibrant materialism'—a variant extremely popular in art institutions and amongst practitioners—Andrew Cole effectively summarises the main shortcoming of its method: by not taking the extra dialectical step of asking whether one's theory is a function of one's historical moment, vibrant materialism becomes a 'method for method's sake,'27 which, as we have seen with Klein, is also a characteristic of modern science in general. But even in more carefully articulated methodologies, like Vicki Kirby's investigation of new materialism, we find the all-too-common bifurcation between the 'two cultures,' where science follows 'a methodology [that] involves an instrument of measurement and observation, a tool that bridges the purported gap that separates the interpreter (subject) from the interpreted (object),' and culture follows an 'interpretive model [that] discovers its object within the productive force of the representation process itself, which means that the object of investigation will always, and necessarily, gather its legibility, or meaningfulness, through webs of subjective and cultural significance.'28

Speculating on computation as a process of temporal instrumentation that produces its own temporality, as a machine that oscillates between both cultures and permeates lived experience, our question then becomes one of how we are to conceive computation, to negotiate its historical and material life, so that it 'triumphs over
mechanism'29 and escapes its calculative predestiny. One possible way is to conceive of alternative ontologies of number and numbering techniques for a renewed understanding of the sub-set of digital computation and binary logic.30 Keeping in mind Klein's analysis of arithmetic and logistic and Borst's elucidation of the various stages of computation that served as a function of the Church, the State, and techno-capitalism, we find some leads in Deleuze and Guattari's conviction that the so-called numbering number can be far less dehumanising and unfaithful to materiality's richness than the use of 'number as a numeral, as a statistical element, [which] is proper to the numbered number of the State.'31 It is important to note that the structuration of numerical data into statistical science has been a function of the State: literally, statistical enquiry is the science of the State par excellence,32 and even in its present forms, in data analysis and machine learning, statistics, regression analysis, and their algorithms still operate under the same 'symbolic unreality' of mathematical physics. What, then, would a computation of 'numbering number' be?


Chapter 7

Moving beyond its logico-mathematical expression in digital computation, I would like to think about an expanded notion of computation in three processes where media undergo material and/or instrumental transformations. Drawing direct inspiration from Alexander Nagel’s methodology in his Medieval Modern: Art Out of Time,33 whereby the juxtaposition of radically different periods serves more than just ‘legitimating precursors for modern practice’ and instead ‘[activates] a wider set of reference points that cannot be arranged chronologically,’34 I will look at moments of interest where I think certain modes of computation, in an expanded and variegated form, shed light upon their symbolic abstraction and even engage with it materially by transforming themselves along the way. These moments are (1) in one of the first instances of dematerialisation, as it takes place in a congealed history of harmony, mathematics, and philosophy, in the story of Pythagoras and the forge; (2) in the way that filmmaker Harun Farocki speaks about digital images as processes of rematerialisation, his theorisation of photography as a measuring device and his handling of technical images in the montage to suggest a certain instrumentality; and (3) in immaterialisation as it is developed in the exhibition Les Immatériaux by Jean-François Lyotard, for which he constructs the concept of the immaterial to speak about a new materiality of the digital.35 The rationale behind this selection is that sound, video, and new media art are predisposed to capture a more general immediate process of materialisation—that is, matter as it computes—through which we become co-constitutive participants in its synthetic execution.

Dematerialisation

As part of his historical investigation of the various usages of the term ‘compute’ and its derivatives, Borst turns to Boethius.
Boethius, as well as having translated several ancient Greek texts into Latin and being well familiar with Pythagoras’s and Plato’s ideas on arithmetic and logistic, had also written the most influential book on arithmetic of its time, a book that would shape the Middle Ages; the educational curriculum of the quadrivium is often ascribed to him. In his book Fundamentals of Music, Boethius referred to computare in its more conventional usage as calculation, even when referring to the various operations on tones and semitones that make up the diapason (octave).36 As Borst has shown, the same term would be used interchangeably in other calculations. Such endeavours and overlaps in meaning were common, especially for those following the quadrivium, which in the Middle Ages would have also included the ‘computus.’ In that same book, Boethius recounts a well-known myth that describes the way Pythagoras derived the ratios that create consonances, which in turn would shape Western tonality. After assessing various instruments and their

Computation and Material Transformations


environments as unreliable, Pythagoras decides to move away from human hearing and situated knowledge toward the ‘weight of rules’; that is, he sought to acquire, through reason alone, a full understanding of the criteria of harmony. During that investigation, the story goes, ‘by a kind of divine will’ Pythagoras overheard, outside a forge, the hammering of blacksmiths.37 From the disparate beatings he discerned a common consonance and, to his mind, an underlying hidden principle. Through various permutations of force and of the participation and exclusion of blacksmiths, Pythagoras concluded that the consonance was the result of the hammers’ weight ratio. In other words, the principle was to be found not in men or their actions, but in their instruments. With the τετράκτυς as a principle,38 Pythagoras retained the arithmetic ratio of four values and experimented with different weights to test his hypothesis in strings’ tensions, different pipe lengths, and even with liquids in glasses that he would then strike, with the same and different material rods. Despite the variability of his materials, the arithmetic ratio yielded a pleasing consonance that persisted across media. Recalling Klein’s work, we can see that a key affordance of numerical measurement was revealed here, operating in a pre-algebraic fashion, where material and media vary to solve the same harmonic equation. What is more, even though Pythagoras was initially drawing away from human hearing, when that consonance emerged and kept reappearing, he regained his trust in his imperfect hearing, just enough to document the consonance he perceived, extend its laws to different materials, and generalise it to an ontological doctrine: nature can be expressed in numbers—nature can be computed. There was, however, a conscious omission in Pythagoras’s calculations, which Boethius mentions in passing.
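The arithmetic at the heart of the myth can be sketched in a few lines. The hammer weights 12, 9, 8, and 6 are the values handed down in the Boethian tradition (an assumption here, since the chapter quotes no figures); reducing each pair of weights to lowest terms recovers the Pythagorean consonances, regardless of the medium that sounds them.

```python
# The weights 12, 9, 8, 6 follow the traditional telling of the myth
# (an assumption; the chapter itself gives no numbers). Reducing each
# pair of weights to lowest terms yields the Pythagorean consonances.
from math import gcd
from itertools import combinations

weights = [12, 9, 8, 6]
names = {
    (2, 1): "diapason (octave)",
    (3, 2): "diapente (fifth)",
    (4, 3): "diatessaron (fourth)",
    (9, 8): "tone",
}

ratios = []
for a, b in combinations(weights, 2):
    g = gcd(a, b)
    ratio = (a // g, b // g)
    ratios.append(ratio)
    print(f"{a}:{b} -> {ratio[0]}:{ratio[1]}  {names.get(ratio, 'dissonant')}")
```

Every pair drawn from the tetractys values lands on a named interval; a fifth hammer of arbitrary weight, as the next paragraph recounts, would not.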
As Daniel Heller-Roazen incisively detects in his reading of Boethius, there was a fifth hammer, which was in dissonance with all the rest, and which Pythagoras simply opted to exclude from his computations. For Heller-Roazen, this omission has important ontological and epistemological implications. His primary question is: if the fifth hammer ‘was discordant with all the others,’ then ‘what is this “all” if something—if even only one thing—sounds in utter dissonance with it?’39 That is, what image of nature and matter did Pythagoras carry in his mind to simply negate his undeniable perception of a ‘being without measure’?40 As we already saw with Klein, what made possible the transposition of ratios across objects, dimensions, and materials, extending their usage from discrete things to spatial geometric measurements, would eventually set the conditions for their symbolic abstraction. The concept of αριθμοί allowed Pythagoras and his followers to ultimately deduce a unit of harmonic construction in the single tone and, to that extent, to construct an instrument—the monochord (or κανών)—that would reductively recreate his ratio experiments, and with which a rationalisation of the world would take place. Hence, to conclude,


what the myth of Pythagoras suggests is that from the very outset of philosophy a decoupling from the material world, a dematerialisation via enumeration, was needed for computation to emerge, and, inversely, that this decoupling from contingent phenomena would culminate in units, metrics, and instruments that would reconstruct the order of the world.

Rematerialisation

In a 2010 ciné-fils interview, the late filmmaker and theorist Harun Farocki recounts how in his early years he was convinced that the haptic experience of sixteen-millimetre film was very important to his work, because he could touch and measure his material with his own hands, just as the typewriter, he adds, gives a life-size measure of the page, with the stack of pages piling up giving a measure of one’s labour.41 Having rid himself of all these little delusions, as he calls them, he admits that working on film material was much better on a computer. Rather than making brute physicality its main characteristic, Farocki displaces the question of the material from the material carrier of the image to the process of familiarising himself with the images and the semi-conscious process of activating that material in the editing process. For Farocki, any material, found or other, must undergo a process of selection, technical transformation, and internalisation in order to become material. That is how a director activates the material of the image. In that same interview, Farocki refers to the process of digitalisation of images as one of rematerialisation, and treats the digital as a new material out of which new types of images as well as spectators emerge (e.g., a computer that ‘sees’ thermal effigies). What grants these images a material reality, in all their artificialness, are the implications they have in the real world, a theme that Farocki explored extensively throughout his work.
As a case in point, in his film Inextinguishable Fire (1969), the opening sequence finds Farocki leaning on a desk, reading through the testimonies of napalm bombing survivors from the Vietnam War Crimes Tribunal. In a direct confrontation with the viewer, Farocki asks how the repercussions of napalm bombing are to be depicted, acknowledging that the viewer has the agency to simply look away and ignore the image and its politics. As the camera zooms in, Farocki picks up a lit cigarette from outside the frame and puts it out on his own arm. During this performative gesture, he offers a numerical comparison of the burning temperatures of a cigarette and napalm. More effective than indexical images of napalm-burned skin, this numerical ratio computes, as it were, in a spectator already trained to look at the world through numbers, what would have otherwise caused repulsion. The following sequence, showing a dead lab animal burning on napalm, gives a sense of how long napalm takes to burn out on skin. The provocation here


is that the spectator would not look at human skin burning but may observe the animal burning and associate the previous computation with the ongoing image, thus further perverting both spectatorship and one’s modern relationship to science. It is this technical property of images, through dialectical combinations, that Farocki exploits and turns on its head to serve his critique of technical images. Facing the impossibility of representing war, where pain cannot simply be inferred through numerical ratios, the action of the burning cigarette must take place to effect it. What renders itself visible and invisible, in and out of the frame and through the devices used to document the world, is one of the recurring themes in Farocki’s work. A similar enquiry into what is rendered invisible during war takes place in Farocki’s essay ‘Reality Would Have to Begin,’42 and in his film Images of the World and the Inscription of War (1988), where he recounts the story of architect Albrecht Meydenbauer, a building surveyor for the Prussian government, and his role in the birth of photogrammetry in 1858. After a near-death experience while measuring the facade of a cathedral, Meydenbauer considered the possibility of using photographs as measuring devices that would replace the labour of measuring by hand. Like early Renaissance perspectival methods of tracing outlines of objects on stretched transparent papers, or

Figure 7.1.


Figure  7.2. Stills from Harun Farocki, Inextinguishable Fire (1969). Copyright Harun Farocki GbR.

even Brunelleschi’s first perspective images done with mirrors, a photograph, with the assistance of optics, bends light to the laws of projective geometry and maps three-dimensional objects onto a flat plane. This correspondence transforms the camera into a measuring device. Citing art historian Erwin Panofsky, Farocki emphasises that, since perspective, observation could be understood either in terms of ratio and objectivism or in terms of chance and subjectivism. When an image becomes a measuring device, chance and subjectivity have to be ignored. In Farocki’s words:

To conceive of a photographic image as a measuring device is to insist on the mathematicality, calculability, and finally the ‘computability’ of the image-world. Photography is first of all analog technology; a photographic image is an impression of the original, an impression at a distance, made with the help of optics and chemistry. Vilém Flusser has remarked that digital technology is already found in embryonic form in photography, because the photographic image is built up out of points and decomposes into points. The human eye synthesizes the points into an image. A machine can capture the same image, without a consciousness or experience of the form, by situating the image points in a coordinate system. The continuous sign-system image thereby becomes divisible into ‘discrete’ units; it can be transmitted and reproduced. A code is


thus obtained that comprehends images. This leads one to activate the code and to create new images out of the code language. Images without originals become possible—generated images.43
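The measuring logic Farocki describes can be sketched with the pinhole model of projective geometry. This is a toy reconstruction, not Meydenbauer’s actual procedure, and all figures below are hypothetical: because object size, image size, distance, and focal length sit in one fixed proportion, a measurement taken on the plate can be inverted into a measurement of the world.

```python
# Toy pinhole-camera sketch of the photograph as measuring device.
# Not Meydenbauer's actual procedure; all numbers are hypothetical.

def project(object_m, distance_m, focal_m):
    """Similar triangles: size of the object's image on the plate."""
    return object_m * focal_m / distance_m

def measure(image_m, distance_m, focal_m):
    """Invert the projection: recover the object's real size."""
    return image_m * distance_m / focal_m

facade = 31.0  # metres; the 'true' height of a hypothetical facade
on_plate = project(facade, distance_m=62.0, focal_m=0.30)
recovered = measure(on_plate, distance_m=62.0, focal_m=0.30)
print(on_plate, recovered)
```

Once the geometry is fixed, measuring the image is measuring the world; chance and subjectivity drop out of the relation.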

In Towards a Philosophy of Photography, Flusser develops the question of photographs as encodable by arguing that the ‘green of a photographed field . . . is an image of the concept “green” [that] the camera (or rather the film inserted into it) is programmed to translate.’44 But, he adds, ‘between the green of the photograph and the green of the field, a whole series of complex encodings have crept in.’ In fact, Flusser continues, the higher the fidelity of a photograph, the more untruthful it really is. The same holds for all the other elements of a photograph, but rather than getting lost in these infinite sequences of decoding, which would make critique and reflection nearly impossible if one were to trace them, Flusser offers a criterion by which one can critique a photographic image: did the photographer succeed in subordinating the camera to human intention? Photography, which for Jean-François Lyotard operates as a monocular vision, ‘brings to its end the program of metapolitical ordering of the visual and the social’45 by the camera’s very own construction. Tasks that used to demand ‘huge experience . . . are now programmed into the camera thanks to its refined optical, chemical, mechanical and electronic abilities.’46 Returning to Flusser, the ‘program’ of the camera increasingly succeeds in redirecting human intention in the interests of the camera’s function and, conversely, cameras are programmed purely for the transmission of information. On the question of information’s distribution, Flusser argues that while information in the universe disintegrates progressively according to the second law of thermodynamics, humans struggle against this entropy by receiving, storing, passing on, and deliberately creating information.
‘This specifically human and at the same time unnatural ability is called “mind” and culture is its result.’47

Immaterialisation

For the occasion of Les Immatériaux, a landmark 1985 exhibition of new media art at the Centre Pompidou in Paris co-curated by Lyotard, in an interview with Bernard Blistène Lyotard confesses that the exhibition’s original title, ‘New Materials and Creation,’ had undergone several changes before resulting in the ‘monstrous neologism—The Immaterials.’48 Troubled by the question of ‘What was “new” in the so-called New Materials,’ Lyotard explains that the term ‘materials’ itself had undergone so many shifts that it called for a new point of view. That view was to immaterialise it.


Hence, by tracing the Sanskrit root word mât (‘to make with the hand, to measure, to construct’ in material) and grouping works accordingly, Lyotard invited the participants to extend the meaning of the word material to cover referents (matières), hardware (matériels), matrices (matrices), and even maternity (maternité). Without going into the details of how those categories relate and the roles they played in the exhibition, what is important to draw from this decision is that the distribution of materials as immaterials across these domains ultimately aimed at decentring the human from culture by problematising the permeability of the immaterial and man’s relationship to nature. In a collection of essays co-edited by Yuk Hui titled 30 Years After Les Immatériaux,49 Hui explains that the immaterial aimed to form a resistance against the then modern conception of materiality in which, under the Cartesian programme, man desires to rule over nature. Just like the designation of Lyotard’s ‘post-modern,’ the immaterial was an attempt to liberate man from that same modern paradigm. Hence, to refer to these immaterials as materials was not simply done out of appreciation for new technologies, but rather as a provocation directed toward the human desire to master nature. Whereas Flusser called for mastery over the program of a tool, Lyotard wants to subordinate technology to nature: hence the ambiguity of an immaterial material. In Lyotard’s words, cited by Robin Mackay:

Immaterial materials, albeit not immaterial, are now preponderant in the flux of exchanges, whether as objects of transformation or investment, even if only because the passage through the abstract is now obligatory. Any raw material for synthesis can be constructed by computer and one can know all of its properties, even if it does not yet exist or no longer exists.50

Mackay explains Lyotard’s position by assigning to the human the role of another ‘transformer’ that fleetingly generates immaterials by ‘extracting and contracting flows of energy-information’51 according to one’s own rhythms. In Mackay’s words: ‘we are synthesizers among synthesizers, and not the destination and arbiter of all matters.’52 And, in Lyotard’s words, ‘if there is some greatness in man, it is only insofar as he is one of the most sophisticated, most complicated, most unpredictable and most improbable interfaces.’53 Therefore, for Lyotard, the post-modern human synthesises; they do not create. The immaterial aimed at resisting any notion of a human role as creator and, once again resonating with much current post-human thinking, Lyotard saw these new technologies as inciting the possible decline of humanism. Out of these new technologies would emerge a metaphysics of interaction, whose materiality would be grounded in computation and the digital. By interaction Lyotard means more than just a dialogue between


transmitter and receiver. Instead, he refers to an ontology whereby an infinite circulation of messages, in constant translation, takes place. It is a circulation in which the human is not the origin of the messages, but is instead sometimes the receiver, sometimes the referent, sometimes a code, sometimes a support for the message, and sometimes even the message itself. In line with the definition of mât, this malleability suggests that human identities would no longer be fixed, and that with the advance of scientific, philosophical, literary, and artistic research, the human will be able to occupy many places in this new structure. In an interim internal report written during the preparations for Les Immatériaux, titled ‘After Six Months of Work . . . ,’ Lyotard further associates the immaterial with the immediate. In contradistinction to space, which had been modernity’s project, mastery over time implies the abolition of any delay. The conquest of space, as executed by the modern mind, translates the exploitable properties of an object into language—say, algebraic notation—and retranslates them back into geometrical properties. Because time is not a material itself, but the form of immateriality, if one were to follow the logic of the Cartesian programme, then conquering time would entail intervening in its behaviour as it evolves independently. Here is where Lyotard finds examples in real-time computation, looking at the work done at IRCAM (which is also housed at the Centre Pompidou) and at the Sogitec 4X digital signal processor, which, as an early precursor of the visual programming language Max/MSP, made it possible for the composer to intervene in the production of synthesised music as it is listened to. This procedure of immediate intervention, according to Lyotard, completes the program of modern metaphysics, and the conquest of the now realises the model of the immediate that linguists refer to as performativity.
In fact, with the performative we find ourselves in immediacy par excellence, and therefore back in the immaterial. Given the performative character of immaterials, further interpreted through the processes of dematerialisation, rematerialisation, and immaterialisation as different modes of computation, the question of new materialism is necessarily bound up with its relationship to computable media. To be clear, at least in its conventional meaning, not everything is computable and not everything computes. But as we forcefully accelerate toward a world of ubiquitous computation, Lyotard’s reflections on real-time computational media and the role of the immaterial call for a re-evaluation of the general role of computation in current discussions around materialism. In one of the most comprehensive summaries of what new materialism is and what its future holds, ‘What Is New Materialism?,’ Gamble and colleagues identify three strands of new materialism—negative, vital, and performative—and single out the performative as the sole candidate with the potential to ‘radically


undermine a discrete separation between humans and matter’ with ‘an understanding of science in which every act of observing also constitutes, at once, a transformation of what is being observed.’54 In this chapter, a prehistory of computation has been presented in which its material and political history and its intrinsic relationship to symbolic abstraction are seen to have had a defining role in modern science. Read along with computation’s capacity to produce its own temporality through time-reckoning operations, perhaps the question of a performative matter would have to take into account the material transformations that take place in time-based media and their techniques, as they carry out their specific modes of computation.

NOTES

1. ‘The Computational Turn,’ https://sites.google.com/site/dmberry/.
2. Yuk Hui, ‘The Computational Turn, or, a New Weltbild,’ Junctures: The Journal for Thematic Dialogue, 13 (2010): 43, emphasis added.
3. Hui, ‘The Computational Turn,’ 42.
4. Martin Heidegger, ‘The Question Concerning Technology,’ in Off the Beaten Track, edited and translated by Julian Young and Kenneth Haynes (Cambridge: Cambridge University Press, 2002), 59.
5. Hui, ‘The Computational Turn,’ 43.
6. Ibid, 43.
7. Taylor Carman, ‘Quantum Theory as Technology,’ in Heidegger on Technology, edited by Aaron James Wendland, Christopher Merwin, and Christos Hadjioannou (New York: Routledge, 2019), 307.
8. Ibid.
9. Ibid.
10. On computation, see also Parisi, ‘Transcendental Instrumentality and Incomputable Thinking’ in this volume.
11. Jacob Klein, Greek Mathematical Thought and the Origin of Algebra, translated by Eva Brann (Boston: MIT Press, 1968). Henceforth GMT.
12. Leo Strauss, ‘An Unspoken Prologue to a Public Lecture at St. John’s College in Honor of Jacob Klein,’ in Jewish Philosophy and the Crisis of Modernity: Essays and Lectures in Modern Jewish Thought, edited by Kenneth Hart Green (Albany, NY: State University of New York Press, 1997), 450.
13. GMT, 6.
14. Burt C. Hopkins, The Origin of the Logic of Symbolic Mathematics: Edmund Husserl and Jacob Klein (Bloomington: Indiana University Press, 2011), 71. Henceforth OLMS.
15. See Joseph Cosgrove, ‘Husserl, Jacob Klein, and Symbolic Nature,’ Graduate Faculty Philosophy Journal, 29, no. 1 (2008): 227–51.
16. David Bohm, The Special Theory of Relativity (London: Routledge, 1966), 148, emphasis in original.
17. Cosgrove, ‘Husserl, Jacob Klein, and Symbolic Nature,’ 241.
18. OLMS, 72, citing GMT, 123: ‘the fundamental methodological preoccupation of ancient science with the meaning of “abstraction” is replaced [in modern mathematics] by “its attention first and last to method as such”.’
19. Arno Borst, The Ordering of Time: From the Ancient Computus to the Modern Computer, translated by Andrew Winnard (Cambridge: Polity, 1993). Henceforth OT.
20. As Borst asks: ‘Clocks and machines had been combined from as early as the fourteenth century. . . . But was it not now possible for machines to compute time as well, along the lines of Copernicus and Scaliger? The first modern calculating machine, designed in 1623–4 by . . . Wilhelm Schickard, was actually meant to assist the chronological and astronomical work of Johannes Kepler. What his friend was undertaking logistice, he was attempting to do mechanice, Schickard wrote in a letter to him.’ OT, 106.
21. Henryk Grossman, ‘Descartes and the Social Origins of the Mechanistic Concept,’ in The Social and Economic Roots of the Scientific Revolution: Texts by Boris Hessen and Henryk Grossmann, edited by Gideon Freudenthal and Peter McLaughlin (Dordrecht: Springer, 2009), 197. Henceforth DSO.
22. Ibid, 197.
23. OT, 45.
24. Ibid, 101.
25. Boris Hessen, ‘The Social and Economic Roots of Newton’s Principia,’ in The Social and Economic Roots of the Scientific Revolution: Texts by Boris Hessen and Henryk Grossmann, edited by Gideon Freudenthal and Peter McLaughlin (Dordrecht: Springer, 2009).
26. On materialism, see Charles T. Wolfe, ‘Materialism New and Old,’ Antropología Experimental, 17 (2017): 215–24.
27. Emily Apter et al., ‘A Questionnaire on Materialisms,’ October, 155 (2016): 3–110.
28. Vicki Kirby, ‘Matter Out of Place: “New Materialism” in Review,’ in What if Culture was Nature all Along?, edited by Vicki Kirby (Edinburgh: Edinburgh University Press, 2017), 4–5.
29. ‘life is the continuous negotiation with matter that creates the conditions for its own expansion and the opening up of matter to its own virtualities: “[Life] was to create with matter, which is necessity itself, an instrument of freedom, to make a machine which should triumph over mechanism”.’ Elizabeth Grosz, ‘Feminism, Materialism, and Freedom,’ in Realism Materialism Art, edited by Christoph Cox, Jenny Jaskey, and Suhail Malik (Bard College: Sternberg Press, 2015), 57. Grosz quotes Henri Bergson from his Creative Evolution.
30. Elizabeth de Freitas, Ezekiel Dixon-Román, and Patti Lather, ‘Alternative Ontologies of Number: Rethinking the Quantitative in Computational Culture,’ Cultural Studies ↔ Critical Methodologies, 16, no. 5: 431–34.
31. Gilles Deleuze and Félix Guattari, A Thousand Plateaus: Capitalism and Schizophrenia, translated by Brian Massumi (Minneapolis: University of Minnesota Press, 1987), 391.
32. Ian Hacking, ‘Political Arithmetic,’ in The Emergence of Probability (Cambridge: Cambridge University Press, 2006).
33. Alexander Nagel, Medieval Modern: Art Out of Time (London: Thames & Hudson, 2012).
34. Ibid, 22.
35. On Lyotard, see Woodward, ‘Information and Alterity: From Probability to Plasticity’ in this volume.
36. OT, 29.
37. Anicius Manlius Severinus Boethius, Book I in Fundamentals of Music, edited by Claude V. Palisca, translated by Calvin M. Bower (New Haven: Yale University Press, 1989), [197], 18.
38. Tetractys, as a fourfold, derived a special meaning in Pythagorean numerology from the number four. Among other reasons, one lay in how the first four numbers, counting to four, sum up to ten, which was a symbol of perfection and completeness.
39. Daniel Heller-Roazen, The Fifth Hammer: Pythagoras and the Disharmony of the World (New York: Zone Books, 2011), 16.
40. Ibid, 17.
41. ‘cine-fils magazine: HARUN FAROCKI on MATERIALITY,’ YouTube, June 3, 2010, 6:40, https://youtu.be/YuVLOzW3J-k.
42. Harun Farocki, ‘Reality Would Have to Begin,’ in Imprint Writings, edited by Susanne Gaensheimer et al., translated by Thomas Keenan and Thomas Y. Levin (New York: Lukas & Sternberg, 2001), 186–213.
43. Ibid, 198.
44. Vilém Flusser, Towards a Philosophy of Photography, translated by Anthony Mathews (Reaktion Books, 2000), 44.
45. Jean-François Lyotard, ‘Representation, Presentation, Unpresentable,’ in The Inhuman, translated by Geoffrey Bennington and Rachel Bowlby (Cambridge: Polity Press, 1991), 120.
46. Ibid.
47. Flusser, Towards a Philosophy of Photography, 49.
48. Jean-François Lyotard and Bernard Blistène, ‘Les Immatériaux: A Conversation with Jean-François Lyotard and Bernard Blistène,’ Flash Art, 121 (March 1985): 32–39.
49. Andreas Broekmann and Yuk Hui (eds.), 30 Years After Les Immatériaux: Art, Science, Theory (meson press, 2015).
50. Robin Mackay, ‘Immaterials, Exhibition, Acceleration,’ in 30 Years After Les Immatériaux: Art, Science, Theory, edited by Andreas Broekmann and Yuk Hui (meson press, 2019), 219.
51. Ibid.
52. Ibid.
53. Jean-François Lyotard, ‘After Six Months of Work . . . (1984),’ in 30 Years After Les Immatériaux: Art, Science, Theory, edited by Andreas Broekmann and Yuk Hui (meson press, 2019), 36.
54. Christopher N. Gamble, Joshua S. Hanan, and Thomas Nail, ‘What Is New Materialism?,’ Angelaki, 24, no. 6 (2019): 111–34.

Chapter 8

How the Performer Came to Be Prepared
Three Moments in Music’s Encounter with Everyday Technologies

Iain Campbell

What kind of technology is the piano? It was once a distinctly everyday technology. In the bourgeois home of the nineteenth century, it became an emblematic figure of gendered social life, its role shifting between visually pleasing piece of furniture, source of light entertainment, and expression of cultured upbringing.1 It performed this role unobtrusively, acting as a transparent mediator of social relations. To the composer of concert music it was, and sometimes still is, says Samuel Wilson, like the philosopher’s table: ‘an assumed background on which one writes.’2 Like other instruments standard to Western art music, the piano was designed to facilitate the production of a consistent and refined timbre.3 More than most other such instruments, the piano also facilitated a kind of sonic neutrality. With its wide pitch range and smoothing of the percussive attack of its predecessor instruments, the piano presented composers with a technological means of approaching composition from a seemingly objective vantage point. It exemplified, in Heideggerian terms, the instrumentality of the instrument,4 serving as a mediator between idea and expression that apparently adds no character of its own. This notion of the invisibility, or transparency, of the mediations that musical technologies such as the piano enact is one of my areas of concern here.5 So too is its inverse: when these mediations become visible or opaque. Transparency has been a topic of significant recent theoretical attention. Stefanos Geroulanos, for example, has detailed how the supposed


transparency of intersubjective, epistemological, and social relations was a major point of critique in post-war French thought, where the supposition of transparency was taken to suppress how the world was ‘complex, layered, structured, filled with heterogeneity’6—and, as I will stress here, contingency. The thinkers Geroulanos considers, from Jean-Paul Sartre through to Jean-François Lyotard, can be said to be united in their refusal to invisibilise mediatedness.7

From a starting point of conceiving of the piano as a technological artifact, and in particular from John Cage’s ‘prepared piano,’ I will explore how a similar concern has appeared in musical contexts, albeit not without the risk of reversion back into a logic of transparency. Treating the piano as a technological artifact also puts us into conversation with contemporary work on performance and musical technologies. A recent attempt by Tom Mudd to map the field of research concerning musical engagement with tools and technologies is useful here.8 On one side there is, as Wilson highlighted regarding the piano, a position that treats technology as ‘an ideally transparent medium for communicating ideas.’9 Mudd follows the philosopher of technology Don Ihde in identifying within this a stance that accepts the transformative power of particular technological artifacts while diminishing awareness of their presence.10 At the same time, the feeling of transparency and non-mediation can in fact be closer to its opposite: virtuosity can be understood as an immersion in the nature of a medium and its specific characteristics, with the ideas of a performer being governed by the possibilities their instrument offers.11 On the opposite side of the spectrum, technology is seen as ‘a necessary and creative mediation that can be a source of ideas itself rather than simply a means for their expression.’ The former perspective treats technology as neutral, while the latter treats technologies as having ‘particular tendencies, biases, and values embedded within.’12

Expanding on Mudd’s engagement, in Ihde’s terms this would constitute a distinction between, or move from, embodiment relations to hermeneutic relations and alterity relations, and from transparency to opacity, with which the interpretive capacities technologies offer become tangible and ultimately the technology seems to take on a certain independence.13 This can be pushed further still through Ihde’s concept of ‘background relations.’ Ihde highlights cases of automatic and semi-automatic technologies that provide the basis for our activities without us entering into explicit relation with them or having any explicit experience of them. Thermostats and other such heating regulation devices are one such example, but the idea of a technology or set of technologies having a ‘background’ or ‘field’ relation to individuals in the constitution of an overall environment is increasingly pertinent.

In what follows I will suggest, within and between examples, a movement between these different positions: movement between transparency and opaqueness, visibility and invisibility, audibility
and inaudibility. How relations with musical technologies are conceived and enacted will be seen to raise a number of epistemic, aesthetic, social, and political questions.

Requiring a percussion ensemble but lacking the space, John Cage instead sought to turn the piano into one.14 He inserted everyday objects—bolts, screws, coins—between the strings of the piano, turning it into a producer of unpredictable, diverse tones, redolent of the Gamelan orchestra. The transparent, highly stable emblem of Western art music is turned into an unpredictable mechanism for producing noises; the score is no longer a key to be accurately interpreted but a prompt to perform actions; the composer’s control over their work, their capacity to directly communicate in the language of Western art music, is diminished. While Cage’s compositional work with his ‘prepared piano,’ the majority of which was in the 1940s, does not seem to share the explicit intent to deal with the piano as a historical and social object as the critical artistic inquiry of figures such as Nam June Paik, George Maciunas, Philip Corner, and Annea Lockwood did in the 1960s,15 it nevertheless anticipates that later work in rendering the piano legible as a piece of technology, and as a piece of technology that operates as part of social and historical contexts.

I begin here with a study of the prepared piano and Cage’s work for it. In particular, I highlight two tendencies in Cage’s work and thought that the prepared piano is part of, and which, while overlapping, may be in some tension with each other. The first of these is the prepared piano as a rendering visible of a piece of musical technology as a site of aesthetic, epistemic, and social contingency.
The second is the prepared piano as part of a drive toward the goal of an ‘all-sound’ music16—we might carry this forward to what Mudd, citing Peter Worth, describes as an ‘any sound you can imagine’ approach17—and the accompanying desire to ‘liberate’ the (sonic) ‘spirit . . . inside each of the objects of this world.’18 The site of conflict here is between a fundamental openness to contingency and a desire to totalise.

Via the mediating point of Cage and the pianist David Tudor’s work on Variations II (1961), and in particular Tudor’s use of basic consumer electronics in the development of his ‘amplified piano,’ I turn to Variations VII (1966), performed as part of the 9 Evenings series in collaboration with Bell Labs. This large-scale work, facilitated by Bell Labs’s latest developments in communications technologies, can be seen as continuing to pursue the kinds of contingencies that the prepared piano prompted. It also, however, presents a more troubling tendency toward totalisation. Aided in its capture of remote sounds by the then rapidly expanding communications network, Variations VII can be seen to enact, in Ihde’s terms, a ‘background’ mode of technological relation,19 through which technology is once again rendered transparent and its social force rendered invisible.20


To close I will address some musical work that emerges as part of the shift from the industry-affiliated project that was Variations VII back to independent work with consumer electronics.21 I will consider the characteristics of the improvising systems developed by the composer-performers David Behrman and George E. Lewis. Through a study of Behrman’s work on human-machine interactions in pieces such as Interspecies Smalltalk (1984) and Lewis’s constantly evolving Voyager software (1987–), I will suggest that this work signals a renewal of concern with the conditions of technology and of human-technological relations, equally facilitating an enriched understanding of musical technologies as sites of contingent encounter, with distinct social, aesthetic, and epistemic repercussions.

CAGE’S PREPARED PIANO AND AN ‘ALL-SOUND’ MUSIC

In his 1940 essay ‘The Future of Music: Credo,’22 John Cage discusses the capacity of technological advances to engender ‘new sound experiences.’23 Electric instruments, including ‘oscillators, turntables, generators, means for amplifying small sounds, film phonographs, etc.,’24 are described in their potential to generate any sound or rhythmic characteristic whatsoever. The composer, no longer limited by traditional musical restraints, is free to work with ‘the entire field of sound.’25 This would be an ‘all-sound’ music.26

In the years leading up to this essay, Cage’s concerns increasingly moved toward percussion music. For Cage, percussion was a musical form where formerly excluded sounds—noise—could be reclaimed into the territory of music: insofar as percussion music is not concerned with the control of tones, any sound is permissible. This situates percussion music at a point between the keyboard-influenced music of the past and the all-sound music of the future, at the moment of a historical-aesthetic shift.
Cage seems to have developed his prepared piano shortly after first delivering the lecture that became ‘The Future of Music: Credo,’ and so in the terms of this period of Cage’s thought we can discern that the prepared piano lies at a point between the piano-informed music of the past and the electrical instrument-informed music of the future, as a curious hybrid technology deforming the past to provide a provisional look toward the future.

Cage’s perspective on an ‘all-sound’ music also points in two directions. On the one hand there is the question of developing the technological means to explore the entire field of sound. But accompanying this is a seemingly more metaphysical question. Cage would speak of a ‘spirit . . . inside each of the objects of this world,’ and of how ‘all we need to do to liberate that spirit is to brush past the object, and to draw forth its sound,’27 a notion he credits
to the filmmaker Oskar Fischinger, with whom Cage briefly apprenticed in 1937. Percussion provides a very literal means toward the liberation of these ‘spirits.’ As the musicologist Richard H. Brown notes, ‘[a]s a mechanical act, the percussive strike of an object is the simplest application of a technology in the reproduction of sound.’28 Dealing with ‘the entire field of sound’ is then not only a question of a quasi-scientific examination of this field, but of giving voice, through technological means, to a purported inner existence of objects.

A range of positions on Ihde’s technological continuum is implied here. There is the suggestion of the kind of embodiment relation by which technological artifacts extend human abilities but awareness of their presence as technologies is diminished. The artifact is a neutral and ideally, if not yet actually, transparent point of access to an already intuited field. Yet to speak of a ‘spirit’ inside of individual things suggests something not yet known by the human subject. Technology can be a means to enact an interpretive process with regard to the objects of the world, and can even make the objects of the world appear in their singular distinctness, in their alterity. At this end of the spectrum technology still seems to serve as a means—as tool, or ‘instrument’ in its musical and everyday senses—to access unknown, but conjectured, sounds.

However, a more radical kind of technological alterity also arises. As well as being a provisional point of access to the field of sound in its entirety, the prepared piano also pushes back. Much has been made of the seemingly incidental, yet significant, change that the preparation of the piano effects with regard to notation. There is, by tradition, a direct and determinate relation between scored note and sounded note. The passage from the mind of the composer, through score, performer, and instrument, has been conceived as transparently smooth.
But, with the prepared piano, notation and sound event are decoupled. As Brown puts it, ‘[t]he notation . . . was in essence a form of tablature, dissolving the relationship between notation and sound, and instead focusing on the relationship between mechanical action and sounds.’29 The decision to interfere with the mechanisms of the piano interrupts the appearance of a smooth passage from the mind of the composer to the sound event. The technological underpinning of the sounding process—where both instrument and notation can be understood as kinds of technologies mediating between composer and sound event—becomes tangible.30

The prepared piano, then, seems to have been intended as a means toward an all-sound music, a multiplication of the sounds with which the composer could work, potentially as transparent a means as the traditional piano was toward tonal music. But as a piece of technology, it resisted this. It proved unstable and unpredictable, the sounds the piano and its performers produced, as Cage highlighted and came to embrace, slipping out of the composer’s
control.31 No longer a transparent technology, it became an opaque one, disrupting the composer’s attempts at expression.

At a pivotal moment in Cage’s development, then, there is another sense in which the prepared piano points both backwards and forwards: backwards toward a desire for an ‘all-sound’ music and full command over the field of sound, and forwards toward the relinquishment of control and embrace of contingency Cage came to favour, evidenced in his erasure of questions of expression, deployment of chance procedures, and desire to ‘let sounds be themselves.’32 The notion of a ‘spirit . . . inside each of the objects of this world’ can also be applied in both directions. Looking backwards, it deepens the concerns of an ‘all-sound’ music, seeming to render it on a more ontological level, where every and any object has a sound that, through technological means, may be captured.33 Looking forwards, it presents sounds and objects as having a life of their own, an alterity, that cannot be reduced to our attempts to know, capture, and control them, and that, Cage decided, we should let be.

A question I wish to maintain, however, is the extent to which the past is really left behind. The question of whether these forwards and backwards perspectives can be neatly separated should be held in mind in the next section.

SECOND-ORDER PERFORMANCE: VARIATIONS II, VARIATIONS VII, AND EXPERIMENTS IN ART AND TECHNOLOGY

Across the 1950s, Cage adopted various chance procedures into his compositional process and experimented with graphic notational practices that required interpretation on the part of the performer, among other attempts to relinquish his own control over the performance situation. 1961’s Variations II, written with no specified instrumentation or number of players, presents a mature exemplification of these tendencies in Cage’s work.
Its mobile graphic score, making use of multiple transparencies for the performers to combine and then interpret, predetermines seemingly nothing of what kinds of sounding events a performance could involve, and intensifies the tangibility of notation as a constructed hermeneutic technology, short-circuiting any attempt at a transparent translation from the mind of the composer to the sound. On one hand the field of sound in its entirety then seems available to the performer, but at the same time the concern seems to be more with the production of singular, contingent sound events; Cage states that ‘the universe in which the action is to take place is not preconceived.’34

Cage’s long-time collaborator, the pianist David Tudor, took Variations II as an opportunity to perform his own inquiry into the piano as a technology, with what he termed his ‘amplified piano.’35 Working from the six basic
parameters indicated in Cage’s score, Tudor designed a complicated system of microphones and phonograph cartridges to be triggered in various ways, with the sound events deriving only from the resonances, feedback loops, and signal interferences of the piano, microphones, and cartridges in reciprocal interaction. The piano becomes, as You Nakai notes, a kind of resonance chamber.36

This could be seen as a technological means to intensify the indeterminate qualities of Cage’s prepared piano, going beyond what Cage expected of Tudor in his role as virtuoso pianist and making the piano the opposite of a transparent tool:37 the control over sounding events is shifted away from the performer and distributed through a complex instrumental assemblage, with Tudor noting he could ‘only hope to influence’ it.38 More than even a relation of alterity, we move here into the explicit territory of distributed sounding agencies. The piano is perhaps no longer an instrument or a tool, if we are to understand those terms as means to a subject’s ends, but fully a technological artifact, or part of an assemblage of technological artifacts alongside performer, notation, and other factors, each in relations of alterity with the others.

With the amplified piano Tudor may, as John Driscoll and Matt Rogalsky suggest, have been following an impulse sparked by Cage, dating back to the attempts to reveal the sonic characteristics of everyday objects, the ‘spirit . . . inside each of the objects of this world,’ that his performances of Cage’s Cartridge Music (1960) involved.39 Yet in addition to the two directions in which I have already suggested this notion points, of the performer as transparently expressing the thought of an ‘all-sound’ music and the performer as a producer of contingent events, Tudor’s investigations into the amplified piano propose new ways of thinking of the performer within the performance situation.
The integration of a position that might seem sovereign—observer, performer, experimenter—into a technological system, as just one part of that system, anticipates the more explicit concern with reflexivity that came to characterise ‘second-order’ cybernetics.40

Such meetings of artistic, theoretical, and technological practices became more visible as the 1960s proceeded, and so too did the explicit and implicit refiguration of composer-performer-instrument/technology relations. The 9 Evenings: Theatre and Engineering event series held in New York City in October 1966, organised with Bell Labs, and the subsequent Experiments in Art and Technology (E.A.T.) programme, marked a zenith of these meetings.41 The cutting-edge audiovisual and communications technologies that Bell Labs could offer allowed artists including Tudor and Cage to drastically extend the scope of their working through of musical technology and its contingencies.

Tudor, for his part, contributed to 9 Evenings with Bandoneon!, a performance for a bandoneon—a traditional Argentinian instrument similar to an accordion—through a setup of several microphones fed through electrical
systems and transmitted via twelve conventional loudspeakers and further mobile speakers, and with accompanying visual effects. This setup flooded the cavernous 69th Regiment Armory in which 9 Evenings was held with a fluctuating, self-sustaining tumult of sound. The sound of the audio system itself could be muted such that only the architectural resonance of the performance space would be audible. In the words of W. Patrick McCray, ‘Tudor was, in effect, playing not just the bandoneon but the Armory itself.’42 Here, another of Ihde’s technological relations may become tangible in a musical context: in bringing into focus the sounding space as part of the technological apparatus, and suggesting not only a differentiated performance space but a single environment, background relations on which experience itself rests come into visibility.

If Tudor played the Armory, then Cage’s Variations VII, performed by Cage, Tudor, and others, might be said to have played the entirety of New York City. Variations VII took advantage of the specialisation in telecommunications at Bell Labs to construct a system that included a number of radio and television receivers and ten telephone lines, linked to locations across New York. It stressed, as a retroactive draft score suggests, ‘[q]uantity instead of quality.’43 Alongside outside sounds transformed by electrical means, it included the inaccessible sounds of physiological processes (e.g., brainwaves) and other transformations of non-aural inputs into sound, and the recursive capture and redeployment of transmitted sound, ‘making audible what is otherwise silent.’ Everything became material to be captured, processed, and transmitted. Is this a ‘liberation’ of the ‘spirit . . . inside each of the objects of this world’? If it is, how should that notion be understood?
Variations VII is, on the one hand, an extension of the radical contingencies that Cage introduced to his music with the prepared piano: the instruments, if the electrical devices involved can still be called that (archival footage shows Cage and the other performers working with a maze of wires and devices), become opaque to composer and performer, pushing back in their alterity, finding a kind of agency of their own and giving form to a sound environment that the performer, in Tudor’s earlier words, ‘could only hope to influence.’ It may seem to give individual objects and sounds an expressivity that is not determined by the ideals of composer or performer.

Yet this is not the only way that Variations VII can be interpreted. The subsequent history of Bell Labs’s work with musicians and artists suggests an alternative perspective. Bell Labs engineer Billy Klüver and artist Robert Rauschenberg founded E.A.T. in late 1966 in order to capitalise on the momentum gathered by 9 Evenings. Significantly, and as John Beck and Ryan Bishop have recently detailed, at this point the social basis for these art-technology explorations was made explicit. Klüver, making use of the kind of technological analogy that would become more pronounced in his
stances over the next few years, saw the ultimate purpose of E.A.T. as being to ‘act as a transducer between the artist and industry, to protect the artist from industry and industry from the artist, to translate the artist’s dreams into realistic technical projects.’44

Yet this vision of ‘adjustment’ and ‘integration’ did not prove easy. Ultimately, the organic meeting of art, technology, and commerce that E.A.T. sought to bring about proved unsustainable when artistic ideals could not be translated into commercial terms.45 Furthermore, at a personal level, artists and engineers would come to clash over feasibility, undermining Klüver’s desire to break down individual goals and achieve a fully accomplished kind of non-hierarchical collaboration. More than this, Beck and Bishop highlight how the social forms Klüver sought to bring about were based on ‘the remarkably propulsive power of American modernity and its capacity to actualize new modes of experience on a grand scale,’46 and Klüver was explicit about wanting to participate in a wave of American cultural ascendency.47 By the late 1960s such attitudes were under increasing attack: in Beck and Bishop’s words, ‘technology was no longer seen as an open invitation to the future but increasingly perceived simply as a weapon.’48 The imbrication of Bell Labs and other major technological organisations in Cold War military research (around 40 per cent of Bell Labs’s staff worked on military projects49) became unavoidable, and the project of dismantling boundaries in the name of radically non-hierarchical practices of problem-solving became rife with political tensions as well as corporate demands.

Despite Klüver’s insistence on a practical approach—hence an emphasis on engineering rather than science—there is a significant theoretical parallel here.
At this time the extension of ideas from cybernetics and information theory into other disciplines had become widespread, and such a perspective can be heard in Klüver’s non-hierarchical, organic ideals concerning the social realm, where the ideal social setting seems to take on the form of a self-regulating system. Yet already in 1955 the philosopher of science Georges Canguilhem had expressed a worry that a cybernetics-informed analogy between organism and society could serve to falsely imply social and subjective harmony: the purportedly smooth transposition of one realm into another may well mask a violence being done in this process.50 This is an example Stefanos Geroulanos highlights as part of his sustained account of the critique of transparency in post-war French thought. What is at risk in the analogy between realms is a seeming rendering transparent of every thing and every relation at the expense of that which does not fit or does not want to fit.51 The ideals that Klüver seeks in proposing a non-hierarchical meeting between art, technology, and commerce risk effacing the very real and often recalcitrant, even violent relations that already hold between these fields.


Yuk Hui’s recent work has also brought attention to the troubling kinds of organic unity that cybernetic thought can imply, and the distinct theoretical difficulties involved in accounting for movement between domains.52 Here, these unities present a complex dialectic between recursivity and contingency. The recursive system always faces and must incorporate contingencies if it wants to continue existing, but at an organicist limit point of a fully self-sustaining system it can do this comfortably: any contingency is accommodated, rendered into the terms of the system, and so loses any feature that made it meaningfully contingent. This presents cybernetics in a peculiar light. It is at once a theory of radical contingencies and of unprecedented control. This is manifest in how it has at one and the same time been characterised as a science of the counter-culture, in Deleuze and Guattari’s terms a ‘nomad’ science,53 working in the cracks of institutions, and as a totalising programme or meta-disciplinary position;54 even, on Hui’s account, a kind of historical a priori.

This peculiar feature of cybernetics gives cybernetics-informed art a peculiar status of its own. The musicologist Eric Drott has recently detailed a distinction between what he calls a ‘cybernetic sublime’ and a ‘cybernetic mundane.’55 The cybernetic sublime can be found in artistic works that enact a spectacularisation of cybernetics—Drott names Tudor’s Rainforest IV as an example, and we could add to that Variations VII—and involve for the spectator an oscillation between immersion and reflection. The cybernetic mundane, meanwhile, involves situations in which ideas derived from cybernetics and information theory ‘operate in the background.’ (We may recall here Ihde’s typology.) It concerns the obscured ways in which cybernetics has never left us.
An instance of this is where the seemingly neutral cybernetic language of information is deployed not only to make comparisons but to assert equivalences.56 Drott highlights the equivalence ‘music is information.’ When, in the early 1950s, the engineers who designed the RCA Synthesizer—originally intended to both analyse and synthesise music—equated music with information, this allowed them to cast this instrument as ‘a universal instrument capable of generating any sound imaginable.’57 Yet this requires the elimination, or standardisation, of the improbable: in musical terms, the unidiomatic, or, in terms equally cybernetic and musical, noise.58 As Drott goes on to argue, the ‘music is information’ equation is now taken as given; he discusses the ways that Spotify binds user information and musical information together in a closed circuit that both incorporates and guides user receptivity. Where the cybernetic sublime involves making cybernetics visible, the cybernetic mundane renders it invisible, transparent.

Returning to Variations VII in this light presents a complicated picture. It is on one hand a thoroughgoing embrace of contingency, an act of radical openness to the alterities that the uncountable objects of the work present to
us. Yet, at the same time, Variations VII participates in a much wider narrative that draws together information theory and cybernetics, the counter-culture of the 1960s, the Cold War military-industrial complex, and a distinct brand of American exceptionalism. Moreover, through its use of the telecommunications network as a means to make, in principle, anything in New York function as its material to work with, it may even be seen as a premonitory staging of what is now called, in Shoshana Zuboff’s widely adopted phrase, surveillance capitalism.59 If what it means to liberate the ‘spirit . . . inside each of the objects of this world’ is to homogenise these ‘spirits,’ to use technological means to render them mutually legible, whether as sound or as information, then this is no kind of liberation. This is less the cybernetics of a utopian non-hierarchical sociality than the cybernetics of Amazon tracking and managing the movements of warehouse workers. Cage perhaps registers this himself with his departure, from the late 1960s, from work heavily invested in new technologies, and in the dedication to his 1969 essay collection, A Year from Monday: ‘To us and all those who hate us, that the U.S.A. may become just another part of the world, no more, no less.’60

The issue, in short, is that with Cage’s journey from the prepared piano into work with advanced telecommunications technologies, the tensions, opacity, and contingencies of the prepared piano risk becoming again something transparent, immediate, invisible, albeit no longer to the composer but at an inhuman level of informational exchange. All ‘noise,’ whether defined sonically or informationally, can be readily absorbed into the system. Composers and performers, insofar as they remain significant, seem absorbed into the system too, one node among others. What strategies can there be for evading this risk?
CONCLUDING THOUGHTS: INTERSPECIES SMALLTALK AND VOYAGER

With the decline of large-scale art and technology collaborations, inquiry into this relation had to take on new forms. Two examples I will highlight to conclude here derive from the work of David Behrman and George E. Lewis. Behrman was one of the performers on Variations VII, but his own compositional work has tended toward smaller scales. In a conversation with Ron Kuivila, Behrman spoke of two moments when ‘that great, inviting door seemed to swing open’: the introduction of inexpensive transistors in the early 1960s and the advent of inexpensive microcomputers in the mid to late 1970s.61 David Tudor, with his ‘quirky, homemade circuitry,’ was a key figure in the first moment, with Behrman remarking on—and seeming compelled by—the
instability and fleetingness that technological change imposed on this work. Much of Behrman’s own work since the late 1970s has been marked by the second moment. In particular, Behrman has used consumer electronics—the KIM-1 microcomputer being a notable turning point62—in the development of interactive musical systems. Far from the inestimable complexity of Variations VII, Behrman’s basic systems involved no more than pitch sensors and simple software to respond to sensed pitches with synthesizer sounds.

The vividly named Interspecies Smalltalk, commissioned by Cage and Merce Cunningham for the Cunningham Dance Company, is exemplary of this approach. In performance, the violinist and Fluxus member Takehisa Kosugi plays alongside Behrman’s interactive system; the elements are minimal: Kosugi’s sinuous playing and simple synthesised tones. Yet the sense is of an exploration. The synthesizer responds to Kosugi in unexpected but seemingly sympathetic ways, and Kosugi does the same. Like work in the vein of a ‘cybernetic sublime,’ it dramatises a human-technology interaction, but one quite different from a shared entry into a totalising system.

What is the distinction between Interspecies Smalltalk and Variations VII? It may be, in part, a recognition, and thematisation, of scale. These are small social interactions, and coexistences, which yet may point elsewhere: for instance, to the seemingly radical alterity of the Cunningham dance piece they accompanied. Technological—and social—relations of alterity are sustained. They resist the total and remain with the contingent.

Lewis’s Voyager work highlights and clarifies some of these themes further. The Voyager software, under constant development since 1987, is what Lewis calls an interactive ‘virtual improvising orchestra.’63 Like Behrman’s systems it listens to and responds to human performers, and also produces its own independent behaviours.
Where it significantly differs from Interspecies Smalltalk is at the level of complexity. Lewis highlights a specific concern with how attitudes about music become embedded in music software—as does Behrman when he suggests that music composed using established software could be treated as collaborations with the authors of that software.64 Voyager is programmed in a way that does not adhere to many of the standards of Euro-American institutional computer music. It is designed with sixty-four asynchronous voices, permitting ‘simultaneous multiplicities of available timbres, microtonal pitchsets, rhythms, transposition levels and other elements.’65 By conventional compositional standards, it is overcrowded. But these are not the standards to which Lewis appeals. He instead favours an Afrodiasporic notion of ‘multidominance,’66 enacting a high degree of pluralism in contrast to the often-unifying ideals of Western art music.

In Lewis’s description, Voyager is dialogic: it involves ‘multiple parallel streams of music generation, emanating from both the computers and the humans—a nonhierarchical, improvisational, subject-subject model of
discourse, rather than a stimulus/response setup.’67 The computer is treated not as an instrument as such, but as an independent improviser.

Here we return to some of our initial considerations. Lewis treats the software not as a transparent tool for achieving musical ends, but as a kind of subject in its own right. We find a technological relation of alterity. Yet it is not only this. On one hand, Lewis follows Behrman in directing his concern to specific performance spaces. Individual agents can be readily identified, even if this does not predetermine the multi-agential sounding whole, and a system can be demarcated. But by highlighting the social perspective that underlies the software, Lewis brings into focus an inescapable outside that cannot be absorbed into the system. Not only alterity relations are thematised, but also background relations, both technologically and socially. Sound, in Voyager, is not only an objectively determinable and capturable object or quality. It sits within a social milieu, even if it does not only sit there.

Let us now circle back to the piano. A recent work by Lewis, Timelike Weave (2018), is scored for harpsichord—that Baroque predecessor of the concert piano—and inspired by Afrodiasporic quilting aesthetics. The piece highlights the distinctly percussive qualities of the harpsichord, but through its overlapping repetitions also suggests a peculiar digitality. This suggests why Lewis also cites the notion of the ‘closed timelike curve’ from mathematical physics as an inspiration.68 Where Cage’s prepared piano was situated in a present pointing both forwards and backwards, Lewis locates the harpsichord within a set of paradoxical time loops. What permits mobility around these loops is the pluralisation of sound. Cage’s ‘all-sound’ approach to music presents a persisting concern: a tendency toward totalisation. But Lewis highlights how sound is not only sound.
It is something at once, and irreconcilably, sonic, social, and technological, and can be brought to run through these registers endlessly. It remains jarring that the piece of technology that is the harpsichord is at once an emblem of early modern Europe, an image of Afrodiasporic quilting, and a digital system working through its inputs. In the encounters staged in these processes, there is a glimpse of a contingency without totalisation.

NOTES

1. See Richard Leppert, ‘Sexual Identity, Death, and the Family Piano,’ 19th-Century Music, 16, no. 2 (1992): 105–28; cited in Samuel Wilson, New Music and the Crises of Materiality: Sounding Bodies and Objects in Late Modernity (London: Routledge, 2021), 52.


2. Wilson, New Music, 51. Alongside the table we could also consider the blank page—see Dominic Smith, Exceptional Technologies: A Continental Philosophy of Technology (London: Bloomsbury, 2018), chapter 2.
3. See Michael Chanan, Musica Practica: The Social Practice of Western Music from Gregorian Chant to Postmodernism (London: Verso, 1994), 242–43.
4. See Martin Heidegger, ‘The Question Concerning Technology,’ in The Question Concerning Technology and Other Essays, translated by William Lovitt (New York: Harper & Row, 1977).
5. Musical mediation is a major topic in recent scholarship; see Georgina Born and Andrew Barry, ‘Music, Mediation Theories and Actor-Network Theory: Introduction,’ Contemporary Music Review, 37, no. 5–6 (2018): 443–87.
6. Stefanos Geroulanos, Transparency in Postwar France: A Critical History of the Present (Stanford, CA: Stanford University Press, 2017), 10.
7. As Emmanuel Alloa puts it in his account of Maurice Merleau-Ponty, ‘[t]he fiction of transparency expresses . . . the forgetting or oblivion of . . . the constitutive corporeal mediatedness of any relationship to the world.’ Emmanuel Alloa, Resistance of the Sensible World: An Introduction to Merleau-Ponty, translated by Jane Marie Todd (New York: Fordham University Press, 2017), 13.
8. Tom Mudd, ‘Material-Oriented Musical Interactions,’ in New Directions in Music and Human-Computer Interaction, edited by Simon Holland et al. (Cham: Springer, 2019), 123–34. Mudd’s emphasis is on digital tools, but his demarcations can be understood within the more general field of instrument design.
9. Ibid, 123.
10. Ibid, 126.
11. Ibid. See also the project ‘The Garden of Forking Paths’ at the University of Leeds. ‘The Garden of Forking Paths: Overview,’ https://forkingpaths.leeds.ac.uk/.
12. Mudd, ‘Material-Oriented Musical Interactions,’ 125. Mudd here speaks of affordances, but I avoid that term so as not to suggest any particular link to the account of affordances developed by J.J. Gibson and, moreover, the many adoptions of it. See Patrick Valiquet, ‘Affordance Theory: A Rejoinder to “Musical Events and Perceptual Ecologies” by Eric Clarke et al.,’ The Senses and Society, 14, no. 3 (2019): 346–50.
13. Don Ihde, Technology and the Lifeworld: From Garden to Earth (Bloomington: Indiana University Press, 1990), chapter 5. Mudd himself cashes this out in terms of Karen Barad’s account of intra-action, suggesting that we understand human-instrument relations as involving entangled human and non-human ‘agencies’ which only together constitute the specific performance situation. Mudd, ‘Material-Oriented,’ 131; Karen Barad, Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning (Durham: Duke University Press, 2007).
14. See John Cage, Empty Words: Writings ’73–’78 (Middletown, CT: Wesleyan University Press, 1981), 7–9.
15. See the account in Wilson, New Music, chapter 3.
16. John Cage, Silence: Lectures and Writings (London: Marion Boyars, 2009), 5.
17. Mudd, ‘Material-Oriented,’ 124.




18. John Cage, For the Birds: In Conversation with Daniel Charles (London: Marion Boyars, 1981), 73.
19. Ihde, Technology, 108.
20. Ihde highlights that background relations are not exactly either transparent or opaque at all, as they are not the kind of explicit relation that would permit such a description. What I wish to highlight here is that such background relations may permit other kinds of technological relation to take hold. As Dominic Smith has argued, in the philosophy of technology that Ihde develops and informs, the tendency may be toward a focus on individual technological artifacts rather than ‘transcendental’ questions of this kind. Dominic Smith, ‘Rewriting the Constitution: A Critique of “Postphenomenology”,’ Philosophy & Technology, 28, no. 4 (2015): 533–51.
21. On a comparable distinction between everyday technologies and a planetary-scale ‘sensing layer,’ see Smith, ‘Ugly David and the Magnetism of Everyday Technologies: On Hume, Habit, and Hindsight,’ in this volume.
22. While in the first collection of Cage’s texts, 1961’s Silence: Lectures and Writings, this essay is said to have been delivered in 1937, Leta Miller’s study has shown that it in fact appears to have been written no earlier than 1940. Leta Miller, ‘Cultural Intersections: John Cage in Seattle (1938–1940),’ in John Cage: Music, Philosophy, and Intention, 1933–1950, edited by David W. Patterson (New York: Routledge, 2002), 15–46.
23. Cage, Silence, 4.
24. Ibid, 6.
25. Ibid, 4.
26. Ibid, 5.
27. Cage, For the Birds, 73.
28. Richard H. Brown, Through the Looking Glass: John Cage and Avant-Garde Film (Oxford: Oxford University Press, 2019), 22.
29. Ibid., 34.
30. Notation may then be thought of as the kind of artistic technology analysed in Lushetich, ‘The Given and the Made: Thinking Transversal Plasticity with Duchamp, Brecht and Troika’s Artistic Technologies’ in this volume. See also Sha, ‘Adjacent Possibles: Indeterminacy and Ontogenesis’ in this volume on mathematical notation as a kind of speculative abstraction.
31. Cage, Empty Words, 8.
32. Cage, Silence, 10.
33. A major critical perspective on Cage concerns this question of whether he retains a goal of capturing every sound, such as from the purview of reclaiming everything that was once ‘noise’ into the far from neutral category of ‘music.’ His most famous piece, 4’33” (1952), encompassing the ‘silence’ of any possible environmental sound, has presented a stage on which to consider this question. See, for example, Douglas Kahn, Noise, Water, Meat: A History of Sound in the Arts (Cambridge, MA: The MIT Press, 1999), chapter 6.
34. Cage, Silence, 28. See also Daniel Charles, ‘Figuration and Prefiguration: Notes on Some New Graphic Notions,’ in Writings about John Cage, edited by Richard Kostelanetz (Ann Arbor: The University of Michigan Press, 1993), 258; Joe
Panzner, The Process That Is the World: Cage/Deleuze/Events/Performances (New York: Bloomsbury, 2015), 49–52.
35. For a detailed study see You Nakai, Reminded by the Instruments: David Tudor’s Music (Oxford: Oxford University Press, 2021), chapter 2; James Pritchett, ‘David Tudor as Composer/Performer in Cage’s Variations II,’ Leonardo Music Journal, 14 (2004): 11–16.
36. Nakai, Reminded, 135.
37. Ibid, 134.
38. Pritchett, ‘David Tudor,’ 14.
39. John Driscoll and Matt Rogalsky, ‘David Tudor’s Rainforest: An Evolving Exploration of Resonance,’ Leonardo Music Journal, 14 (2004): 25–30.
40. See Bruce Clarke and Mark B.N. Hansen, eds., Emergence and Embodiment: New Essays on Second-Order Systems Theory (Durham, NC: Duke University Press, 2009).
41. For an inside account see Billy Klüver, Julie Martin, and Barbara Rose, Pavilion: Experiments in Art and Technology (New York: E.P. Dutton, 1972).
42. W. Patrick McCray, Making Art Work: How Cold War Engineers and Artists Forged a New Creative Culture (Cambridge, MA: The MIT Press, 2020), 117.
43. David P. Miller, ‘Indeterminacy and Performance Practice in Cage’s “Variations”,’ American Music, 27, no. 1 (2009): 74.
44. John Beck and Ryan Bishop, Technocrats of the Imagination: Art, Technology, and the Military-Industrial Avant-Garde (Durham, NC: Duke University Press, 2020), 88.
45. The most prominent example of this is in Pepsi’s withdrawal of funds from the Pavilion E.A.T. had planned for Expo ’70 in Osaka, Japan. See Fred Turner, ‘The Corporation and the Counterculture: Revisiting the Pepsi Pavilion and the Politics of Cold War Multimedia,’ The Velvet Light Trap, 73 (Spring 2014): 66–78. On Tudor’s place in this, see Driscoll and Rogalsky, ‘David Tudor’s Rainforest.’
46. Beck and Bishop, Technocrats, 5.
47. Ibid, 86.
48. Ibid, 7.
49. McCray, Making Art Work, 110.
50. Georges Canguilhem, ‘The Problem of Regulation in the Organism and in Society,’ in Writings on Medicine, translated by Stefanos Geroulanos and Todd Meyers (New York: Fordham University Press, 2012), 67–77.
51. Geroulanos, Transparency, 209.
52. Yuk Hui, Recursivity and Contingency (London: Rowman & Littlefield, 2019); Yuk Hui, On the Existence of Digital Objects (Minneapolis: University of Minnesota Press, 2016).
53. Andrew Pickering, ‘Cybernetics as Nomad Science,’ in Deleuzian Intersections: Science, Technology, Anthropology, edited by Casper Bruun Jensen and Kjetil Rödje (New York: Berghahn Books, 2010), 155–62.
54. See, for example, Ronald R. Kline, The Cybernetics Moment: Or Why We Call Our Age the Information Age (Baltimore: Johns Hopkins University Press, 2015),
chapter 2; Sebastian Franklin, Control: Digitality as Cultural Logic (Cambridge, MA: The MIT Press, 2015), chapter 2.
55. Eric Drott, ‘Music and the Cybernetic Mundane,’ Resonance: The Journal of Sound and Culture, 2, no. 4 (2021): 581.
56. Ibid, 592.
57. Ibid, 582.
58. Ibid, 590.
59. Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (London: Profile Books, 2019).
60. John Cage, A Year from Monday: New Lectures and Writings (Middletown, CT: Wesleyan University Press, 1969).
61. Ron Kuivila and David Behrman, ‘Composing with Shifting Sand: A Conversation between Ron Kuivila and David Behrman on Electronic Music and the Ephemerality of Technology,’ Leonardo Music Journal, 8 (1998): 14, 15.
62. ‘David Behrman: Interview by Jason Gross (August 1997),’ Perfect Sound Forever, http://www.furious.com/perfect/behrman.html.
63. George E. Lewis, ‘Too Many Notes: Computers, Complexity and Culture in Voyager,’ Leonardo Music Journal, 10 (2000): 33.
64. Kuivila and Behrman, ‘Composing,’ 15.
65. Lewis, ‘Too Many Notes,’ 36.
66. Ibid, 33–34.
67. Ibid, 34.
68. ‘Tectonics: Festival of New & Experimental Music, 4–5 May 2019,’ http://2019.tectonicsfestival.com/assets/SSO_Tectonics19_Schedule_Single_(1).pdf.

Chapter 9

The Given and the Made

Thinking Transversal Plasticity with Duchamp, Brecht, and Troika’s Artistic Technologies

Natasha Lushetich

Chance is the measure of our ignorance. – Henri Poincaré1

The first word that usually comes to mind when we say ‘artistic practice’ is technique, rather than technology. Both of course derive from the Greek tekhne: ‘art, skill, craft in work, a system or method of making or doing.’2 Both systematise and organise, though in everyday parlance, ‘technique’ is predominantly used to signify a particular way of doing something—from the embodied techniques of walking to painterly techniques such as trompe l’oeil—while technology is used to signify device- and, more recently, network-based mediation, of which the computer, the internet, and generative adversarial neural networks are examples.

The reason why I will use ‘technology’ rather than ‘technique’ to discuss intermedial3 artistic practices that re-position (material and immaterial) objects to articulate a particular form of plasticity is fourfold. First, ‘logos,’ which, after Heraclitus, refers to an organised, reason-based study, has another origin in Hellenic Judaism, some four hundred years after Heraclitus; namely, in Philo’s thought. Following Plato’s distinction between imperfect matter and perfect form, Philo used ‘logos’ to refer to a divine intermediary whose purpose was to bridge the gap between immaterial
perfection and the material world.4 Divinities and Platonic dichotomies aside, this mediatory use of ‘logos’ resonates strongly with the work of Niels Bohr. For Bohr, quantum physicist and philosopher, known for the theory of entanglement of the observed, observation methods, and apparatuses, concepts are material arrangements.5 They are not ideational matrixes that exist in people’s heads but rather concrete organisational matrixes, always already active in the world. Second, the artistic technologies I wish to discuss, those of Marcel Duchamp, George Brecht, and the artistic collective Troika, are technologies (repeatable technical logics) in Gilbert Simondon’s sense of the word, explained in more detail later. Third, these practices emerged in three very different epochs—respectively, the period between World Wars I and II, the post-1950s era, and the second decade of the twenty-first century. Though different, all three are characterised by an unprecedented hegemony of deterministic technologies, whether of a military, social, or digital kind. Fourth, the work of Duchamp, Brecht, and Troika is in direct dialogue with science as a socio-cultural nexus, not in a straightforwardly collaborative way6 but in a manner that distils an artistic technology from a transposition of the existing social-scientific technologies.

In a nutshell, the argument of this chapter is that Duchamp’s erotic accidentals in n+ dimensions, Brecht’s event scores (four-dimensional performative ready-mades), and Troika’s dice-based cellular automata challenge three socio-scientific dogmas that order the world. These are: (1) space is inert and three-dimensional—time is ‘added’ as the fourth dimension; (2) language is static and atemporal; and (3) evolution is temporally linear—phenomena develop from simple to complex.
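For readers unfamiliar with the form, the mechanism behind dice-based cellular automata of the kind Troika works with can be sketched in a few lines of code. The following is my own minimal illustration only—an elementary automaton in the style of Stephen Wolfram’s Rule 110 with a chance-seeded first row—and not a reconstruction of Troika’s actual system; the rule number, grid width, and random seeding are all illustrative choices.

```python
import random

# A minimal sketch of an elementary cellular automaton (Rule 110),
# illustrating the general technique only, not Troika's actual work.

RULE = 110  # the update rule, encoded as an 8-bit number


def step(cells):
    """Compute the next generation from a row of 0/1 cells (wrapping at the edges)."""
    n = len(cells)
    nxt = []
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (centre << 1) | right  # 3-cell context as a number 0-7
        nxt.append((RULE >> neighbourhood) & 1)              # look up the rule's bit for it
    return nxt


def run(width=32, generations=8, seed=1):
    """Seed the first row by chance (the 'dice throw') and evolve it."""
    random.seed(seed)
    row = [random.randint(0, 1) for _ in range(width)]
    history = [row]
    for _ in range(generations):
        row = step(row)
        history.append(row)
    return history


if __name__ == "__main__":
    for row in run():
        print("".join("#" if cell else "." for cell in row))
```

Even a rule this small produces intricate, hard-to-anticipate structure from a trivially simple update step, which is one small indication of why such systems complicate any neat, linear passage from ‘simple’ to ‘complex.’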
Duchamp, Brecht, and Troika’s artistic technologies, by contrast, articulate transversal plasticity while also revealing the underlying indeterminacy of space-time, as a phenomenon and concept.7 In what follows, I will briefly sketch the context, then discuss Duchamp, Brecht, and Troika’s practices to show how they reframe space-time’s fundamental plasticity. Before I move on, however, a working notion of plasticity is in order. In artistic practice in particular, elasticity is sometimes confused with plasticity.8 The difference between the two is that whereas elasticity denotes the relationship between thresholds and ruptures, constraints and relaxations, plasticity describes both a system’s property (its ability to adapt to environmental stimuli), and its own dynamics—the interpenetration of formlessness and in-formation, the innate and the acquired, morpho- and ontogenesis. Plasticity is thus both an ability/capacity, and a meta-dynamic process since it reflects (on) itself, much like the recursive function in computing calls itself. This is why, contrary to popular examples—leaves, roots, and the human brain, all of which are of the same genus—plasticity is also a transversal phenomenon. As processual modulation, it unfolds as a result of the interpenetration of different onto-epistemological domains. Many
technologies fuse the scientific, biosocial, and political by re-organising both the domain of knowledge and the domain of being.9 Technologies that articulate the mutual modulation of space-time and social-scientific notions and methods additionally (re-)articulate technological mediation.

TECHNOLOGY, READY-MADES, ART-SCIENCE BINARISM

For Simondon, mediation is not a ‘simple structuring of the universe’ since it possesses a ‘certain density’ which becomes objectified in technics and subjectified in religion.10 In religion, which, we could say, accounts for (the coming into being of) the given—both as the phenomenological es gibt11 and les données12—‘particular being is understood in relation to a totality in which it participates, but which it can never completely express.’13 In the domain of the technical, by contrast—where the given becomes the made—there is no ‘absolute adequacy’ in the sense that the technical object is ‘particularized.’14 This is also why Simondon claims that ‘adding technical objects one to another’ cannot ‘remake the world.’15 However, in a later, posthumously published text, ‘Technical Mentality,’ Simondon notes that the technical domain ‘has not yet properly emerged.’16 He also adds that ‘the technical mentality’ unites ‘cognitive schemas, affective modalities and norms of action: that of the opening’ and ‘lends itself remarkably well to being continued, completed, perfected, extended.’17 His example of choice comes from the arts and architecture: Le Corbusier’s Convent of La Tourette,18 which ‘within its plan has its proper line of extension.’19 In other words, the technical domain is seen not only as productive and future-orientated, as is usually the case, but also as creative. I would add that creativity arises from a particular form of conceptual-technical mediation between the given and the made, characteristic of the ready-made, as well as from repetition and extension.

Le Corbusier was no Ferdinand Cheval.
Cheval, a postman by profession, was much admired by the Surrealists and the Situationist International. Having spotted an anomalous but beautiful stone on one of his delivery rounds, he subsequently spent thirty-three years building his ‘ideal palace’ at Hauterives in southeast France, finally completing it in 1912. Le Corbusier’s name, by contrast, is synonymous with the managerial approach to architecture and is associated with pronouncements such as ‘a house is a machine for living in’ that needs to be designed on the ‘same principles as the Ford car,’ by applying ‘simplicity and mechanization.’20

Simondon’s prescient suggestion that the open character of technical reality will manifest in a ‘virtual network to come,’21 defined as a ‘multifunctional network that marks the key points of
the geographical and human world,’22 is particularly important in the context of the ready-made and the art-science debate.

Ready-mades, of which Duchamp’s 1917 Fountain is perhaps the most famous example23—a urinal submitted for an exhibition of The Society of Independent Artists in New York and christened ‘The Fountain’—are ordinary functional objects removed from the domain of daily life and placed in an aesthetic context, that of a gallery or a museum. In the gallery/museum, they are semantically re-contextualised as objects of aesthetic (and often also monetary) value. Performative ready-mades or event scores, which are the hallmark of Brecht’s work and will be discussed in more detail later, are similarly instructions for actions that re-contextualise an everyday activity or event.24 Virtual-actual ready-mades, the hallmark of Troika’s work, remediate a virtual object in an actual one. They also use the aesthetics of scientific visualisation as a ready-made.

By definition, all ready-mades reflect a particular mode of production—repetition and multiplication, mechanical or automated—as well as the reproduction of value, which is why the (historical) appearance of specific ready-mades is invariably coterminous with the prevalent economic order: commodity economy in the case of the Duchampian ready-made, service economy in the case of the Brechtian performative ready-made, and knowledge economy in the case of Troika’s virtual-actual ready-made. All also blur the boundary between the given and the made. While we can state with certainty that the urinal in The Fountain is a manufactured object, we cannot say that re-contextualising an already existing object semantically and aesthetically is not a form of making, given the extensive history of conceptual art.25 But equally, we cannot say that it is a form of making in the strict (material) sense of the word.
Rather, in a move similar to that of Object-Oriented Ontology,26 these artistic technologies bring forth the hitherto unknown—and unused—actual-virtual dimensions of the (material or immaterial) object, inscribed in a long tradition of questioning the physical and virtual dimensions of objects in the arts, as can be seen, among other examples, from Takehisa Kosugi’s (ready-made) chairs that open onto (to humans) imperceptible dimensions.27

It may be useful to mention at this point that the divide between art and science emerged in the nineteenth century, as a result of the Industrial Revolution.28 It was only after the Industrial Revolution that the arts came to be regarded as idiosyncratically creative, and that science came to be viewed as the domain of objective and progressive knowledge.29 The dichotomy of art—supposedly reliant on intuition, impulse, and inductive logic—and science—supposedly reliant on analytical, systematic, and deductive logical processes30—persisted until the late twentieth century. Despite the efforts of theorists such as Gaston Bachelard, who argued for the creative dimension of scientific thought,31 it was not until the 1990s that a study of the conditions
that make objects visible in culture, and their subsequent categorisation as ‘science’ or ‘art,’ emerged in the social sciences.32 This new, non-binary frame, which foregrounded the socio-cultural conditions of emergence of the object of knowledge (or aesthesis), is crucial to understanding how Duchamp, Brecht, and Troika’s artistic technologies reposition material-immaterial relations and socio-scientific representations.

DUCHAMP’S EROTIC ACCIDENTALS AND THE PRODUCTION OF N+ DIMENSIONS

Much of Duchamp’s work was a direct response to the scientific work of the time, such as Henri Poincaré’s investigations of non-Euclidean geometry, which refuted the invariant nature of geometrical theorems.33 In his 1914–1916 Three Standard Stoppages, Duchamp used a string metre both as a reference to the platinum metre conserved in the Parisian library and as a purposefully malleable material. Having dropped three strings of one metre in length from the height of one metre onto a black canvas, he used the obtained shapes, full of twists and curves—and thus neither of the same shape nor of the same length—as a template to reproduce the three metres in wood, which he subsequently encased and titled Three Standard Stoppages. In the 1915–1923 Bride Stripped Bare by her Bachelors, Even, or The Large Glass, Duchamp similarly relied on chance operations, this time, however, to expand into n+ dimensions.
In this (famously opaque) work small cause-effect inter-relationships were displaced within (what are usually considered) higher-order principles.34 As Duchamp repeated many times in his career, ‘three dimensions can only be the beginning of a fourth, fifth and sixth dimension, if you know how to get there.’35 Referring to the fact that Albert Einstein called the fourth dimension the ‘fourth coordinate,’ and not the fourth dimension, and that time exists even ‘in a thin line,’36 and is not ‘added’ to (static) space, Duchamp insisted that all objects had n+ dimensions, only that humans ‘lacked a sense to perceive them.’37 In his experimentations with imperceptible dimensions, he turned to the Renaissance mathematician Girard Desargues and his analysis of the conic geometry of Renaissance perspective. Desargues held that the lines emanating from a centre of perspective create cones through which planes can be intersected at various angles, and images ‘placed’ in a painting. The theorem he derived from this was: ‘two triangles are in perspective axially if they are in perspective centrally.’38 The instability of the various perspectival elements suggested by Desargues’s theorem is precisely what we find in The Large Glass. The only difference is that The Large Glass is not a Renaissance perspective painting but an intermedium39 which oscillates
between installation, sculpture, and performance, demonstrating the contingency of what are taken to be the immutable elements of both geometry and art-making: horizon lines, centres of perspective, objects, forms, axes, and sign systems. The plasticity of every single component of the work is also the reason why The Large Glass bears no relation whatsoever to received ideas about aesthetics, form, content, or artistic medium. Rather, the work stages interdimensional conversations between space-time, movement, and geometry, explored through a sequence of micro-logical, cause-and-effect steps. The only macro-logic is found in the title: Bride Stripped Bare by her Bachelors, Even, which brings together the mechanics of diverse erotic forces (a long-standing preoccupation of Duchamp’s) through a series of arbitrarily determined procedures, interconnected via language games and chance operations.40 For example, glass was used to durably capture ephemeral performative actions; in one part of the work, the nine bachelors’ desire is materially embedded in the canvas with the aid of nine malic moulds, from which matches dipped in colour were fired, using a children’s toy gun, at a photograph of white gauze creased by the wind. The chance operations of the captured wind were amplified by the chance operations of the firing mechanism and transposed onto the glass, in which holes were drilled.
In other words, The Large Glass is a residue of procedurally determined, micro-causally related actions, executed with specific materials whose combinatory ‘architecture’ relied on a ‘three-beat rhythm’ and Desargues’s theorem.41 The mapping of the movement of chance, as in Three Standard Stoppages, and of erotic forces, as in The Large Glass, entwines actual (and literal) with virtual (and metaphoric) elements, material and immaterial objects and logics, in a manner that questions modes of production and/or deconstruction of representational structures that create material arrangements in the world.

This relationship—as well as Duchamp’s continuing preoccupation with the given and the made—can also be seen in his 1919 Unhappy Readymade, a piece for which he sent instructions to his sister, Suzanne Duchamp: a geometry book was to be tied at one corner and hung from her balcony, on Rue La Condamine in Paris, and exposed to the natural elements—wind, rain, and/or snow. The (seemingly) two-dimensional parts of the geometry book—the flat pages—were here crumpled by the wind as the book spun on its wire axis and turned into three-dimensional forms via interdimensional movement.

Throughout his career, Duchamp considered works as diverse as The Large Glass and the 1946–1966 Étant donnés (a sculptural tableau, visible through a pair of peepholes in a wooden door, of a nude lying on her back with her legs spread),42 a series, despite the fact that they look nothing like a series. Étant donnés was, for him, The Large Glass reproduced inside-out and back-to-front.43 How this might work becomes easier to understand if we see
these works as a movement. For example, in a pair of gloves, the right-hand glove doesn’t fit inside the left-hand glove, because the ‘thumbs’ are on opposite sides. However, if one glove is pulled inside out, the two gloves can be superposed, one inside the other. Similarly, vastly visually different works can be seen as a series via n+ dimensionality—an actual-virtual dynamic that doesn’t change one or more (discrete) elements but rather their inter-relations.

Duchamp’s transversal modulation of materials, objects, concepts, and actions is much clearer today, in the context of string theory. In string theory, infinitesimally small dimensions are folded into each other and vibrate both hyper- and interdimensionally. More precisely, little vibrating loops, seen as the fundamental objects of reality, vibrate with different modes in a way similar to the way guitar strings vibrate with different tones. When a string vibrates in a particular way, it ‘manifests’ as an electron; when it vibrates in another way, it ‘manifests’ as a proton. Strings merge and drift apart in six extra dimensions, in addition to the ‘four’ usual/observable ones.44 Theodor Kaluza’s 1921 discovery of the fifth dimension, based on the cylinder condition and electromagnetism,45 enabled Oskar Klein to conceptualise n+ dimensions, not as existing in the way the three dimensions do, but as ‘wrapped around themselves.’46 The process of wrapping occurs in a particular set of configurations, also called ‘manifolds.’47 Not only does each manifold afford a different way for the dimensions to wrap in or around themselves, but the number of manifolds is potentially infinite.

For Duchamp, who gave much thought to infinity, artistic technologies that articulate transversal plasticity bring together cognitive schemas, affect, and event-hood.
His name for this was cerebral erotics.48 However, his ready-made-tisations, which re-combine structural elements and relations of material to immaterial (conceptual) arrangements, are also very similar to Simondon’s ‘opening,’ predicated on technics, since the process of ready-made-tisation includes strategies as well as physical objects. In the following section, we will see how this aspect of ready-made-tisation can be further deployed by language as socio-linguistic technology.

BRECHT’S EVENT SCORES: DIMENSIONS OF TEMPORISATION

In a 1966 book entitled Chance Imagery, Brecht argues for a concurrent movement toward chance-acknowledging and chance-based methods in science and art. Discussing the second law of thermodynamics, gas kinetics, energy oscillation, and Werner Heisenberg’s principle of indeterminacy,49 he shows that chance is an underlying principle of the world from both the physical and chemical perspectives.50 As a practicing chemist and artist,
Brecht considered artistic works to be ‘products of the same complex welter of cause and effect out of which came the results of mathematical physics.’51 Having originated in John Cage’s Class in New Composition at the New School in New York from 1957 to 1959, his event scores first manifested as verbal pictograms such as:

TWO DURATIONS

- Red
- Green

1961

These scores were ‘an extension of music,’52 only, instead of music, they scored everyday occurrences and actions, which is why they may be called performative ready-mades.53 Though they originated in the Cagean score, they show a form of transitivity far more expansive than the Cagean score. For example, in Cage’s famous 4’33”, which consists of three movements marked by the Latin word tacet that, in music, indicates a period when an instrument or orchestra is not required to play, the word tacet acts as a temporal framing device for all events in time-space. Comparable to Robert Rauschenberg’s White Paintings, blank panels of white oil on canvas, which were, for Cage, ‘ways of seeing . . . mirrors of the air,’54 the reflective time-space of 4’33” was framed by music as much as the reflective time-space of White Paintings was framed by painting.

However, Brecht’s Two Durations trigger a different structuring process and an explicit ‘intermedial dialectic.’55 The score applies the rules traditionally associated with music to ‘properties’ traditionally associated with other media: colour in the case of painting, composition in the case of dance. The result is a dissolution of media-inherent properties into a web of mutually structuring relations. While the Cagean score hinges on the structural logic of a single medium—music—Brecht’s score engages multiple media and modalities of perception and execution. More importantly, the Cagean score uses language as a vehicle of perceptual transition from one medium to another.
The event score, on the other hand, extracts an artistic technology from the social (communicational and semantic) technology of language by assimilating its notion-formulating, world-making, and behaviour-enticing functions. Defined by Brecht as ‘the smallest unit of a situation,’56 the event score typically consists of almost imperceptible occurrences and actions. It brings into evidence small segments of lived reality through a particular actual-virtual entwinement, spatialisation and temporisation, as can be seen from his:

The Given and the Made

151

THREE GAP EVENTS
- missing-letter sign
- between two sounds
- meeting again
1961

Three Gap Events have no semantic content. The title points to a seeming contradiction in terms, since a gap is ‘an unfilled space or interval, a blank, a gorge, a pass, a break in continuity,’57 and an event is ‘an occurrence, a thing that happens, a fact of a thing’s happening.’58 While we could say that a gap is a spatio-temporal disruption of the continuity of forces and phenomena, an event is a spatio-temporal eruption of forces and phenomena. This settles the words’ apparent irreconcilability and indicates a semantic vacuum. In the three categories of sign, sound, and action—namely ‘missing-letter sign,’ ‘between two sounds,’ and ‘meeting again’—we can detect a variation: a written sign (an object), an acoustic occurrence (a performance), and a situation (which may include both objects and performances). However, the obvious suspension of semiosis also points to its stabilising function as a determination of a centre. As Jacques Derrida notes, the function of a centre is

not only to orient and organize the structure . . . but above all make sure that the organizing principle would limit what we might call the play of the structure. By orienting and organizing the coherence of the system, the centre of a structure permits the play of its elements inside the total form.59

While the play of semiosis allows for a plurality of interpretations, its structure—or its ‘total form,’ which is what determines the spatio-temporal scope of play—is itself predetermined by the governing principle to which semiosis is subordinated. The quest to determine the meaning of a word or sentence is thus an attempt to delineate the ‘fundamental ground’ where play is ‘constituted on the basis of a fundamental immobility which itself is “beyond the reach of play”.’60 And this is precisely where Brecht’s event scores operate as a technology that de-centres the playing field. If we approach this score as an activity rather than a static semiotic given, we will see that it consists of two parallel structuring processes—spacing and temporisation. Though intertwined, these processes can, for clarity’s sake, be described as spatio-temporal extensions of gap-ness and event-ness. On the one hand, there is the process of withdrawing, of dropping into a temporary caesura or absence. On the other, there is the process of amplifying the duration of a thing’s happening. Thus seen, the first of Brecht’s Three Gap

152

Chapter 9

Events—‘missing letter’—functions as an ongoing structuring activity of eventful suspension of that which once occupied or may have occupied the place of the missing letter sign. ‘Eventful suspension’ can be compared to the movement of ascending a descending staircase where motion may appear ‘arrested’ from the point of view of linear progression, but is, in fact, minutely structured in spatio-temporal terms. The second gap event—‘between two sounds’—operates through a similarly minutely structured, expectant suspension of audible sensations and is, in this sense, pregnant with sound, the absence of which it articulates. The third gap event—‘meeting again’—unfolds as an eventful suspension of physical, visual, aural, and/or tactile contact between beings and/or objects, thus making the act of ‘meeting again’ pregnant with the ‘volume’ of being apart. Seen in this way, each of the three contexts constituting the playing field of Three Gap Events emerges as a vibrant nexus of spatio-temporal structuring, happening, and event-hood. Brecht was of course familiar with the popular notion of events derived from physics, where an event is a point taken from three dimensions to four dimensions; a light switch is, according to this view, a three-dimensional situation, but the switching on of light is a four-dimensional one, as can be seen from his:

THREE LAMP EVENTS
- On / Off
- Lamp
- Off / On
1963

Like any other four-dimensional situation, a simple event such as the switching on of light is prone to felicitous and infelicitous occurrences. This further amplifies the event’s essentially contingent nature. For Derrida, an event is ‘nothing of which it might be possible to say: “this is” (with any complement of being) in the way that Hegel was able to say “this is an oak in which we discern the development of an acorn”.’61 Rather, an event is ‘the evaporation (échéance) of any variety of acorns . . .
unique and unforeseeable, free of ulterior expectations and in no way subject to teleological maturation: an oak that has nothing to do with any acorn.’62 The reason why Three Gap Events, like many other event scores, explodes the playing field of language and articulates its (spatio-)temporisation through transversal plasticity of sound, image, word, movement, and duration, is that, like Duchamp’s cerebral erotics, it stages indeterminacy with the aid of deterministic parameters while also reframing the relationship between the given and the made. However, the event score also foregrounds vehicularity as iterative, extendable,


recombinant, and repeatable movement that is the Simondonian ‘opening.’63 Troika’s artistic technology takes vehicularity in a different direction. It relies on re-mediation to destabilise the correlation between scientific representation and the modelling of knowledge, similarly to the way Brecht’s event scores de-centre the fundamental ground of language, only using cellular automata (units in a mathematical system) to challenge the determinism of unilinear time and evolution seen as progression from the simple to the complex.

TROIKA’S REVERSE REMEDIATION: DIMENSIONS OF DEEP TIME

Between 2013 and 2016, the artistic collective Troika—Eva Rucki, Conny Freyer, and Sebastien Noel—produced a series of works entitled Hierophany (2013), from the Greek hiero, for sacred, and phainein, to show; Calculating the Universe (2014); and A New Kind of Fate (2016). Consisting of framed compositions of black and white dice, these works were made following the rules of cellular automata. A New Kind of Fate is additionally a reference to mathematician Stephen Wolfram’s famous 2002 book about cellular automata: A New Kind of Science. The rules governing the replication of cellular automata are simple, which is why they are often used as a visual representation of complex systems, the assumption being that behaviour ‘progresses’ from simple to complex, and that this progression/evolution can be unproblematically modelled. Cellular automata are thus used in epidemiology (to study the evolution of disease epidemics), in biology (to simulate living systems), and in physics (to model fluid dynamics). They consist of cells on a grid, which exists in a finite number of dimensions. Each cell has a state; while the number of possible states is finite and can vary, the simplest scheme uses two: 0 or 1, ON or OFF. Although the simplest cellular automaton is one-dimensional, a cellular automaton can, in theory, have any number of dimensions.
The state of each cell changes in discrete steps at regular time intervals, the assumption being, again, that time is linear and progressive. At each interval, the state is dependent on the cell’s own state in the previous time step, and on the states of its immediate neighbours. The same rules can be iteratively applied to as many generations as required; the evolution of a one-dimensional cellular automaton can be illustrated by starting with generation zero in the first row, continuing with the first generation in the second row, and so forth. As there are eight possible binary configurations for a cell and its two immediate neighbours, there is a total of 256 elementary cellular automata.64 In A New Kind of Science, Stephen Wolfram divides cellular automata into four classes. In the first class, almost all patterns evolve into a stable,


homogeneous state; in the second, they transform either into a stable or an oscillating state; in the third class, initial patterns transform into a chaotic state; and in the fourth, they transform into complex structures that interact in increasingly complex ways.65 For Wolfram, there is a large class of systems—biological ones being a prime example—that ‘spontaneously generate structure with time, even when starting from disordered initial states.’66 This conclusion is derived from studying the underlying forms of proto-computation behind the emergent order in the patterns of various shells and animal coats, and from Warren Weaver’s notion of organised complexity.67 A diagram of the evolution of Rule 30, which belongs to the third class of cellular automata rules described by Wolfram, and which is equivalent, under mirroring and/or complementation, to rules 86, 135, and 149, shows a chaotic sequence.68 However, this sequence can nevertheless be modelled, and, to a degree, determined. Indeed, since Wolfram’s ground-breaking book, Rule 30 has been key to the proto-computational view of nature,69 based on the progression from simplicity to complexity. The famous comparison of Rule 30 to the shell pattern of a cone snail species aligns computation with nature in a simple, though by no means unproblematic, way.70 Troika’s transposition of abstract, computational phenomena used in complexity modelling, which, like language, is a social technology, to simple physical yet symbolically loaded objects—dice—the hallmark of chance operations and games of probability, is, like Duchamp’s Unhappy Readymade and Brecht’s Three Lamp Events, a form of fusing the literal and the metaphorical. It is also a form of remediation. Remediation is traditionally, though not solely, seen as a ‘progression’ from an older medium to a newer one.
For example, digital photography remediated analogue photography, which remediated painting, while video conferencing remediated telephony and televisual communication.71 Aligned side by side in Troika’s compositions with only one side facing the viewer, the square edges of the dice form a grid, which acts both as a literal and structural containment of randomness. In one of the Hierophany series images, triangular forms expand and disintegrate. Another seems to vibrate with monochrome noise like an old television set. In all these images—which, like Duchamp and Brecht’s artistic technologies, and like Le Corbusier’s Convent of La Tourette, lend themselves to endless modulation and recombination—the graphic alternation of black and white squares is used in a reverse remediatory gesture, similar, say, to the Balanescu Quartet (a classical string quartet) playing Kraftwerk’s 1980s electronic music.72 Other works from the same series, such as Calculating the Universe, are images constructed from thousands of black and white dice that actualise a simple binary programme or algorithm. In A New Kind of Fate, the dice are dispersed from a central axis proceeding upwards and downwards, and laid,


line by line, through discrete time steps according to the cellular automata rules and based on the states of neighbouring cells. Historically, dice have been thrown to determine fate since they rely not on chance, but on probability laws.73 However, when oracular practices apply the probabilistic logic of dice to human life, the connection is arbitrary; the six potential results can be related to any sphere of human life: work, health, family, and so on. Algorithms, by contrast, which these works reference, are a powerful technology for predicting and in fact creating everyday reality. The fate of (at least some) humans is certainly determined by algorithmic predictions of crime and of offenders’ recidivism, or by insurance premiums, made on the basis of postcodes, race, gender, and education, which effectively penalise the underprivileged part of the population74 while also reducing the opportunity for change as they blatantly assume that the future will be no different from the past.75 In A New Kind of Fate, Troika use tetrahedral dice, first used around 2600 BC in the Royal Game of Ur, the ancient board game found in the Royal Tombs of Ur in Iraq in the 1920s.76 The reason why these three works operate as an artistic technology, rather than as works that ruminate on the history of algorithms or their medially efficacious working,77 is that two kinds of ready-mades are used: dice and aesthetics. The use of tetrahedral dice points to the fact that what was once a game, or, we could say, an oracular technology, is now considered informed modelling.
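The cellular-automaton mechanics invoked above, in which each cell's next state is fixed by its own state and those of its two immediate neighbours (eight binary neighbourhood configurations, hence 2^8 = 256 elementary rules), can be sketched in a few lines of Python. This is an illustrative sketch of Wolfram's Rule 30, not a reconstruction of Troika's actual procedure; the function names and the rendering of cells as black and white squares are my own assumptions:

```python
# Sketch of a one-dimensional elementary cellular automaton. Each cell is
# 0 or 1; its next state depends on its own state and those of its two
# immediate neighbours. The 2**3 = 8 neighbourhood configurations, each
# mapped to 0 or 1, yield the 2**8 = 256 elementary rules.

def step(cells, rule):
    """Advance one generation; the grid wraps around at the edges."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def evolve(width, generations, rule=30):
    """Start from a single ON cell and collect rows, generation by generation."""
    row = [0] * width
    row[width // 2] = 1          # generation zero: one black 'die'
    rows = [row]
    for _ in range(generations):
        rows.append(step(rows[-1], rule))
    return rows

if __name__ == "__main__":
    # Render each 1 as a black square and each 0 as a white one,
    # laid out line by line like a grid of dice faces.
    for row in evolve(width=31, generations=15, rule=30):
        print("".join("■" if c else "·" for c in row))
```

Printed row by row, the output traces the familiar chaotic Rule 30 triangle, much as Troika's dice are laid out line by line through discrete time steps.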
As Orit Halpern notes in Beautiful Data, cybernetics trained people to analyse the world on the basis of what is taken to be a communicative objectivity, itself based on the repetition and sequencing of patterned forms of visualisation.78 Seen as that which guarantees ‘objectivity,’ this quintessentially aesthetic (rather than scientific) process marks an ideological shift from demonstrating to determining the processes of knowledge formation through models designed for the visualisation of complex systems, such as cellular automata.79 The aesthetics of modelling and prediction is, in this sense, a cognitive/affective ready-made. Through a reverse remediatory gesture, it shows the production of facticity through virtual means as a new relationship of the given and the made where aesthetics—as a combination of arrangements and sequences, thus both spatial and temporal—flattens the indeterminacy of space-time into a manipulable Heideggerian world-as-picture.80 By relating ancient objects-devices (dice) to cellular automata-based modelling in a temporal loop that brings deep time and the past, with all its sedimentations and iterations,81 to bear on the manufacturing of the future, Troika’s artistic technology reiterates the transversal plasticity of space-time and articulates its virtual dimension. Perhaps more than Duchamp and Brecht’s technologies, it shows that concepts, as material arrangements, reveal a vehicular and propensive domain of technics.


CONCLUDING THOUGHTS

The spatio-temporisation of non-evident dimensions that the discussed technologies exhibit may be compared to a relational technology called Legba. In the writing of anthropologist Marc Augé and of Félix Guattari, Legba is a mobile spatio-temporal system that regulates ontological and social relations: those between humans and other-than-humans, the past and the future, fecundity, opportunity, and infinity.82 In Guattari’s rendition, Legba is variously a ‘dimension of destiny, a vital principle, a materialised god, a sign of appropriation, an entity of individuation, and a material object placed at the entrance to the village or house.’83 This is similar to the Jewish mezuzah—a piece of parchment in a decorative case, inscribed with verses from the Torah. Unlike the mezuzah, however, which doesn’t change shape or material, Legba can be a handful of sand, a receptacle, a movement, or an expression of a relationship to others, living or not. In other words, Legba is a latent vehicular organiser of life—a multi-modal relational technology which ‘corresponds to the obvious fact that the social is not simply of a relational order but of the order of being.’84 Mediating between exteriority and interiority, identity and alterity,85 Legba’s working, like that of Duchamp’s cerebral erotics, Brecht’s event scores, and Troika’s reverse remediation, is transversally plastic. Yet the technical, vehicular aspect of Legba’s operation, which expands and amplifies space-time and its observable and unobservable dimensions, re-combining and/or extending them in the manner of Le Corbusier’s Convent of La Tourette, is essentially the same, in other words: repeatable.
Like numerous other Indigenous technologies with an explicitly articulated onto-epistemic dimension86 whose purpose is to navigate the manifolds,87 the technical-vehicular aspect of Duchamp, Brecht, and Troika’s artistic technologies brings the deterministic, micro-causal framing of indeterminacy to bear on multiple dimensional registers through (a process of) ready-made-tisation. It goes without saying that what is considered as the made in one generation becomes the given in the next. What is much more difficult to perceive, however, because of its complexity, is the idea of technology as that conjunction of dynamics and operational mechanisms that continually (re)arranges the world according to sur-chaotic principles88 through a ready-made-tisation of the processes of layering and change.89

NOTES

1. Henri Poincaré, The Foundations of Science: Science and Method, Book I, translated by George Bruce Halsted (New York: The Science Press, 1913), 395.


2. Ian James, ‘Techne,’ Oxford Research Encyclopedias, 2019, https://doi.org/10.1093/acrefore/9780190201098.013.121. 3. Initially coined by Samuel Taylor Coleridge, the term ‘intermedia’ was extensively theorised by the Fluxus artist Dick Higgins, for whom it denotes the space between media, such as visual poetry and action music, as well as between art media and life media, such as the ready-made. See Dick Higgins, ‘Theory and Reception,’ in The Fluxus Reader, edited by Ken Friedman (Hoboken, New Jersey: Wiley, 1998), 222. 4. Frederick Copleston, A History of Philosophy, volume 1 (London: Continuum, 2003), 458–62. 5. Niels Bohr, ‘Causality and Complementarity,’ in Causality and Complementarity: The Philosophical Writings of Niels Bohr, edited by Jan Faye and H.J. Folse (Woodbridge, CT: Ox Bow Press, 1998), 83–92; and ‘Discussions with Einstein on Epistemological Problems in Atomic Physics,’ in Albert Einstein, Philosopher–Scientist: The Library of Living Philosophers, volume 7, edited by P.A. Schilpp (Evanston: Open Court, 1949), 201–41. See also Karen Barad’s commentary on this in Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning (Durham: Duke University Press, 2007), 195–96 and 348–50. 6. Such as working in bio-labs as bio-artists Eduardo Kac, Oron Catts, and Ionat Zurr do. 7. Here I am relying on Barad’s transdisciplinary interpretation of Bohr’s thought, which brings critical theory to bear on quantum mechanics and differs significantly from other notable interpretations, for example, Jan Faye’s in such works as Niels Bohr: His Heritage and Legacy, an Anti-Realist View of Quantum Mechanics (Dordrecht: Kluwer Academic Publishers, 1991). 8. See, for example, Mónica Amor’s recent article on Philippe Parreno, among others: ‘From Plasticity to Elasticity: Philippe Parreno’s Permanent Revolution,’ OBOE Journal, 2, no. 1 (2021). 9.
An example of this is the use of biometrics in refugee management, which welds data collection and somatic quantification to such existential concerns as food and shelter. See Btihaj Ajana, ‘Biometric Datafication in Governmental and Personal Spheres,’ in Big Data—A New Medium?, edited by Natasha Lushetich (London: Routledge, 2020), 63–79. 10. Gilbert Simondon, Extract of On the Mode of Existence of Technical Objects, translated by Nandita Mellamphy et al, Deleuze Studies 5, no. 3 (2011): 415. 11. Usually translated as ‘there is’ or ‘it gives,’ es gibt refers to the given-ness of the world. In Martin Heidegger’s work, ‘es gibt’ implies that there is something, which gives this ‘gift’ or given-ness to something or someone. See Being and Time, translated by John Macquarrie and Edward Robinson (Oxford: Blackwell Publishers, 1962). 12. For Henri Bergson, ‘les données,’ the given, is associated with the immediate data of consciousness. The given is a temporal multiplicity. See Time and Free Will: An Essay on the Immediate Data of Consciousness, translated by F.L. Pogson (Montana: Kessinger Publishing Company, 1910).


13. Simondon, Extract of On the Mode, 419. See also the thematisations of the given in Parisi, ‘Transcendental Instrumentality and Incomputable Thinking,’ and Sha, ‘Adjacent Possibles: Indeterminacy and Ontogenesis,’ both in this volume. 14. Simondon, Extract of On the Mode, 421. 15. Ibid. 16. Gilbert Simondon, ‘Technical Mentality,’ translated by Arne de Boever, Parrhesia 7 (2009): 17. 17. Ibid, 24. 18. Sainte-Marie de La Tourette, located near Lyon and completed in 1961, was constructed with rectangular concrete blocks that indicate a modifiable structure. 19. Simondon, ‘Technical Mentality,’ 25. 20. Le Corbusier, Towards a New Architecture, translated by John Goodman (Mineola: Dover Publications, 1986 [1927]). 21. Simondon, ‘Technical Mentality,’ 22. 22. Ibid. 23. Although Marcel Duchamp is generally credited with the ‘invention’ of the ready-made, the first ready-made was in fact Alphonse Allais’s Pimps in their Prime, Belly in the Grass, Drinking Absinth, a green carriage curtain, dated before 1897. See Michel Onfray, Les anartistes (Paris: Albin Michel, 2022), 144. Translation mine. 24. For more information see Natasha Lushetich, ‘The Event Score as a Perpetuum Mobile,’ Text and Performance Quarterly, 32, no. 1 (2012): 1–19. 25. In conceptual art, concepts are treated as artistic materials. See Sol LeWitt, ‘Paragraphs on Conceptual Art,’ Artforum, 5, no. 10 (Summer 1967): 79–84. 26. In Object-Oriented Ontology, objects exist independently of human perception, are not identical with their properties, and cannot be ontologically exhausted by their relations to humans or other objects. See, for example, Graham Harman, The Quadruple Object (New York: Zero Books, 2011). 27. Takehisa Kosugi notes: ‘take this chair as an example. Maybe it has another part and it is exposed to another dimension, but we cannot see it. On the physical stage, it’s just a chair.’ Kosugi cited in David T.
Doris, ‘Zen Vaudeville: A Medi(t)ation in the Margins of Fluxus,’ in The Fluxus Reader, edited by Ken Friedman (Chichester: Academy Editions, 1998), 104. 28. See C.P. Snow, The Two Cultures and the Scientific Revolution (Connecticut: Martino Fine Books, 2013 [1959]). 29. See Thomas Kuhn’s The Structure of Scientific Revolutions (Chicago: University of Chicago Press, 1996 [1962]). 30. Both conceptual art, which rarely relies on induction, and Bruno Latour and Steve Woolgar’s analysis of scientific processes as, often, inductive, are examples of the limitations of this claim. See Bruno Latour and Steve Woolgar, Laboratory Life: The Construction of Scientific Facts (Princeton: Princeton University Press, 1986 [1976]). 31. See for example Cristina Chimisso, Gaston Bachelard: Critic of Science and Imagination (London: Routledge, 2014). 32. See Peter Galison and Caroline A. Jones, eds., Picturing Science, Producing Art (London: Routledge, 1998)


33. Henri Poincaré, The Three-body Problem and the Equations of Dynamics, translated by Bruce D. Popp (Cham, Switzerland: Springer, 2017). 34. This work, also referred to as Le Delai (Delay), consists of many small causal relationships; however, the overall impression is often one of incongruity. 35. Calvin Tomkins and Marcel Duchamp, Marcel Duchamp: The Afternoon Interviews (New York: Badlands Unlimited, 2013), 93. 36. Ibid. 37. Ibid. 38. Dimitrios Kodokostas, ‘Proving and Generalizing Desargues’ Two-Triangle Theorem in 3-Dimensional Projective Space,’ Geometry, 2014, https://doi.org/10.1155/2014/276108. 39. See note 3. 40. An example of a language game is the bride—whose ‘bodily envelope,’ to use Duchamp’s words, is ‘auto-mobilistic in nature.’ Marcel Duchamp, La Première recherché (Paris: Editions du Centre Pompidou, 1994), 62. 41. Duchamp quoted in Arturo Schwarz, La Mariée mise à nu chez Marcel Duchamp, même (Paris: Georges Fall, 1974), 157. 42. This work consisted of an old wooden door, nails, bricks, brass, aluminium, velvet, leaves, a female form made of parchment, hair, glass, oil paint, a landscape composed of photographed elements, and an electric motor rotating a perforated disc. 43. This is a theme Duchamp keeps going back to, in, among other sources, Tomkins and Duchamp, The Afternoon Interviews (New York: Badlands Unlimited, 2013). 44. John H. Schwarz, ‘Recent Developments in Superstring Theory,’ Proc Natl Acad Sci USA, 95, no. 6 (1998): 2750–57. 45. M.Y. Thiry, ‘Les équations de la théorie unitaire de Kaluza,’ Compt. Rend. Acad. Sci. Paris, 226 (1948): 216–18. 46. Finn Ravndal, ‘Oskar Klein and the Fifth Dimension,’ (September 2013), https://doi.org/10.48550/arXiv.1309.4113. 47. Hyun Seok Yang and Sangheon Yun, ‘Calabi-Yau Manifolds, Hermitian Yang-Mills Instantons and Mirror Symmetry,’ (July 2011), https://doi.org/10.48550/arXiv.1107.2095. 48. Duchamp and Tomkins, The Afternoon Interviews, 43. 49.
Which showed that it was not possible to determine both the position and the momentum of an electron at the same time: as the measuring precision of an electron momentum was increased, the precision with which the position of the electron was measured decreased, and vice versa. 50. George Brecht, Chance-Imagery (New York: Great Bear/Something Else Press, 1966), 14–16. 51. Ibid, 16. 52. Brecht in Henry Martin, An Introduction to George Brecht’s Book of the Tumbler on Fire (Milan: Multhipla Edizioni, 1978), 84. 53. See Lushetich, ‘The Event Score.’ 54. John Cage quoted in Rudolf Frieling et al., The Art of Participation from 1950 to Now (San Francisco: Museum of Modern Art, 2008), 82.


55. Dick Higgins, The Dialectic of Centuries: Notes towards a Theory of New Arts (Rochester: BOA Editions, 1999 [1978]). 56. George Brecht, ‘The Origin of Events,’ in happening & fluxus (exhibition catalogue), edited by Hans Sohm (Cologne: Kolnischer Kunstverein, 1970). 57. H.W. Fowler and F.G. Fowler, The Concise Oxford Dictionary of Current English (Oxford: Oxford University Press, 1964), 503. 58. Ibid, 416. 59. Jacques Derrida, Writing and Difference, translated by Alan Bass (London: Routledge, 2008 [2001]), 352. 60. Ibid. 61. Jacques Derrida, Mémoire pour Paul de Man (Paris: Galilée, 1988), 152. 62. Ibid. 63. Simondon, ‘Technical Mentality,’ 24. 64. Stephen Wolfram, ‘Statistical Mechanics of Cellular Automata,’ Rev. Mod. Phys. 55 (July 1983): 601, https://journals.aps.org/rmp/abstract/10.1103/RevModPhys.55.601; Stephen Wolfram, A New Kind of Science (Wolfram Media, 2002). 65. Wolfram, A New Kind of Science. 66. Wolfram, ‘Statistical Mechanics,’ 16. 67. In his 1948 article ‘Science and Complexity,’ Weaver suggests that whereas in disorganised complexity the interaction between different elements is random, organised complexity establishes a relationship of simplicity to complexity. 68. Wolfram, A New Kind of Science. 69. See, for example, Eric J. Chaisson, Cosmic Evolution: The Rise of Complexity in Nature (Cambridge: Harvard University Press, 2001). 70. See, for example, Pierre Huyghe’s work, such as the 2017 After ALife Ahead: https://www.estherschipper.com/artists/41-pierre-huyghe/works/15049/. 71. Richard Grusin and Jay David Bolter, Remediation: Understanding New Media (Cambridge, MA: MIT Press, 1999). 72. See Nick Collins et al., Electronic Music (Cambridge: Cambridge University Press, 2013), 91. 73. And their use in experimental composition, for example, John Cage’s dice-based compositions. 74. Cathy O’Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (London: Penguin, 2016), 41. 75.
Ibid. 76. Troika, ‘A New Kind of Fate,’ http://troika.uk.com/work/troika-new-kind-fate-dice-work/. 77. See Natasha Lushetich, ‘Algorithms and Medial Efficacy,’ The Philosophical Salon, January 13, 2020, https://thephilosophicalsalon.com/algorithms-and-medial-efficacy/. 78. Orit Halpern, Beautiful Data: A History of Vision and Reason since 1945 (Durham: Duke University Press, 2014), 148. 79. See, for example, Paolo Cirio on challenging the depict-ability of complex systems through simple visual forms: https://paolocirio.net/press/texts/evidentiary-realism.php.


80. See Martin Heidegger, The Question Concerning Technology and Other Essays, translated by William Lovitt (New York: Harper Colophon Books, 1977), 129–30. 81. On the entanglement of sedimentation, duration, and perception, see Henri Bergson, Matière et mémoire (Paris: Les Presses Universitaires de France, 1939). 82. See also David Zeitlyn’s chapter on oracular technologies in this volume. 83. Félix Guattari, Chaosmosis: An Ethico-Aesthetic Paradigm, translated by Paul Bains and Julian Pefanis (Bloomington: Indiana University Press, 1993), 45–46. 84. Ibid, 46. 85. Ibid. 86. See, for example, Gregory Cajete’s ‘Relational Philosophy: The Stars are Our Relatives,’ in Distributed Perception: Resonances and Axiologies, edited by Natasha Lushetich and Iain Campbell (London: Routledge, 2021), 17–30. 87. See note 46. 88. See the prologue in this volume and Quentin Meillassoux, ‘Métaphysique, spéculation, corrélation,’ in Ce peu d’espace autour: Six essais sur la métaphysique et ses limites, edited by Bernard Mabille (Paris: Les Éditions de la Transparence, 2010), 299. 89. Gilbert Simondon, Du mode d’existence des objets techniques (Paris: Aubier, 1989 [1958]).

Chapter 10

Ananke’s Sway
Architectures of Synaptic Passages
Stavros Kousoulas

Do gods have mothers? The ancient Greeks would respond affirmatively; moreover, they would claim that the primordial deity, the mother of gods, humans, and their fates alike is the goddess Ανάγκη (Ananke). Ananke emerged at the very dawn of creation, in a serpentine entanglement with her brother Chronos, the personification of Time. Interestingly, Ananke stands for a very particular notion, one that might surprise the reader who wonders why she would be acknowledged as the mother of gods. The etymology of her name stems directly from the noun that stands for ‘necessity’: the mother of gods, the sister of time, is necessity. To make things even more peculiar, the same noun has more meanings, standing simultaneously for necessity, force, and, crucially for my argument, constraint. From a philosophical perspective, dating from early ancient Greek philosophy up to Sigmund Freud and Norbert Wiener, Ananke has often been presented as the personification of determinism. Especially for Wiener, the prominent figure of early cybernetics, Ananke always stood opposed to Tyche, the goddess of chance and unpredictability, or what he called quantum indeterminacy.1 Through an architectural reconsideration of necessity, I will highlight that it should be neither conflated with determinism nor opposed to indeterminacy. On the contrary, by connecting necessity with constraints and both with a radical understanding of synapses, as well as by referring to figures such as Gilles Deleuze, Félix Guattari, and Raymond Ruyer, I will propose a Simondonian—and therefore, informational—re-evaluation of architecture that makes constraints a presupposition for the emergence of any sensitivity to what we call indeterminacy.2


THE TRUTH IS DEAD, LONG LIVE THE TRUTH

As the artist and researcher Patricia Reed reminds us, in the most straightforward way, necessity can be understood as that which cannot be otherwise: anything that is not necessary is contingent.3 Reed continues by claiming that necessity is axiomatic, insofar as what is necessary remains so regardless of situational specificity, and furthermore it is resistant to contradiction, logically speaking. Necessity, writ large, operates as a conceptual and/or material constraint, since it determines what is not freely negotiable, nor subject to alterability.4

Following Reed, the connection between necessity and constraints becomes more obvious: if something cannot be otherwise, then its resistance to change stands as a determining factor for any potential interaction with it or with any system that it is part of. Moreover, for Reed there are two fundamental types of necessities: alethic and non-alethic, both terms originating etymologically from the Greek word αλήθεια (aletheia), which stands for truth. In other words, there are necessities that are absolutely true regardless of context and necessities that are non-absolute and context sensitive.5 Following this distinction, to claim, for example, that human life depends on nutrition or that plants depend on solar energy is an alethic necessity; to claim that a good human life depends on this kind of nutrition or that kind of architecture is a non-alethic necessity, precisely because it essentialises and reifies a particular (any particular, for that matter) understanding of what it entails to live a good life. The ingenuity of Reed is that she makes clear—through a line of argumentation that for practical reasons I will not follow here—that there has been a continuous naturalisation of necessity, especially since Darwin’s theory of evolution started gaining ground in the late nineteenth century, leading to a biologically based reconsideration of many other disciplines. Put succinctly, there is a constant and deliberate confusion where non-alethic necessities are intentionally taken for alethic ones and serve as the supreme source of legitimacy for diverse contemporary political, economic, and social structures.6 Perhaps another way to address non-alethic necessities would be as norms and values: established patterns of action and desired outcomes of actions. Why would one suggest such a twist?
Because it is through action itself that the binary between alethic and non-alethic necessities can be dismantled, allowing for neither a naturalisation of necessity nor for all kinds of anthropomorphic reductionisms. On the contrary, by developing an account of norms and values that presupposes only the sheer affective power of action, a different, reverse trajectory may be outlined: how can one de-naturalise alethic necessities, in that sense destabilising the very foundation of any (scientific

or other) determinism? An illuminating—pun intended—example in this direction is given by philosopher Joel White: what if we were to seriously consider that the Sun is actually dying?7 By reviving Nietzsche’s interest in the heat death of the Sun, White claims that ‘if the Sun as a metaphor of the Form of Forms grounds and engenders thought’s truth, but it is seen to be dying, then, as Nietzsche argues, epistemological truth inverts into a lie.’8 However, White will use the inevitable (?) entropic death of the Sun to take this inversion of truth one step further: it is not only epistemological truth that is inverted, but all kinds of truth, especially ontological truths—or, said differently, alethic necessities. As White writes: the truth of entropic living/dying analogically represented by this living/dying star that we call our Sun, allows us to grasp the truth of entropic form. While heat death cannot be for us an object of experience, it is certainly with us. The entropy of the universe increases each time we heat our houses, think our thoughts, and let our coffees get cold. Indeed . . . this absolute oblivion is the ‘unpalatable truth’ at the end of our search for knowledge. It is this truth that inverts all other truths to lies.9

What White essentially claims is that if we agree on the truth of the Second Law of Thermodynamics—what is called entropy—then we are unwittingly abolishing the very notion of truth itself: being able to envision an entropic death is only possible because of the continuous, constant, and stubborn efforts to avoid the inevitable, to avoid entropy itself. How does one avoid the inevitable? By introducing negative entropy. If negative entropy is what makes any form of structural and operational organisation possible—from rocks and houses to our hearts and our institutions—then truth itself is no longer stable and fixed: truth becomes metastable and auto-normative.

THE SPINDLE OF ANANKE

If, therefore, the truth of negative entropy—which, following its use in information theories and for the sake of this chapter’s economy, I will call negentropy—is the only viable truth, in the ontological sense of the term, then, recounting Reed’s argument, negentropy becomes the primary alethic necessity out of which all our non-alethic necessities emerge. In other words, our norms and values are continuously produced negentropically. This implies a drastic shift: if negentropy is the only truth then no truth is ever stable, or, in terms closer to our interest, pre-determined. Alethic necessities are de-naturalised on the basis of the immanent contingency of negentropic interventions on the very fibres of the cosmos. Perhaps it is not a coincidence

that our primordial goddess, Ananke, was always depicted holding a spindle. The spindle of Ananke was assembled from a shaft and a steel hook, while the whorl was made of different metals and other materials. Moreover, the whorl was of an unusual kind, being made of many entangled pieces that, in their entanglement, compose the cosmos itself. This is how Plato describes it in his Republic: The nature of the whorl is like this: its shape is like those we have here; but, from what he said, it must be conceived as if in one great hollow whorl, completely scooped out, lay another like it, but smaller, fitting into each other as bowls fit into each other; and there is a third one like these and a fourth, and four others. For there are eight whorls in all, lying in one another with their rims showing as circles from above, while from the back they form one continuous whorl around the stem, which is driven right through the middle of the eighth.10

Each of these whorls stands for a different planetary body, from the moon and the Sun to Saturn and Mercury, Venus, Mars, and Jupiter, while the largest is the cosmos, the fibrous universe itself. Adding to the list of coincidences, philosopher Raymond Ruyer, author of the seminal Neofinalism, has also conceptualised the universe as having a fibrous structure. His approach, nonetheless, prioritises a much more active and immanent entanglement. Ruyer claims that the fibrous structure of the cosmos is one that structures itself in time, with each fibre expressing a continuous line of individuation.11 With Ruyer, the universe starts to fold, in space and in time, and what is produced in these folds is never separated from the whole itself, even though it allows for the whole to differentiate, to rearrange itself in the particular manner that each fold is actualised and expressed. Besides understanding the universe as fibrous, Ruyer also asks us to understand matter as activity. Understood this way, matter becomes inseparable from time, since ‘time can no longer appear as an empty and foreign frame; the time of action is inherent to this action as a temporal melody.’12 Therefore, for Ruyer, the universe in its fibrous structure is: the expression of lines of activity and not lines of subsistence. The subsistence of things derives from their activity; it is not required a priori by reason or virtue of a principle such as ‘nothing is lost, nothing is created.’ Activity in its unfolding is not subject to deterministic causality.13

The very folding activity of those fibrous individuations generates a rhythm that pulsates through them and potentialises their futural movement: the manner, the style of Ananke’s intricate hand movements, her bodily, affective engagements with the spindle itself, express the cosmos as it is, not because it is supposed to be expressed as such, but because the only truth it obeys is the truth of negentropic contingency, expressed in what Ruyer understands

as lines of activity. If we wish to examine the complexity of activity—neither the infinitely small nor the sublimely vast—then we need to focus on the cross-scale interactions between different fibres, between the different whorls, from large to small and back, that in their coming together through constraints eventually produce new constraints themselves. To do so, let us take a basic science detour.

The First Law of Thermodynamics postulates the conservation of matter and energy, claiming that neither can be created nor destroyed. While they shift from one form to another, the total amount of matter and energy within the universe will always remain the same. The Second Law, in turn, claims that all energy tends gradually towards more diffused and less structured forms. The certainty of the Second Law is almost absolute, and it is precisely this ‘almost’ that allows for the emergence of any form that has a degree—even a minimum degree—of structural and operational coherence and consistency. This movement against the grain is what we have already called negentropy.

At this moment things take an interesting turn: what assists in the preservation of the Second Law (increase in entropy) or what contributes to its disrespect (negentropy) is how different constraints—from different scales as well—are coupled together. How the large-scale constraints of Ananke couple with the small-scale constraints of her hand movements as well as with the constraints of the spindle itself determines—as in produces—the very activity of the cosmic whorls, and, in doing so, sticks out its tongue at the Second Law. What, however, do constraints do?
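The two laws of this detour have canonical compact statements; the formulas below are standard textbook forms, including Prigogine's entropy balance for open systems, and are not drawn from the chapter itself:

```latex
% First Law: energy is conserved, only transformed
\Delta U = Q - W
% Second Law: the total entropy of an isolated system never decreases
\Delta S_{\mathrm{universe}} \geq 0
% Open subsystems: entropy change splits into exchange and production terms
dS = d_{e}S + d_{i}S, \qquad d_{i}S \geq 0
% Local order (negentropy) requires exporting entropy: d_{e}S < 0 with |d_{e}S| > d_{i}S
```

On this standard reading, the negentropic 'movement against the grain' never violates the Second Law globally: a rock, a house, or a heart keeps its coherence only by exporting its disorder to its surroundings.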
Following neuro-anthropologist Terrence Deacon, constraints can be colloquially understood as external limitations: imposed factors that reduce options and possibilities.14 Deacon immediately suggests that we should refrain from referring to constraints as external, since any extrinsic account of constraints assumes that there is always an n + 1 dimension that imposes them. Instead, as Deacon claims, it is useful to keep in mind the etymology of the term: it comes from the Latin constrictus, past participle of constringere, standing for ‘that which binds together.’15 In other words, constraints bring strings together, binding fibres of individuation while Ananke spins her spindle. Consequently, one way to understand constraints is as reduced variety that nonetheless allows for the emergence of novelty. This might seem contradictory, since reduction in variety implies a decrease in attributes. However, as Deacon underlines: when some process is more constrained in some finite variety of values of its parameters or in the number of dimensions in which it can vary, its configurations, states, and paths of change will more often be ‘near’ previous ones in the space of possibilities, even if there is never exact repetition.16

This is the entry point into understanding constraints, especially in their connection to synapses: much to the satisfaction of philosopher Gilles Deleuze, constraints allow for a difference to repeat itself, forcing itself to differ so as to cross through the fibrous spinning of the spindle. Consequently, by forcing itself to differ, difference generates the capacity to intensify activity, precisely because it generates the need for the creation of new constraints that will regulate its passages. As Deacon explains: it is only because of a restriction or constraint imposed on the release of energy . . . that a change of state can be imposed by one system on another. It is precisely by virtue of what is not enabled, but could otherwise have occurred, that a change can be forced. . . . So, the nature of the constraint . . . indicates which differences can and cannot make a difference in any interaction. This has two complementary consequences. Whenever existing variations are suppressed or otherwise prevented from making a difference in any interaction, they cannot be a source of causal influence; but whenever new constraints are generated, a specific capacity to do work is also generated.17
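Deacon's point, that a constraint is a reduction of variety which thereby determines which differences can make a difference, can be given a minimal numerical gloss in Shannon's terms. The sketch below is illustrative only: the numbers, and the identification of constraint with entropy reduction, are mine rather than Deacon's.

```python
import math

def shannon_entropy_uniform(n_states: int) -> float:
    """Entropy, in bits, of a uniform distribution over n_states configurations."""
    return math.log2(n_states)

# An unconstrained process: eight equiprobable configurations.
unconstrained = shannon_entropy_uniform(8)  # 3.0 bits of variety

# A constraint adds nothing; it rules configurations out, leaving two accessible.
constrained = shannon_entropy_uniform(2)    # 1.0 bit of variety

# The reduction in variety measures what the constraint specifies about
# which differences can still make a difference.
information = unconstrained - constrained
print(information)  # prints 2.0
```

The point of the toy is direction, not magnitude: on this reading, imposing a constraint and gaining information are the same operation seen from two sides.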

Through activity—wild and free from anything external to it—constraints are imposed on one another, folding onto each other, being of the whole while allowing the whole to rearrange itself precisely because a constraint at one level will assist in the emergence of a constraint on another. It is in the manner in which constraints fold that a negentropic stubbornness allows for the synaptic passage of an intentionality that is dependent only on its sensitivity to indeterminacy; and that manner, that specific style that each negentropic effort expresses, is what allows for the passing through of an intensity that literally informs the cosmos.

ONE TECHNICITY AWAY

One might wonder what this has to do with architecture proper. The response will not be clear—it is not meant to be clear—but it involves a radical re-evaluation of architecture itself. The reason for such a re-evaluation is not discursive, not confined within the all-too-rigid boundaries of architecture as a discipline. On the contrary, by understanding architecture otherwise, we will be able to become further attuned to Ananke’s delicate movements. The first and most crucial step is to pluralise the architectural act itself and position it in terms of what philosopher Gilbert Simondon calls technicities. If we aim to avoid reductionism, we should, Simondon advises us, take our study beyond technical objects to the technicity of these objects as a mode of relation between humans and world.18 In this sense, one can move from architectural

objects to architectural technicities that operate in terms of reticularity: the immediate relation of events and actions that occur in a given structure, which is, however, understood in terms of its potentials for action and has to be studied in affective terms. In his contribution to this volume, Andrej Radman briefly and accurately claims that technicity could be understood as ‘evolution by means other than life.’ Simply put, technicity deals with how humans relate to and transform their environment through technology and how these relations in turn transform each of them—humans, technology, and environment.19 As such, thinking with technicities is a radically immanent way to approach the coupling of different constraints—from anatomical to technological to environmental—without imposing any n + 1 dimension; in this regard, architecture—in the very act of architecting—is privileged with an extremely valuable insight into both the folding of constraints and the negentropic rearrangement of the cosmos. Let us examine an architectural technicity: the process of tiling a floor. Strangely, this humble example explains, in an astonishing manner, the coupling between constraints. It does so because it responds to the most basic cohomological problem: how do modular quantities, distributed under only local constraints, fit together globally over the manifold that they attempt to cover? Moreover—and this is where any technicity emerges—how might the shape of the manifold be remodelled so that previously ill-fitting modules now cover it perfectly?20 If we assume that there is a limited number of tiles (First Law), the problem is how to cover a floor with given dimensions using that exact number (Second Law).
In this sense, ‘the cohomology problem is how to find a distribution function by which the tiles will exactly fit the room without being added to or subtracted from.’21 The initial condition is always constrained by what the final condition must be, while the boundaries set by the First Law can be satisfied or violated by the Second Law only to give birth to radically unexpected boundary conditions.22 This is what the problem of cohomology—and architecture’s capacity to resolve it—teaches us: the cosmos might be materially and energetically closed, but it always remains relationally open. One can always relate the tiles differently (in a different manner, a different style) in order to go against the Second Law while respecting the necessity of the First. In other words, while no matter or energy can be introduced or disappear, there is literally no limit when it comes to the potential differential relations between them: the cosmos has an infinite potential for individuating otherwise precisely because its motor is information. Novelty is always one technicity away. What I argue is that architecture, in its technicities, harvests differential relations and therefore produces information. Simondon claims that information—far from its unfortunate confusion with data—is a universal process that concerns all being, and is the formula for individuation, the sense according

to which a system individuates.23 It is a requirement for individuation, but it is never a given thing to be measured in bits and bytes, words or numbers. In simple terms, information is a difference that can make a difference.24 In even simpler terms, it is the potential that can energise a potential: what sort of and how much intensity in the differential relations between matter and energy is needed for a transformation to occur. As such, for Simondon information becomes synonymous with significance, with meaning. Nothing is informational by birthright, nor does anyone get to be informed in the same way. What matters is neither the emitter, nor the message, but a particular state of the receiving system that needs to be metastable enough in order to make becoming-informed possible. A metastable system ‘is traversed by potentials and powers, or by energy gradients and inherent tendencies,’ so that at any moment ‘the most minute imbalance, or the most fleeting encounter, can be enough to set things in motion’ and lead to a systemic transformation.25 Therefore, as Simondon writes: information is never relative to a single and homogeneous reality but to two orders in a state of disparation: information . . . is never deposited in a form that is able to be given . . . it is the signification that will emerge when an operation of individuation will discover the dimension according to which two disparate reals can become a system.26

It is for this reason that Simondon asks us to replace the notion of form with that of information, and to suppose the existence of a system in metastable equilibrium that has the energetic potential to further individuate.27 Systems that are governed by linear causality, systems that are full of comfortable and familiar alethic necessities, are ultra-stable, and, simply put, do not evolve; they merely succumb to entropy. On the contrary, stubborn systems, those that Ananke has blessed, are metastable, and, because of that, full of information: no longer supposedly pregnant geometrical forms—those that all architects are familiar with—but significative forms that establish a transformative order within a system that has the capacity to transform both itself and its world.28 In a system of stable equilibrium, the tiles match the floor perfectly and therefore make the whole fully homogeneous. In such an exhausted (and exhausting) homogeneity there is no activity precisely because it is not needed: stable systems are systems where there are no necessities; Ananke no longer sways and weaves her whorls. It is for this reason that Simondon will claim that evolution has nothing to do with perfection—which is just a fancy word for absolute homogeneity, and therefore a system’s death. For Simondon, evolution is an informational integration, the maintaining of a metastability that settles more and more upon itself and, in doing so,

accumulates potentials for further structural and operational individuations.29 To counter death, to fight the inevitable, Simondon suggests that: from the άπειρον [apeiron] before individuation to the άπειρον after life, from the undetermined of the before to the undetermined of the after, from the first dust to the last dust, an operation is carried out that does not break down into dust; life is in its present, in its resolution, not in its remainder.30

Life is in its negentropic activity, in its present that is informationally meaningful, because it allows for yet another and yet different attempt at tiling the cosmic floor. However, once again we are in for a surprise: something is meaningful only when it is constrained. As biologist Stuart Kauffman writes, ‘constraints are information and information is constraint.’31 One of architecture’s greatest lessons is that in order to enhance life, in order to make it meaningful, you need to negentropically constrain it. From gathering around a fire in the middle of the night to erecting skyscrapers or arranging the placement of our everyday furniture, architecture is always a process of cohomological floor tiling. In the architectural coupling of existing constraints, new constraints are introduced that get to become informative—get to be meaningful—by reducing our options (from infinity to infinity minus one) and, ironically, proliferating our affective capacities in doing so. It is with architecture—what Deleuze calls the first Art—that constraints act as what they truly are: synapses.32

OF NORMS AND VALUES

As hinted, one way to understand non-alethic necessities—everything that the alethic constraint of negentropy produces—is as norms and values: established patterns of action and desired outcomes of actions. For Simondon, it is the act itself that produces and is simultaneously produced by norms and values. As he claims: values are that through which the norms of a system can become the norms of another system through a change of structures; values establish and make possible the transductivity of norms, not as a permanent norm that is nobler than the other—for it would be quite difficult to discover a norm that was already truly given—but as a meaning of the axiomatic of becoming that is conserved from one metastable state to the next.33
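The figure of 'cohomological floor tiling' also admits a toy computational illustration of the earlier claim that the cosmos is materially closed yet relationally open: fix the tiles, and the admissible arrangements still proliferate. The example, dominoes on a 2 x n floor, is a standard combinatorial exercise of mine, not something drawn from the chapter's sources.

```python
def domino_tilings(n: int) -> int:
    """Count the exact tilings of a 2 x n floor by 2 x 1 dominoes.

    Every tiling uses exactly n dominoes: nothing is 'added to or subtracted
    from' (closure of matter), yet the number of distinct arrangements grows
    like the Fibonacci sequence (openness of relation).
    """
    if n <= 0:
        return 1  # the empty floor has exactly one (trivial) tiling
    a, b = 1, 1   # tilings of widths 0 and 1
    for _ in range(n - 1):
        # A width-n tiling ends either in one vertical domino (width n-1 left)
        # or in two stacked horizontal dominoes (width n-2 left).
        a, b = b, a + b
    return b

print(domino_tilings(10))  # prints 89: ten identical tiles, eighty-nine relations
```

Matter stays fixed; only the differential relations between the tiles vary, and it is in that variation that the floor can always be tiled otherwise.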

Close to Reed’s argument against the naturalisation of alethic necessities—which is essentialist and therefore moralistic—Simondon examines

individuation only on the principle of individuation itself, developing an ontogenetic account of the acts, norms and values that propel it. To do so, he clarifies that: [N]orms and values do not exist prior to the system of being in which they appear; they are becoming, instead of appearing in becoming without being part of becoming; there is a historicity of the emergence of values, just as there is a historicity of the constitution of norms. Ethics cannot be recreated based on norms or based on values, no more than the being can be recreated based on the forms and matters to which abstractive analysis reduces the conditions of ontogenesis. Ethics is the requirement according to which there is a significative correlation of norms and values. To grasp ethics in its unity requires that one accompany ontogenesis: ethics is the meaning of individuation, the meaning of the synergy of successive individuations.34

In the coming together of norms and values that the act of any technicity implies, a sheer affective power emerges: a technicity’s potentia and potestas. As philosopher David Scott explains when summarising Simondon’s argument, potentia is operational and pre-individual power while potestas is structural and actualised power.35 In his words, ‘structural power (potestas) organizes operational power (potentia) by structuring it; however, potentia is the engendering determination of a determinable potestas, structure.’36 As such, Simondon will claim that, similar to the way that potentia informs potestas and vice versa, norms and values possess no moral degrees. This allows him to claim that even if there were such a thing as an alethic moral constraint, it would lie neither in the norms nor in the values alone, but in their differential relation—their intensive informational exchange.37 In an informational account of non-alethic necessities, Simondon highlights the importance of what he terms auto-normativity.38 To explain what auto-normativity stands for, Simondon uses the example of a hiker in a forest. Each step a hiker takes when walking in the woods is its own consequence: it is self-constitutive. The act of walking itself does not include any intrinsic directionality, any form of inherent compass that will orient the hiker.39 Likewise, if the hiker gets lost, it is not possible to depend on any familiar and recognisable exterior norm. In other words, for a hiker in the woods there are ‘no norms, no set rule of direction, every step, in every direction, is equiprobable and equivalent at once.’40 From an infinity of directions, the first step—as the act of hiking-in-the-woods—becomes the norm itself: every step that follows it builds on the relation of the step before it, one after the other leading the hiker to the edge of the forest. This is what Simondon has in mind when he claims that ‘the norm is derived from the act. . . . Every act, anomic

from its absolute origin, valorises itself in an autogenous fashion because it continues and rests, consequently, more and more on itself.’41 As such, the norms and values of any technicity—including those architectural ones that dictate how to tile the cosmic floor—are not merely co-determinable; they are fundamentally contingent. What is crucial, however, is how the act itself will allow for the synaptic passage of a mnemic theme (a memory of the future) that will fold constraints upon constraints and, in doing so, will produce novel necessities that in their informational intensity demand a new rearrangement of the cosmos. Therefore, the act of any technicity in its eventuating power becomes the a praesenti principle of individuation, the moment where the given a posteriori becomes the giving a priori. The mnemic theme that synaptically crosses through is a virtual theme (in the Deleuzian use of the term) and as such a theme of potentia, of operational power. Consequently, what becomes crucial is the act of the step itself: the moment where the cosmos is still undecided as to what it was and what it will be, the moment where Ananke blinks for a second. If every step in the dark cosmic forest is equiprobable and equivalent at once, it is because every step is equipotential. It is not yet what it will become when it is put in circuit with a virtual mnemic theme, with the rhythms of the technicities in the a praesenti of their inventive capacities. Nonetheless, what is at stake is the question of how to be placed in contact with this virtual theme and its productive contingencies. How can we approach the synaptic passages of this cosmic futural memory and, out of them, intuit the lines of individuation that they catalyse? 
SYNAPTIC PASSAGES

Simondon will claim that what one perceives is neither outlines nor shapes, but thresholds of intensity, pointing out that sensation is simultaneously intensive and differential; sensation is the ‘grasping of a direction, not of an object.’42 But the question remains of how we can examine the sensation of a direction that does not address the present but rather that which is yet to come. To do so, one can approach it as an issue of synapses. A synapse is a junction, an almost imperceptible gap through which an impulse of intensity passes. Beyond the modal temptations of placing it in space or time, the synaptic moment (or the synaptic location) is nothing but pure action and, therefore, pure relationality: both a material object and a figure of thought, the complementarity of an actual brain and a virtual mind.43 As such, synapses manage to capture both the passage of an intensity (as a synaptic moment) and the formation of an extensity (as a synaptic location). We would

therefore be correct to describe them as electric thought.44 As philosopher Félix Guattari writes: a-signifying synapses, which are simultaneously irreversibilizing, singularizing, heterogenesizing and necessitating, push us from the world of memories of redundancies embedded in extrinsic coordinates, into Universes of pure intensive iteration, which have no discursive memory since their very existence acts as such.45

In other words, synapses can be understood as constraints: they delimit the field of the possible while reinforcing the virtual.46 To understand this, we can follow Guattari in the manner in which he connects the function of the synapses with speed. Guattari claims that synapses not only bring together the Chronic—as the time of lived experience—and the Aeonic—as the time of pre-individual potentials—but they also formulate a bridge that connects molar extensities with molecular intensities.47 It is therefore a matter of a disparate relation between the finite speed of the molar and the infinite speed of the molecular, and how through a synapse the two are bound together, or, true to the Latin etymology of the term, how the two are constrained. As such, synapses are essentially constraints that act as intensity regulators. They determine how much, how fast, and how intense a play of limits can be sustained before crossing the threshold that demands a new differential relation between matter and energy. Therefore, bearing in mind our previous definition of information, synapses can literally be understood as informational constraints. As Simondon explains: the regime of information is what defines the degree of individuality; in order to appreciate it, we must establish a rapport between the propagation speed of information and the duration of the act or event to which information is relative.48

In the synaptic location the speed of information is determined, while in the synaptic moment the duration of its intensive passage is regulated. In the relation between the two that any technicity catalyses, architecture turns into something much more significant than the simple construction of space: it becomes a synapse in its own right. It allows both for the formation of an extensive space—to be lived, experienced, destroyed, praised, and condemned—and for the very possibility of intuiting a space yet to come, and, consequently, a subject yet to individuate, precisely because architectural technicities allow for a certain degree of indeterminacy. As Simondon writes: the true progressive perfecting of machines, whereby we could say a machine’s degree of technicity is raised, corresponds not to an increase of automatism, but

on the contrary to the fact that the operation of a machine harbours a certain margin of indeterminacy. It is this margin that allows the machine to be sensitive to outside information.49

Therefore, architecture transforms information into forms by allowing its technicities to affectively open up to the indeterminacy of a differential influx. This influx of differences is nothing but an influx of intensities; it is the gathering of memorial traits of earlier states of existence. Within architectural technicities one can locate a dynamism, especially regarding the capacity of architecture to invent anything novel. This dynamism entails the reticular synaptic relation between an actual architectural technicity and a virtual architectural product: between the limited number of cosmic tiles and the unlimited ways of placing them next to each other, between Ananke’s spindle and her seductive swaying. Paraphrasing Simondon, for an architect to invent is to make one’s thought function as architecture might function, not according to causality, which is too fragmentary, but ‘according to the dynamism of lived functioning, grasped because it is produced, accompanied in its genesis.’50 It is on the basis of this lived functioning that Simondon will define invention. For him, invention will appear as the discovery of a way to restore the continuity of action.51 Take two simple examples: an organist that needs to both play the instrument and turn the score’s pages and a rockfall that blocks one’s journey. The first entails an intrinsic incompatibility, and the latter an extrinsic incompatibility.52 In both cases, the incompatibility is resolved by the invention of a technicity that acts as synapse in constraining formerly distinct sets of actions and binding them together into a novel, continuous dimension. 
Finger technique and hydraulic winches are both expressions of a synaptic passage that introduces a novel constraint which provokes a qualitative change in an operative system, restoring the compatibility between sensory-motor subsets of action as well as between action and the environment.53 Consequently, Simondon claims that: invention is the appearance of the extrinsic compatibility between the milieu and the organism and of the intrinsic compatibility between the subsets of action. Detour, instrument crafting, collective association are different ways to restore the intrinsic and extrinsic compatibility. . . . Solutions appear as continuity restitutions allowing the progressivity of operative modes, according to a progression previously invisible in the structure of a given reality.54

In the schism between lines of action, a virtual and pre-individual pool of potentials is expressed, making invention a matter of degrees of openness to it. After all, what does that schism consist of but a disparation between

norms and values that invention attempts to resolve? In the moment that an established action (a norm) encounters an obstacle that disrupts its dynamic continuity, the intention of a desire (a value) that wishes to overcome it emerges. As a matter of fact, more than a wish, it is an issue of a demanding necessity, the Spinozian conatus of restoring the active entanglement of the cosmic fibres and furthering individuation according to its own immanent potentials; our primordial goddess reigns in full force. Therefore, one needs to be affectively sensitive to the indeterminacy that any discontinuity of action implies, since there is simply never one and only one solution; there is never one and only one manner to constrain action back into its dynamic and differential flow. As such, to be sensitive to indeterminacy means to be able first of all to localise it. Simondon is explicit about this: to receive information (and therefore be susceptible to change) one needs to be able to localise its indeterminacy.55 Synapses are crucial in this, not only because they allow for a memory of the past to pass through, but also because they catalyse transduction: the informational exchange of the intensive with the intensive, of a synapse with another synapse. Consequently, a synaptic constraint belongs: neither to the domain of potential energy nor to the domain of actual energy; it is truly the mediator between these two domains, but it is neither a domain of the accumulation of energy, nor a domain of actualisation: it is a margin of indeterminacy between these two domains, that which brings potential energy to its actualisation. It is during the course of this passage from potential to actual that information comes into play; information is the condition of actualisation.56

Following Ananke in her indeterminate whorls, the future is allowed to inform the present: a virtual affair of states informing an actual state of affairs. Through synaptic constraints, the certainty of the one is exchanged for the uncertainty of the other, without assigning primacy to either of them. It is perhaps time to come full circle and think again of our dying star. Is the heat death of the Sun inevitable? If we follow Ananke’s sways, then we might hesitate to respond. Our best answer would be that the death of the Sun is almost inevitable. In this almost certainty lies the very reason—and the motivation—of making it through yet another day, of constraining the cosmic fibres differently, of intensifying the synaptic passages of a virtual memory in a different manner, of always being one technicity away. After all, every day we rotate around this dying star with the help of small, actual synapses, almost insignificant to the eye. Bizarre as it sounds, of everything else we know in the universe—from our Sun to an aeroplane, from our brains to our cars—it is a computer chip that conducts the most energy through a gram of matter per second.57 The narrower the synaptic passage, the denser

Ananke’s Sway


the folds between constraints, the greater the differential between matter and energy; consequently, the more potential for a difference to make a difference, for an informational rearrangement of the cosmos. If we trust Ananke, then we can only keep on moving our steps, one after the other, building upon themselves in the cosmic forest. In other words, there can never be a certain past and an uncertain future (nor the other way around), but rather a constant synaptic exchange between indeterminate constraints that belong to the present of activity itself. The memory of that ongoing activity cuts both ways. It plunges toward the past, questioning any norm; simultaneously, it takes a leap to the future, enunciating values that will literally change the cosmos. Thankfully, how we bring them together will always be up to us, escaping any alethic burden besides the truth of our negentropic determination.

NOTES

1. Norbert Wiener, Cybernetics: Or Control and Communication in the Animal and the Machine (Cambridge, MA: The MIT Press, 2019).
2. See also Radman, ‘Allagmatics of Architecture: From Generic Structures to Genetic Operations (and Back),’ and, with another perspective on Simondon, information, and Ananke, Woodward, ‘Information and Alterity: From Probability to Plasticity,’ in this volume.
3. Patricia Reed, ‘The Valuation of Necessity,’ in Block Chains and Cultural Padlocks, edited by Jesse McKee (Vancouver: 221A, 2021), 124.
4. Ibid; emphasis in original.
5. Ibid.
6. Ibid, 132.
7. Joel White, ‘Analogy of a Dying Star: Entropic Formness,’ Pli 32. See also White, ‘Outline to an Architectonics of Thermodynamics: Life’s Entropic Indeterminacy’ in this volume.
8. Ibid, 94.
9. Ibid, 112; emphasis in original.
10. Plato, Republic, translated by Allan Bloom (Philadelphia, PA: Basic Books, 1968), 299–300.
11. Raymond Ruyer, Neofinalism, translated by Alyosha Edlebi (Minneapolis: University of Minnesota Press, 2016), 142.
12. Ibid, 149.
13. Ibid, 152.
14. Terrence Deacon, Incomplete Nature: How Mind Emerged from Matter (New York: W.W. Norton & Co, 2012), 192.
15. Ibid, 193.
16. Ibid, 195.
17. Ibid, 198.


18. Gilbert Simondon, On the Mode of Existence of Technical Objects, translated by Cecile Malaspina and John Rogove (Minneapolis: Univocal, 2017), 162.
19. Stavros Kousoulas, Architectural Technicities: A Foray into Larval Space (London: Routledge, 2022).
20. Peter N. Kugler and Robert E. Shaw, ‘Symmetry and Symmetry-Breaking in Thermodynamic and Epistemic Engines: A Coupling of First and Second Laws,’ in Synergetics of Cognition, edited by Hermann Haken and Michael Stadler (Berlin: Springer, 1990), 324.
21. Ibid, 325.
22. Ibid.
23. Gilbert Simondon, Individuation in Light of Notions of Forms and Information, translated by Taylor Adkins (Minneapolis: University of Minnesota Press, 2020), 12.
24. Gregory Bateson, Steps to an Ecology of Mind (New York: Random House, 1972), 453.
25. Steven Shaviro, Discognition (London: Repeater, 2016), 197.
26. Simondon, Individuation, 11; emphasis in original.
27. Ibid, 16.
28. Ibid.
29. Ibid, 237.
30. Ibid; emphasis in original.
31. Deacon, Incomplete Nature, 392.
32. Gilles Deleuze and Félix Guattari, What Is Philosophy?, translated by Hugh Tomlinson and Graham Burchell (New York: Columbia University Press, 1994), 186.
33. Simondon, Individuation, 375.
34. Ibid, 377.
35. David Scott, ‘How Do We Recognise Deleuze and Simondon Are Spinozists?,’ Deleuze Studies, 11, no. 4 (2017): 569.
36. Ibid.
37. Simondon, Individuation, 377.
38. Scott, ‘Spinozists,’ 571.
39. Ibid.
40. Ibid.
41. Gilbert Simondon, Sur la Technique: 1953–1983 (Paris: Presses Universitaires de France, 2014), cited in Scott, ‘Spinozists,’ 571.
42. Simondon, Individuation, 287.
43. Hanjo Berressem, ‘Degrees of Freedom: Félix Guattari’s Schizoanalytic Cartographies,’ in Schizoanalysis and Ecosophy, edited by Constantin V. Boundas (London: Bloomsbury, 2017), 142.
44. Ibid, 142.
45. Félix Guattari, Schizoanalytic Cartographies, translated by Andrew Goffey (London: Bloomsbury, 2013), 178.
46. Ibid, 165.
47. Ibid, 177.
48. Ibid, 211.
49. Simondon, On the Mode of Existence, 17.


50. Ibid, 151.
51. Emilien Dereclenne, ‘Simondon and Enaction: The Articulation of Life, Subjectivity, and Technics,’ Adaptive Behaviour, 29, no. 5 (2021): 453.
52. Ibid.
53. Ibid.
54. Gilbert Simondon, Imagination et invention (Chatou, France: La Transparence, 2008), 139, cited in Dereclenne, ‘Simondon and Enaction,’ 453.
55. Simondon, On the Mode of Existence, 153.
56. Ibid, 155.
57. Kevin Kelly, What Technology Wants (New York: Viking, 2010), 59.

PART III

Epistemic Technologies


Chapter 11

Outline to an Architectonics of Thermodynamics

Life’s Entropic Indeterminacy

Joel White

The current urgency of ecological matters has initiated a new philosophical concern with the relation between life and thermodynamics (the science of energy and entropy). French philosopher Bernard Stiegler has consequently proposed that anthropogenic climate change—the Anthropocene—be renamed the ‘Entropocene.’1 His justification for offering another kenos to the already long list of alternatives is that anthropogenic climate change is largely conditioned by an acceleration in the rate of entropy production—thermodynamically definable as the unidirectional (from hot to cold) transfer of heat between systems (classical thermodynamics) or the probabilistic distribution of particle energy (statistical mechanics). In support of Stiegler’s proposition, theoretical biologist Maël Montévil argues that this acceleration of entropy is being produced at multiple levels, from the thermodynamic to the ‘biological and the social.’2 Stored energy in the form of fossil fuels continues to be dissipated at an industrial rate, biodiversity and ‘anti-entropy’3 (what Montévil, after Giuseppe Longo and Francis Bailly, defines as the functional complexity that contributes to an organism’s ‘persistence’ through time) are declining faster than expected, and political alternatives to capitalism, ones that might offer actual socio-ecological solutions, have all but disappeared. Following Stiegler, this chapter argues that a philosophy of entropy is needed. It is needed not just to philosophically respond to the Entropocene but because, as Isabelle Stengers and Ilya Prigogine argue in Order out of Chaos, for the most part, philosophy remains stuck in an eighteenth-century Newtonian framework where the irreversibility and absolute finitude that


entropy entails are still considered impossible—a naïve entropic optimism that still stubbornly persists in philosophy, economics, and science alike.4 Because all entropic phenomena in the world—from coffee pots to complex ecosystems—adhere to what Rudolf Clausius, the originator of the concept of entropy, calls a ‘fundamental law of the universe,’5 or as one might prefer: a cosmological epistemic certainty (epistemic certainties are ‘certain in so far as we know’), an architectonic method is appropriate to the task of formulating this philosophy. As Immanuel Kant writes in the ‘Transcendental Doctrine of Method’ to his Critique of Pure Reason, an architectonic system is one where there is a ‘unity of the manifold of cognitions under one idea.’6 The Idea shall be that of entropic ‘heat death’ and the manifold of cognitions shall be local entropic phenomena. Since historically the cosmological Idea of entropy has been termed ‘heat death,’ I shall retain this term—the history of its development shall be explicated in the first part of this chapter, especially in its relation to animate living-beings and the determinate end of life. The concept of entropy is understood as that which governs entropic beings—it pertains to possible experiences—and the Idea of heat death is that which regulates this experience—it pertains to the absolute totality of entropy. While the full development of this philosophy of entropy is beyond the scope of what I will outline here, I shall nonetheless sketch out how the concept of entropic living-beings at a local level is related to the Idea of heat death at a general level, and what this means for living-beings in the present and for the future of life.
As Romanian economist Nicholas Georgescu-Roegen writes in the afterword to Jeremy Rifkin and Ted Howard’s Entropy: A New World View: ‘[t]he Entropy Law in its extensive form sets material limits to the specific mode of life of the human species, limits that tie together present and future generations in an adventure without parallel in our knowledge.’7 In constructing a critical philosophy of entropy, it is precisely this ‘tie’ between the future and the present that is important, especially if a practical philosophy is to be built on the back of a theoretical one. In short, and as I hope to initially outline here, while the Idea of heat death speculatively defines the determinate limit for the future of life, the intermediate future of life is, as Henri Bergson argues in his 1907 L’Evolution créatrice, nonetheless, ‘indeterminable.’8

THE IDEA OF HEAT DEATH AND THE END OF LIFE

The history of the relation between the study of life and thermodynamics is as old as the development of thermodynamics itself. The thermodynamics of life should, therefore, not just be seen as an applied theory, one where the concepts of energy and entropy are applied analogically to living systems.


Prior to the founding text of thermodynamics, Sadi Carnot’s 1824 Réflexions sur la puissance motrice du feu et sur les machines propres à développer cette puissance, English chemist Adair Crawford had already experimentally demonstrated two important thermodynamic vital realities in his 1777 Experiments and Observations on Animal Heat: (1) the respiratory system of animals is a form of combustion and (2) without a constant supply of nutrition, the temperature of bodies and their quantities of heat tended to ‘diffuse’ toward equilibrium.9 Relating combustion, respiration, and heat dissipation, Crawford also recognised that without some form of external reanimating efficient cause, all bodies of temperature would finally ‘remain in a state of rest,’ signalling the end of life.10 Crawford’s ‘state of rest’ can be read as one of the earliest formulations of what by the end of the nineteenth century came to be known as the ‘heat death of the universe’—the moment in the universe’s timeline where, because entropy is at a maximum, life and all energetic systems that require energy differentials to operate cease to be possible—as William Rankine describes it, it will entail ‘the end of all physical phenomena.’11 William Thomson, in his 1852 ‘On a Universal Tendency in Nature to the Dissipation of Energy,’ is often credited with having first outlined heat death. 
Being a devout Presbyterian, Thomson starts his paper by stating that by ‘Creative Power’ alone, which is to say, only by external metaphysical means, can ‘mechanical energy’ be called into ‘existence’ or ‘annihilated.’12 Although steeped in religious overtones, this describes something like the first law of thermodynamics—that, as Clausius will define it: ‘The energy of the universe is constant.’13 The importance of outlining that the quantity of the energy in the universe cannot be created or destroyed derives from the fact that ‘Carnot’s proposition’ regarding the ‘waste of mechanical energy’ is not a proposition concerned with absolute waste (as if energy were lost or ceased to exist when used), but one that is concerned with the transformation of the quality of energy ‘available to man.’14 It is, therefore, the availability of a store of energy from which something may occur that is at stake as well as the reversibility of restoring said store to its initial qualitative state. As Thomson repeats throughout ‘On a Universal Tendency,’ since mechanical effects are possible only due to the unidirectional passing of energy from a warmer to a colder body—an irreversible process that is itself ‘dissipation’—and because this process diminishes the future capacity for dissipation to occur, ‘perfect restoration is impossible.’15 Dissipation is, therefore, both the condition of possibility and impossibility of mechanical effect.16 It is because, as Thomson writes, ‘there is at present in the material world a universal tendency to the dissipation of mechanical energy’ that ‘without more than an equivalent of dissipation’ absolute restoration by either an inanimate or animate process


is ‘impossible.’17 It is because restoration is impossible without concomitant dissipation, that one cannot perpetually refill the store house, that ‘within a finite period of time’ a period where the earth will be ‘unfit for habitation’ will come to pass.18 What is implicit in Thomson’s notion of the earth becoming ‘unfit for habitation’ is that there is a concomitant relation or tension between energetic processes in the present and the future possibility of those processes. In the same way that dissipation is the condition of possibility and impossibility of mechanical effect, inhabiting or living in the present forecloses eternal inhabiting or living in the future. It is with a similar sentiment to Thomson that Hermann von Helmholtz begins his ‘The Interaction of Natural Forces,’ the second of the two founding texts of the Idea of heat death. Like Thomson, Helmholtz opens his lecture by foreclosing the possibility of eternal cyclical processes by using the example of perpetual motion machines, the search for which he describes as ‘fable-rich’ and comparable to the alchemical search for the ‘philosopher’s stone of the seventeenth and eighteenth centuries’19—a reference that is likely an allusion to Leonardo da Vinci: ‘Oh ye seekers after perpetual motion, how many vain chimeras have you pursued? Go and take your place with the alchemists.’20 The reference to the philosopher’s stone—that which would alchemically grant immortality, among other promises—is not without its significance vis-à-vis Helmholtz’s understanding of the thermodynamics of life.
In quite explicit disagreement with the notion that the ‘quintessence of organic life’ is to be found in the idea that living-beings move ‘themselves energetically and incessantly as long as they [live],’ Helmholtz argues that the search for perpetual motion in living processes is futile.21 Mockingly, he writes: ‘A connexion between the supply of nourishment and the development of force did not make itself apparent.’22 For Helmholtz:

the continuation of life is dependent on the consumption of nutritive materials: these are combustible substances, which, after digestion and being passed into the blood, actually undergo a slow combustion, and finally enter into almost the same combinations with the oxygen of the atmosphere that are produced in an open fire.23

If the continuation of life is dependent on the presence of combustible substances, then it follows that in their absence the endurance of life comes to a halt. Like Thomson, it is also through an appeal to ‘Carnot’s principle,’ which he defines as: ‘Only when heat passes from a warmer to a colder body, and even then, only partially, can it be converted into mechanical work,’24 that perpetual motion machines of both the first kind (machines that self-perpetuate or produce more energy than is put in) and the third kind (machines that


function without dissipation) are said to be futile. Significant to the construction of an architectonic of thermodynamics, Helmholtz, who himself was a neo-Kantian, argues that the genius of Carnot was to have inverted the dogmatic question of ‘How can I make use of the known and unknown relation of the natural forces so as to construct a perpetual motion machine’ to the critical question ‘If a perpetual motion be impossible, what are the relations which must subsist between natural forces?’25 The inversion of this question, and the fact that its point of departure is related to the epistemic impossibility of perpetual motion, is significant. It functions as a ‘discipline’ in the Kantian sense of the term, shifting the argument from the local impossibility of actually existing perpetual motion machines to the general implications of this impossibility. Stated differently: entropy (though perhaps more precisely the Idea of heat death) becomes the limit principle which restrains the construction of an architectonic that might want to make dialectical (illusionary) judgments about the eternality of life, thought, and action. It is with the impossibility of perpetual motion machines in mind that Helmholtz then begins his speculative description of heat death. The entirety of his description is worth citing since it is precisely this paragraph that is cited by Friedrich Lange in his 1866 History of Materialism (the first philosophical text concerned with the concept of entropy and the Idea of heat death)26 and subsequently by Friedrich Nietzsche in his 1870s lectures on Heraclitus27:

If the universe be delivered over to the undisturbed action of its physical processes, all force will finally pass into the form of heat, and all heat come into a state of equilibrium. Then all possibility of further change would be at an end, and the complete cessation of all natural processes must set in.
The life of men, animals, and plants could not of course continue if the sun had lost its high temperature, and with it [its] light,—if all the components of the earth’s surface had closed those combinations which their affinities demand. In short, the universe from that time forward would be condemned to a state of eternal rest.28

More radical than Thomson’s description of uninhabitability, Helmholtz’s description of heat death is one whereby the possibility of change or transformation ends; that is, heat death is the moment in the universe’s history where the complete cessation of all natural processes must set in—a moment from which there is to be no cyclical rebirth or redemption but ‘eternal death.’29 Significantly, Helmholtz’s Idea of heat death does not appeal, as Thomson does in his 1862 ‘On the Age of the Sun’s Heat,’ to the notion that ‘it is impossible to conceive either the beginning or the continuance of life, without an overruling creative power,’ a Creative Power that will, when judgment day comes, save us from the ‘dispiriting views’ of ‘universal rest and death.’30


Helmholtz’s Idea of heat death is one which we must endure since it constitutes the strange guiding telos of the universe, present at all times and in all processes. Indeed, it is hard to read Helmholtz’s declaration that the continuation of life is contingent on combustion without likewise understanding this to mean that living-beings contribute to their own downfall by exhausting their own conditions of possibility. In Being and Time, Martin Heidegger argues that individuals must learn to accept their own finitude through an authentic relation to the certainty of their death, which he terms ‘being-towards-death.’31 Helmholtz offers a similar thought at the end of his lecture, when he writes: ‘As each of us singly must endure the thought of his death, the race must endure the same.’32 So radical is the Idea of heat death that while it shall never be for us, it is always with us. No matter the energetic system, whether that be reheating food on a stove or supplying electrical energy for a town, the Idea of heat death governs the possibility of their endurance.

THE CONCEPT OF ENTROPY, DETERMINATION, AND INDETERMINISM

The future, speculatively speaking, is determinate for life. Indeed, it is so for any energetic system that requires not-yet-dissipated energy to persist through time, from everyday technologies to the most complex of machines. While I do not wish to grant heat death the status of a cosmological necessity—its architectonic status as an Idea prohibits such an entitlement—it is worth saying that heat death remains the most probable of cosmological outcomes given what we currently know about the universe; it is an epistemic certainty. This certainty is due to the fact that the universe is observably ‘open’; its expansion is both infinite and accelerating. 2011 Nobel Prize winners Saul Perlmutter, Brian P. Schmidt, and Adam G.
Riess demonstrated, through observing distant supernovae, that the universe’s expansion was accelerating too fast for energy to be reconcentrated via gravitational force into a new singularity. This cyclical hypothesis of the universe, often termed the ‘Big Crunch,’ hence offers little hope for those desiring eternal life.33 In many ways, I have already drawn out the main features of the structural logic of the concept of entropy insofar as it governs entropic phenomena and beings. For instance, Thomson demonstrated how dissipation is the condition of possibility and impossibility of mechanical effect, and Helmholtz viewed ‘Carnot’s principle’ as governing the possibility and impossibility of ‘change,’ theorising that living-beings contribute to their own passing-away by exhausting their own nutritive conditions of possibility. It is worth, nonetheless, explicitly synthesising these points into a working definition so that


it may guide our inquiry into and judgment of living-beings. Before moving to this working definition, it is also worth noting that the concept of entropy at-work in this chapter is thermodynamic; it pertains to the unidirectional and irreversible dissipation of energy differentials and not information manipulation.34 Furthermore, and importantly, entropy is not a substance or substantial. It is a property of a system. While it can be quantified in physics, it is a quantity that defines the relation between bodies and the quality of the energy of those bodies—whether that energy is (probabilistically) dissipated or not. The necessity of defining entropy conceptually, transducing entropy from the realm of science into philosophy, is that one does away with the need for the mathematisation of entropy as the only criterion of predication. The absence of the capacity to easily calculate entropy for certain ‘open’ systems such as living-beings has given rise to the criticism that entropy cannot be predicated of these systems. While granting entropy a calculable quantity is important for the realm of science, philosophy, among other tasks, seeks to offer conceptual schemata that may facilitate predication, not mathematical quantification.

Concept of Entropy:

1. Entropy is the dissipative condition of possibility and impossibility of any energetic system.
2. A phenomenon or an energetic system can be predicated entropic insofar as it exhausts its own dissipative condition of possibility.

This working definition requires some explanation. I take an ‘energetic system’ to be any meta-stable/homeostatic system that consists of at least two energetically related components that are not at thermal equilibrium in relation to each other or at their lowest energy level and that requires an energetic influx from outside the system to endure through time. This endurance is conditioned by a process of entropic displacement.
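The working definition above tracks the standard thermodynamic formalism, which can be sketched as follows (textbook notation supplied for clarity, not the author’s own):

```latex
% Clausius's definition: entropy change for a reversible exchange of heat
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
% Second law for a system together with its surroundings:
\Delta S_{\mathrm{sys}} + \Delta S_{\mathrm{surr}} \;\geq\; 0
% Entropic displacement: a local decrease is possible only if compensated globally,
% i.e. \Delta S_{\mathrm{sys}} < 0 \text{ only if } \Delta S_{\mathrm{surr}} \geq \lvert \Delta S_{\mathrm{sys}} \rvert .
```

The last inequality is the formal shape of the compensation principle Boltzmann invokes: local order is bought by exporting at least as much entropy elsewhere.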
Entropic displacement enables any local energetic system to endure through time by increasing the global entropy of that which is outside of this system. As Boltzmann writes, ‘entropy can diminish only if in return some other system gains the same or a greater amount of it.’35 The most abstracted example of entropic displacement is Carnot’s heat engine. The engine’s endurance is conditioned by the maintenance of the energy difference between a hot body and a cold reservoir.36 To maintain this energy difference, one is required to ‘feed’ the engine from the global store house of not-yet-dissipated energy—often in the form of chemical energy—to maintain the temperature of the hot body. The entropy of the local system is, therefore, kept from increasing by displacing it to its surroundings. Because the process of ‘feeding’ the engine irreversibly transforms not-yet-dissipated energy into already-dissipated energy, the global


store house of not-yet-dissipated energy is depleted or exhausted through the very same process of maintenance. This renders the process finite and explains why all energetic systems, if they are to endure, exhaust their own conditions of possibility. While this chapter is not directly concerned with the Idea or the concept of energy, it is worth noting, as indicated earlier, that energy (matter is here equivalent due to special relativity) is either probabilistically already-dissipated or not-yet-dissipated (I shall also develop this formal difference later). Furthermore, because energy is constant and indestructible, when absolutised as an Idea, it can be understood as a cosmological substratum. The Idea of the conservation of energy/matter is, therefore, what Aristotle, in On Coming-to-be and Passing Away, would term a hupokeimenon—the Idea of energy/matter governs the realm of Being—while the quality of the energy changes, the quantity of energy remains the same.37 Energy governs Sameness through Difference. The concept of entropy, however, since it governs the unidirectional and irreversible dissipative movement of energy from the not-yet-dissipated to the already-dissipated, governs the realm of Becoming and Difference. The concept of entropy, which is to say, precisely ‘Entropie,’ as it was first coined in 1865 by Clausius, was chosen to describe the ‘transformational content’ of a given system: to what extent a closed system had transformed energy from the not-yet-dissipated to the already-dissipated—whether the system was at thermal equilibrium or not.38 To use Aristotelian language, again, the concept of entropy concomitantly governs geneseos (coming-to-be), alloiôsis (alteration), as well as phthoras (passing-away).
The latter (phthoras) is conditioned by the former processes (geneseos) and (alloiôsis): coming-to-be and alteration entropically exhaust their own condition of possibility; they engender their own passing-away.39 Having now laid out the concept of entropy, it is opportune to define what I mean by determinate and indeterminate, so that using the concept of entropy, living-beings (locally defined) and life (generally defined) can be predicated as either entropically determinate or indeterminate. Notably, I take determination to mean not apodictic necessity but epistemic certainty (see previous discussion). As such, the concept of determination (or for something to be determinate) means:

S is (certainly) P; and S can only be P (so far as we know).

The first proposition is theoretically determinate (determination) and the second is practically determinate (there is no degree of practical freedom).40 Furthermore, I likewise do not take indetermination to be assertoric, which would be:

S is either truly P or not truly P; and S can only be truly P or not P.


The concept of indetermination is problematic or subjunctive in nature. For something to be indeterminate means that:

S could be P but could be (or become) Pn.

Theoretically there is a margin of ‘uncertainty,’ as Prigogine terms it, and practically there is a ‘margin of indetermination’ (a property of a system that Gilbert Simondon uses to distinguish automata from reflective evolutionary systems—technical, or otherwise—capable of feedback).41 These working definitions of the concept of entropy, determination, and indetermination shall now guide an enquiry into living-beings. Since living-beings are energetic (they consume and endure through the dissipation of not-yet-dissipated energy), the task will be to judge to what extent living-beings can be predicated ‘entropic’ and whether this is determinate or indeterminate. Ludwig Boltzmann’s 1886 lecture ‘The Second Law of Thermodynamics’ together with more contemporary work on the thermodynamics of life shall be explored to this end. What Boltzmann’s lecture indicates is that living-beings can be determined as synchronically entropic—living-beings at a local level exhaust their condition of possibility since, as Boltzmann writes, entropy is their condition of existence—but life in general is diachronically indeterminate insofar as there is a degree of evolutionary freedom relating to how life is entropic. Lastly, it is important to recall that both life and living-beings remain governed by the Idea of heat death as the disciplinary principle of the architectonic. While life is indeterminate in so far as it is entropic, this indetermination is folded within the general certainty of life’s entropic finitude.
THE ECOLOGICAL WEB OF ENTROPIC DISPLACEMENT

Although Boltzmann’s ‘atomic view of the world’ is often cited as offering a probabilistic conceptualisation of entropy that might save us from the irreversible steady degradation of energy, his 1886 lecture ‘The Second Law of Thermodynamics’ offers a more entropically pessimistic worldview: ‘All attempts at saving the universe from this heat death [Wärmetode] have been unsuccessful, and to avoid raising hopes I cannot fulfil, let me say at once that I too shall here refrain from making such attempts.’42 Having ruled out the possibility of ‘saving the universe’ from heat death, Boltzmann, with Charles Darwin’s notion of the ‘struggle for existence’ in mind (he had previously


argued that the nineteenth century would be crowned as the ‘century of Darwin’), offers the following theory regarding life’s relation to entropy:

The general struggle for existence of animate beings is therefore not a struggle for raw materials—these, for organisms, are air, water and soil, all abundantly available—nor for energy which exists in plenty in any body in the form of heat (albeit unfortunately not transformable), but a struggle for entropy, which becomes available through the transition of energy from the hot sun to the cold earth. In order to exploit this transition as much as possible, plants spread their immense surface of leaves and force the sun’s energy, before it falls to the earth’s temperature, to perform in ways as yet unexplored certain chemical syntheses of which no one in our laboratories has so far the least idea. The products of this chemical kitchen constitute the object of struggle of the animal world.43
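Boltzmann’s ‘transition of energy from the hot sun to the cold earth’ can be given a rough quantitative gloss (the temperatures below are standard textbook estimates, not figures from Boltzmann’s lecture): each joule of solar energy arrives as high-temperature radiation and is eventually re-radiated by the earth at a much lower temperature, multiplying its entropy along the way.

```latex
% Entropy produced per joule flowing from sun to earth,
% received at roughly T_sun (effective solar surface temperature)
% and re-radiated at roughly T_earth (mean radiative temperature):
\sigma \;=\; \frac{1}{T_{\mathrm{earth}}} - \frac{1}{T_{\mathrm{sun}}}
\;\approx\; \frac{1}{255\,\mathrm{K}} - \frac{1}{5778\,\mathrm{K}}
\;\approx\; 3.7 \times 10^{-3}\ \mathrm{J\,K^{-1}}\ \text{per joule}
```

The low entropy of incoming sunlight relative to terrestrial heat is precisely the differential that, on Boltzmann’s account, plants compete to exploit.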

Here, the ‘general struggle for existence of animate beings’ refers to both a synchronic notion of actually living-beings and a diachronic evolutionary theory of life. In the broadest sense, it means the struggle for the ‘physical conditions of life,’ as Darwin defines it in his On the Origin of Species. The struggle for life’s existence refers thus to the survival of progeny as well as to the survival of individual living-beings in so far as they are related to other living-beings for their nutrition: ‘Nearly all [living-beings] either prey on or serve as prey for others; in short, that each organic being is either directly or indirectly related in the most important manner to other organic beings . . . on which it depends, or by which it is destroyed, or with which it comes into competition.’44 Boltzmann’s statement that the ‘struggle for existence’ is a ‘struggle for entropy’ implies, therefore, that entropy is the nutritive physical condition both for living-beings and for the evolution of living-beings. Entropy is, therefore, both the process through which one living-being is ‘destroyed’ as ‘prey’ for the sake of the survival of the other as well as the survival of this living-being’s progeny. Boltzmann is careful not to write that this struggle is ‘for energy.’ If one were to use energy in its general conceptual sense, then one would mean both not-yet-dissipated and already-dissipated energy—the latter of which although ‘abundantly available’ as Boltzmann writes, and increasingly so in the form of ‘heat,’ cannot be transformed through dissipation into living-beings.45 What Boltzmann terms ‘entropy’ is, therefore, more precisely the capacity for entropy production. This capacity has several other terms. Erwin Schrödinger in his 1944 What is Life? 
The Physical Aspect of the Living Cell defines this capacity as ‘negative entropy’: ‘Thus a living organism continually increases its entropy—or, as you may say, produces positive entropy— and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e., alive, by continually drawing from its

Outline to an Architectonics of Thermodynamics


environment negative entropy.’46 Schrödinger will admit in a later ‘Note’ that what he meant by negative entropy (or ‘negentropy’ as it comes to be known) is something akin to Gibbs’s ‘free energy.’47 Gibbs’s free energy is mathematically measured as the change in the quantity of enthalpy (the sum of the system’s internal energy and the product of its pressure and volume) minus the temperature of the system multiplied by the change in entropy of the system. It gives a measurement of the energy available to do work: the capacity to do work or capacity for entropy available in the system. For living-beings, this is the capacity to maintain the homeostasis of the living-being. And since living-beings are dissipative open-systems, this capacity for entropy equates to the nutrition of their environment. As Prigogine and Grégoire Nicolis write in their 1977 Self-Organisation in Nonequilibrium Systems: ‘From the simplest bacterial cell to man, maintenance of life requires a continuous exchange of energy and matter with the surrounding world.’48 There are many other thermodynamic terms for this capacity for entropy (some more conceptually confusing than others). These include ‘essergy,’ ‘utilizable energy,’ ‘available useful work,’ ‘maximum (or minimum) work,’ ‘maximum (or minimum) work content,’ and ‘exergy.’ All signify more or less: not-yet-dissipated available energy capable of work. As I have written elsewhere, exergy, coined by Zoran Rant in 1956, shall be the chosen term for this capacity.49 I take it to be the least confusing and closest to what is mathematically as well as philosophically at stake.
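In symbols (standard thermodynamic notation, supplied here for clarity rather than drawn from the chapter), the verbal definition above reads:

```latex
% Enthalpy: the system's internal energy plus the pressure-volume product.
H = U + pV
% Gibbs free energy change: change in enthalpy minus the system temperature
% times the change in entropy. -\Delta G bounds the non-expansion work
% obtainable from the system: the capacity for work still available.
\Delta G = \Delta H - T\,\Delta S
```

A negative ΔG thus marks energy still available for work: in the chapter's vocabulary, exergy not yet dissipated.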
Again, while this chapter is not concerned with the concept of energy per se, it is worth mentioning that the opposite of exergy is termed ‘anergy.’ Although this is now a rather redundant thermodynamic term (since mathematically one can quantify entropy), its conceptual import for the architectonics of thermodynamics is that it offers a term for already-dissipated energy. As Josef Honerkamp writes, connecting the two terms: ‘The maximum fraction of an energy [that] . . . can be transformed into work is called exergy. The remaining part is called anergy, and this corresponds to the waste heat.’50 For Boltzmann, then, the exergy which living-beings both synchronically and diachronically struggle for has one principal source: the ‘chemical syntheses’ that derive from the chlorophyllic capture of solar energy.51 Although Boltzmann was ignorant of the photosynthetic mechanisms at work, he was correct to argue that the ‘products of this chemical kitchen constitute the object of struggle of the animal world.’ Indeed, the vast majority of nutritive exergy (biomass) available for living-beings (‘≈80%’)52 is produced by photosynthesizing plants, whose primary position on the trophic pyramid can be regarded as the living sine qua non of the entire ecological web of production and consumption.53 Plants synchronically condition a web of entropic displacement, where organisms higher on the trophic pyramid displace their increase in local positive entropy onto organisms lower down the pyramid,


Chapter 11

avoiding ‘maximum entropy qua death.’ As American biophysicist Alfred J. Lotka writes in his seminal 1925 Elements of Physical Biology (a text inspired by Boltzmann): ‘The animal consuming the grass and deriving from its oxidation the requisite energy for further activity has not initiated any revolutionary process, but has merely helped nature in its course, has merely rolled the ball downhill . . . [to levels of lower energy availability (higher entropy)].’54 Thus, if one were to take into consideration the animal kingdom alone, from herbivores to apex predators, it would be possible to predicate them as synchronically entropic. Both anaerobic and aerobic respiring living-beings cannot capture (whether actively through hunting or passively through scavenging) and digest their food twice. Entropy, qua the irreversible destruction of the exergy necessary for their survival, is, therefore, their condition of possibility and impossibility—they exhaust the trophic level below their own. Life, however, is not restricted to the animal kingdom. Furthermore, since photosynthesizing plants ground the ecological web of entropic displacement, it is pertinent (perhaps more so than with respiring animals) to determine the entropic status of plants. While plants do ‘consume’ the sun’s photons (they cannot exploit photons twice), the consumptive relation that plants have to the sun is not the same relation as respiring living-beings have with their nutritive conditions. Whereas an increase in the quantity of consuming respiring living-beings on earth decreases the amount of nutritive exergy for those living-beings, an increase in photosynthesis does not decrease the amount of exergy in the sun. Plants do not accelerate the degradation of the sun’s supply of hydrogen. Plants are, therefore, non-entropic in relation to the sun since they do not exhaust this condition of possibility. Plants are, however, entropic in and of themselves.
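Honerkamp's partition of energy into exergy and anergy can be written out explicitly; the temperatures below are illustrative round values I have supplied, not figures from the chapter:

```latex
% Any quantity of energy E splits into a convertible and a non-convertible part:
E = Ex + An
% For heat Q drawn from a reservoir at temperature T against an environment
% at temperature T_0, the Carnot factor bounds the exergy:
Ex = Q\left(1 - \frac{T_0}{T}\right)
% Illustration: solar radiation (T \approx 5800\,\mathrm{K}) received on an
% earth at T_0 \approx 290\,\mathrm{K} gives Ex/Q \approx 1 - 290/5800 \approx 0.95,
% so most of the sun-to-earth gradient arrives as exergy; the remainder is
% anergy, corresponding to waste heat.
```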
As many biochemists have pointed out, for photosynthesis to take place, more spontaneous than non-spontaneous reactions occur—Gibbs’s free energy decreases overall.55 Not only is photosynthesis inefficient (the photon energy captured is ≈11 percent), but none of the exergy produced by plants is returned to the sun.56 Plants are not perpetual motion organisms. They irreversibly consume the exergy of the sun. In many ways, then, and as both Boltzmann and Lotka point out, it is the entropic sacrifice of the sun that conditions the ecological web of entropic displacement. The ‘colossal’ reduction of the energy-gradient between the sun and the earth is the fundamental dissipative condition of possibility and impossibility of living-beings since photosynthesis relies, for the most part, on this reduction. Photon radiation is, however, not the only nutritive necessity for photosynthesis. As ecological economist Robert U. Ayres writes, ‘photo-synthetic organisms’ function ‘only with the help of organic compounds containing phosphorus,’ an element that is exhausted in the process. Thus, not only are respiring consumptive living-beings entropic,


but the entire ecological web is too since plants exhaust their condition of possibility—mineral compounds cannot be used twice. The ecological web of entropic displacement is, however, subject to evolution. In 1922, Lotka revised Boltzmann’s statement regarding the ‘struggle for entropy’ into a principle of evolution, now known as the maximum empower principle (as restated by Howard T. Odum).57 Lotka writes: ‘In accord with [Boltzmann’s] observation is the principle that, in the struggle for existence, the advantage must go to those organisms whose energy-capturing devices are most efficient in directing available energy into channels favorable to the preservation of the species.’58 What Lotka means here by efficiency and its relation to Darwinian evolution (adaptation through variation) is twofold. First, since exergy is finite, there is an evolutionary advantage to an ‘organism which is most efficient, most economical, in applying to preservative uses such energy as it captures.’ Living-beings (both within and across different species) whose energetic systems of preservation are ‘efficient’ do not exhaust their condition of possibility at the same rate as others. As such, they secure their persistence through time longer than other living-beings (variation). Second, if two organisms are equally efficient but compete for the same source of exergy, then the organism that is more efficient at capturing ‘previously unutilised sources of available energy’ may open an evolutionary niche particular to them and hence persist to exhaust a different nutritive condition of possibility (adaptation).
Certain living-beings (whether through pre-existing genetics, mutations, or symbiogenetics) are thus naturally selected and, as Jeffrey Wicken writes, ‘serve as raw material for creating something entirely new [(speciation)].’59

CONCLUSION: A FINITE ZONE OF INDETERMINATION

It is now possible to determine that living-beings (S) are (certainly) entropic and can only be entropic (P) (in so far as we know). I take this to be epistemically determinate both synchronically and diachronically. Indeed, the ecological web of entropic displacement is itself a mechanism of evolutionary difference. If this were not the case, it would be possible for living-beings to violate the second law of thermodynamics through evolution. How life is entropic is, however, indeterminate. There is a degree of freedom regarding the way in which life is entropic: life (S) could be (or become) entropic in another way (Pn). As Bergson writes in Creative Evolution, there is, therefore, a ‘zone of indetermination’ that surrounds life, one in which there is a ‘canalisation’ of accumulated ‘energy’ into ‘variables and indeterminable


directions.’60 This zone of indetermination is, however, finite since entropy is its condition of possibility and impossibility; these ‘indeterminable directions’ remain, nonetheless, entropic. Even faraway planets with novel biospheres dependent on yet-to-be-born stars are regulated by the focus imaginarius of heat death.

NOTES

1. Bernard Stiegler, The Neganthropocene (Open Humanities Press, 2018).
2. Maël Montévil, ‘Entropies and the Anthropocene Crisis,’ AI and Society (May 2021): 2.
3. Giuseppe Longo and Francis Bailly, ‘Biological Organisation and Anti-Entropy,’ Journal of Biological Systems, 17, no. 1 (2009): 63–96.
4. Isabelle Stengers and Ilya Prigogine, Order out of Chaos (London: Verso, 2017).
5. Rudolf Clausius, The Mechanical Theory of Heat (London: John van Voorst, 1867), 365.
6. Immanuel Kant, ‘Transcendental Doctrine of Method,’ in Critique of Pure Reason, translated by Norman Kemp Smith (London: Macmillan and Co, 1929), 653 [A 832; B 860].
7. Nicholas Georgescu-Roegen, ‘Afterword,’ in Entropy: A New World View, edited by Jeremy Rifkin and Ted Howard (New York: Viking Press, 1980), 269.
8. Henri Bergson, Creative Evolution, translated by Arthur Mitchell (London: Random House, 1944), 278.
9. Adair Crawford, Experiments and Observations on Animal Heat (London: Jo Johnson, 1777), 13. For a useful historical overview of the genesis of thermodynamics, see Peeter Müürsepp’s chapter in this volume (‘Irreversibility and Uncertainty: Revisiting Prigogine in the Digital Age’).
10. Crawford, Animal Heat, 13.
11. William Rankine, ‘On the Recognition of the Mechanical Energy of the Universe,’ in Miscellaneous Scientific Papers, translated by J.W. Millar (London: Charles Griffin & Co., 1881), 201.
12. William Thomson, ‘On a Universal Tendency in Nature to the Dissipation of Energy,’ Proceedings of the Royal Society of Edinburgh, 3 (1857): 139.
13. Rudolf Clausius, ‘Ninth Memoir,’ in The Mechanical Theory of Heat, edited by T.
Archer Hirst (London: John van Voorst, 1867), 365.
14. Thomson, ‘On a Universal Tendency,’ 139.
15. Ibid, 140.
16. In a similar fashion, Martin Heidegger describes ‘death’ as the ‘possibility of the impossibility of any existence at all.’ See: Heidegger, Being and Time, translated by John Macquarrie and Edward Robinson (Oxford: Blackwell Publishers, 2001), 307. Likewise, Jacques Derrida’s notion of ‘différance’ qua the condition of possibility and impossibility (a ‘quasi-’ or ‘ultra-transcendental’) of signification should also


be kept in mind. Jacques Derrida, Of Grammatology, translated by Gayatri Spivak (Baltimore: Johns Hopkins University Press, 1997), 61.
17. Thomson, ‘On a Universal Tendency,’ 141.
18. Ibid, 142.
19. Hermann von Helmholtz, ‘The Interaction of Natural Forces,’ in Science and Culture: Popular and Philosophical Essays, edited by David Cahan (Chicago and London: Chicago University Press, 1995), 19, 21.
20. Leonardo da Vinci, Leonardo Da Vinci’s Note-books, translated by Edward McCurdy (New York: Duckworth & Company, 1906), 64.
21. Helmholtz, ‘The Interaction of Natural Forces,’ 19.
22. Ibid, 19.
23. Ibid, 36.
24. Ibid, 29.
25. Ibid, 26.
26. Friedrich Lange, ‘The Scientific Cosmogony,’ in The History of Materialism Vol. III, translated by Ernst Chester Thomas (New York: Harcourt, Brace & Co, 1925), 11.
27. Friedrich Nietzsche, ‘Heraclitus,’ in The Pre-platonic Philosophers, translated by Greg Whitlock (Urbana: University of Illinois Press, 2006), 62.
28. Helmholtz, ‘The Interaction of Natural Forces,’ 30.
29. Ibid, 30.
30. William Thomson, ‘On the Age of the Sun’s Heat,’ Macmillan’s Magazine, 5 (March 1862): 388.
31. Heidegger, Being and Time, 279.
32. Helmholtz, ‘The Interaction of Natural Forces,’ 43.
33. Saul Perlmutter, Brian P. Schmidt, and Adam G. Riess, “Nobel Prize in Physics 2011 ‘for the discovery of the accelerating expansion of the Universe through observations of distant supernovae’,” The Royal Swedish Academy of Sciences.
34. Informatic entropy differs from thermodynamic entropy both theoretically and practically. See Claude E. Shannon, ‘A Mathematical Theory of Communication,’ Bell System Technical Journal, 27, no. 3 (July 1948): 379–423.
35. Ludwig Boltzmann, ‘The Second Law of Thermodynamics,’ in Theoretical Physics and Philosophical Problems: Selected Writings, edited by Brian McGuinness and translated by Paul Foulkes (Dordrecht, Boston: D. Reidel Publishing Company, 1974), 22.
36.
Sadi Carnot, Réflexions sur la puissance motrice du feu et sur les machines propres à développer cette puissance (Paris: Chez Bachelier, 1824).
37. Aristotle, ‘On Coming-to-be and Passing Away,’ in Loeb Classical Library Aristotle: On Sophistical Refutations, On Coming-to-be and Passing Away and On the Cosmos, translated by E.S. Forster (Cambridge: Harvard University Press, 1955), 169.
38. Clausius, ‘Ninth Memoir,’ 357.
39. Aristotle, ‘On Coming-to-be and Passing Away,’ 165.
40. The copula, here, predicates property not identity.


41. Ilya Prigogine, The End of Certainty (New York: The Free Press, 1997); Gilbert Simondon, Du mode d’existence des objets techniques (Paris: Aubier et Montaigne, 1958).
42. Boltzmann, ‘Second Law,’ 19. In 1898, Boltzmann writes that ‘the discovery of a satisfactory way of avoiding [heat death] would be very desirable.’ Ludwig Boltzmann, Lectures on Gas Theory, translated by Stephen Brush (New York: Dover Publications, 1964), 402.
43. Ibid, 24.
44. Charles Darwin, On the Origin of Species (Oxford: Oxford University Press, 2008), 132.
45. Boltzmann, ‘Second Law,’ 24.
46. Erwin Schrödinger, What is Life? (Cambridge: Cambridge University Press, 2013), 71.
47. Schrödinger, What is Life?, 74.
48. Ilya Prigogine and Grégoire Nicolis, Self-Organisation in Nonequilibrium Systems (New York: John Wiley & Sons, 1977), 24.
49. Joel White, ‘On Significative Exergy: Toward a Logomachics of Education,’ Educational Philosophy and Theory (2021).
50. Josef Honerkamp, Statistical Physics: An Advanced Approach with Applications (Berlin: Springer, 1998), 278.
51. Boltzmann, ‘Second Law,’ 24.
52. Yinon M. Bar-On, Rob Phillips, and Ron Milo, ‘The Biomass Distribution on Earth,’ PNAS, 115, no. 25 (June 2018): ‘≈450 Gt C; SI Appendix, Table S2),’ with bacteria at ‘≈15%’ and ‘fungi, archaea, protists, animals, and viruses’ accounting ‘for the remaining

TM = {Tp | for all p in the multiplicity M} contains the space of different fields of gradients (vector fields) acting on M. The action of any particular vector field V(p) is the flow across M generated by V.


Chapter 15

Figure 15.3 Tangent spaces (e.g., planes) over a manifold (e.g., sphere). Copyright Sha Xin Wei.

A flow is an actualization of the potential dynamic—the propensity—via the mathematical inverse to differenciation: integration.

TM
↓ π
M (at a point p)

Before moving on, we note methodologically that there are good abstractions and bad abstractions; the latter we relabel as formalisations. Good abstractions, to adopt Isabelle Stengers’s characterisation, are those that act as lures for adventurous thought lived out in a changeful, experienced world. As she put it in an essay on A.N. Whitehead’s Process and Reality: ‘[t]he aim of the abstractions that Whitehead designed is not to produce new definitions of what we consensually perceive and name, but to induce empirically felt variations in the way our experience matters.’33 Formalisations are ideal forms that stop thought and replace adventurous speculation by mechanical re-arrangements of signs. Equipped with its unique notational instruments,34 contemporary mathematics can serve either way. As a creative practice mathematics exemplifies how abstractions—mathematical entities, structures, morphisms, theorems, and proofs—act as lures for further adventurous thought.
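As a numerical gloss (a toy sketch of my own, not part of the chapter's argument), a flow can be obtained by integrating a vector field step by step; here a hypothetical rotational field on the plane stands in for a field on a multiplicity:

```python
import math

def V(p):
    # A hypothetical vector field on the plane: rigid rotation about the origin.
    x, y = p
    return (-y, x)

def flow(p, t, steps=10_000):
    # Actualize the field as a flow by explicit Euler integration,
    # the inverse operation to differenciation in the chapter's sense.
    dt = t / steps
    x, y = p
    for _ in range(steps):
        vx, vy = V((x, y))
        x, y = x + dt * vx, y + dt * vy
    return (x, y)

# Flowing the point (1, 0) for time pi/2 traces a quarter turn toward (0, 1).
q = flow((1.0, 0.0), math.pi / 2)
```

The parametrisation here happens to be clock-like, but, as the discussion below notes, nothing forces a curve's parameter to be time.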

Adjacent Possibles


In that spirit, we take a bit of care at this juncture with regard to the implicit presence of a notion of time suggested by the word ‘trajectory.’ The typical scaffolding model for these related structures is indeed extensive motion of particles across space, but even in physics when we go to the situation of the ‘movement’ of observers across spacetime, the parameter of a trajectory as a one-dimensional curve in a spacetime manifold is no longer simply the ‘time’ of a universal clock. Generally, the parametrisation of any curve in a manifold is arbitrary. Just because a structure, such as a curve, is unidimensional, there is absolutely no reason to think of its parametrisation, should it have one, as time. Under the caveat about the arbitrariness of such parametrisations, we can nonetheless characterise them as part of Chronos since they are all extensive quantities. However, each section of a vector bundle could induce a differentiation with ontogenetic power, living in the realm of Aion not Chronos.35 We turn next to elaborate this with a bit more nuance, acknowledging the boundless heterogeneity of processually developing multiplicity.

DIFFERENTIAL HETEROGENESIS

Furnished with these basic notions from Riemannian geometry, let me introduce recent work by mathematicians—Alessandro Sarti, Giovanna Citti, and David Piotrowski—who have constructed a model of what they call differential heterogenesis,36 inspired by Deleuze and Guattari, and Gilbert Simondon. They begin with Deleuze and Guattari’s adaptation of fibre bundles via Albert Lautman in the last chapter of A Thousand Plateaus, and proceed motivated by Jean Petitot’s, and independently Elias Zafiris and Michael Epperson’s, more explicit adoption of fibre bundles as a way to register sense.37 Here a section of the bundle is a mapping from each monad or perspectival locus p into a corresponding ‘fiber’ Tp of percepts and observations relative to that p.
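A minimal sketch (my own illustration; all names are hypothetical) of this classical homogeneous picture: every fiber over the multiplicity is a copy of one fixed model space, and a section assigns to each perspectival locus one element of its fiber.

```python
# Finite stand-in for the multiplicity M: three perspectival loci.
base_points = ["p1", "p2", "p3"]

# Each fiber T_p is isomorphic to the same model space: pairs of floats
# standing in for percepts/observations relative to p.
fiber_values = {
    "p1": (1.0, 0.0),
    "p2": (0.0, 1.0),
    "p3": (0.5, 0.5),
}

def section(p):
    # A section s maps each base point p to (p, v) with v in the fiber over p.
    return (p, fiber_values[p])

# The "graph" of the section: one fiber element chosen over every locus.
graph = [section(p) for p in base_points]
```

The point of the sketch is the uniformity: however heterogeneous the base, every fiber value lives in one and the same model space.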
Across the sites/points of view p in the multiplicity, each fiber Tp associated with the base-point p is isomorphic to a common, fixed space. Thus, even though the multiplicity itself, the magma of bio-social semiotic materials, may be heterogeneous, in the classical model of A Thousand Plateaus there is but one model space of potential, a homogeneous model of differentiation. As Muriel Combes pointed out in her book Gilbert Simondon and the Philosophy of the Transindividual, Simondon starts by orienting to individuation rather than an a priori principle of individuation.38 It is in this same spirit that rather than formalising a principle of differentiation, we draw inspiration from the notion of differential operators generating gradients from the multiplicity. One insight synthetic philosophy can draw from contemporary mathematics is that G.W. Leibniz’s differential ‘dx’ which inspired Deleuze’s


‘difference in itself’ is just one of a whole infinite space—an algebra—of differential operators defined relative to the multiplicity. Indeed Deleuze glimpsed this infinite algebra: Every time we find ourselves confronted or bound by a limitation or an opposition, we should ask what such a situation presupposes. It presupposes a swarm of differences, a pluralism of free, wild or untamed differences; a properly differential and original space and time; all of which persist alongside the simplifications of limitation and opposition. A more profound real element must be defined in order for oppositions of forces or limitations of forms to be drawn, one which is determined as an abstract and potential multiplicity. Oppositions are roughly cut from a delicate milieu of overlapping perspectives, of communicating distances, divergences and disparities, of heterogeneous potentials and intensities. Nor is it primarily a question of dissolving tensions in the identical, but rather of distributing the disparities in a multiplicity.39

Recall that the model of the differential on the level of the potential field to which Deleuze and Guattari appealed has a uniformity: every fibre is isomorphic, enabling and requiring that the differential operators be uniformly elliptic, uniformly parabolic, or uniformly hyperbolic. What Sarti, Citti, and their collaborators have built is an account of differential heterogenesis that better adequates Deleuze and Guattari’s magmatic heterogeneity: A different point of view has been recently considered by Hörmander . . . and Rothschild and Stein. . . . They introduced a class of operators that are degenerate, since they are defined on a differential structure that can have different behaviour from one point to an other. . . . Here we introduce a multiplicity of operators (Ap_i), i = 1, 2, · · ·, that are different from a point to the other by removing the assumption that all the operators have the same formal expression.40
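A toy discretisation (entirely my own construction, not Sarti and Citti's model) makes the idea concrete: over a sampled one-dimensional multiplicity, each point carries its own formal operator rather than one uniform expression over the whole field.

```python
def laplacian(u, i, h=1.0):
    # A diffusive (parabolic-type) operator at point i.
    return (u[i - 1] - 2 * u[i] + u[i + 1]) / h**2

def advection(u, i, h=1.0):
    # A transport (hyperbolic-type) operator at point i.
    return -(u[i + 1] - u[i - 1]) / (2 * h)

def heterogeneous_step(u, operators, dt=0.1):
    # One evolution step in which each interior point p_i evolves under its
    # own operator A_{p_i}; the boundary values are held fixed.
    return [u[0]] + [
        u[i] + dt * operators[i](u, i) for i in range(1, len(u) - 1)
    ] + [u[-1]]

# A spike of potential on a five-point multiplicity, with diffusion acting
# at points 1 and 2 but transport acting at point 3: the operator family
# differs from point to point, as in differential heterogenesis.
u = [0.0, 0.0, 1.0, 0.0, 0.0]
ops = {1: laplacian, 2: laplacian, 3: advection}
u1 = heterogeneous_step(u, ops)
```

The degenerate, point-dependent family is what permits the shocks and discontinuities discussed below alongside smooth diffusion.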

Removing the condition of same or uniformly bounded formal expression replaces and extends the Leibnizian ‘dx’ to which Deleuze refers in Difference and Repetition by a boundlessly heterogeneous milieu of potential ‘divergences and disparities.’ This profoundly enriches the notion of difference-in-itself. There are two large practical implications to their approach to differential heterogenesis. First, they employ a technical device of ‘lifting’ which in effect forms the union of formally distinct differential operators into a uniform space of higher dimension that contains all the potential divergences in play over the actual multiplicity. If the multiplicity is locally compact then their approach can be carried out in a bounded way.41 However, this lifting can be a strong supposition for general unbounded, non-compact multiplicities. Second, as Sarti and the co-authors themselves point out, the


degeneracy of the differential operators allows for shocks and discontinuities. So we have smoothness, striation, and rupture cheek by jowl.

We return to the empirical work of experimental experience and experiments in whole experience. Architects specify the physical forms, materials, physical proportions, and relations of a built environment: they more or less precisely condition but do not determine the actions of those who inhabit the environment. Just as an architect can specify the acoustics of an environment which sets the potential response to any particular sonic activity, rather than determine particular fields of sound, the composer/designer of an augmented environment can, instead of pre-specifying the activity of the environment’s inhabitants, hint, tint, tune the potential response of the environment to arbitrary activity that has not been foreseen by the designer. In other words, we condition but do not determine the open and indeed indeterminate range of activities.42

Think of building a floor raked at a certain small angle, say five degrees. The raked floor does not, cannot, determine the open-ended range of activities that may occur on it—people can run across it, sit in a circle to talk, change the diapers on a baby, light a fire, build further structures atop the floor, just as with a level floor. However, all their activities will unfold differently because their relations to gravity and to ground are different than if they were on a level floor.

Synthesis and its predecessor, the Topological Media Lab, were built to invent techniques for the experimental, experiential study of sense-making collective gesture and movement, where the very manner of experimentally conditioning experience was itself subject to institutional and methodological variation. This reflexively abductive approach acknowledged the irreducible indeterminacy of collective relational experience, ever-developing technology and ever-developing method.
We conducted these experiments always in the form of live events in order to focus on the processuality of experience. We adapted techniques from the most highly

Figure 15.4 ‘Lifting,’ from Alessandro Sarti, Giovanna Citti, and David Piotrowski, ‘Differential Heterogenesis and the Emergence of Semiotic Function,’ 13.


developed of human forms of modulating event: performing arts. We employed real-time responsive media—in lighting, video, sound, air currents, and water—in order to palpably modulate events for people in non-predetermined embodied engagements with their ambient and each other. And we employed computational media not to reproduce the event, but to precisely and reproducibly vary the conditions of experience in an event, while leaving visitors as free as possible to do whatever they felt like doing without any a priori schema.43 The Synthesis media choreography system enables this mode of composition for a wide range of improvisatory activity in environments whose latent responsivity—what one might call evental acoustics—can evolve according to contingent, ad hoc activity of people, media, and things, as well as to prior design.44 The designer composes overlapping regions of potential states of events, glossed by metaphorical labels of their mutable choice. This indeterminacy of differentiation of possible topologies of states is a wholly other order from the differenciation of actualities according to a specific topology of states; in Deleuzian terms, differentiation is the virtual part of the real. These topologies of states can be created for animating improvisatory occasions using machines for media choreography created at the Topological Media Lab and refined at Synthesis.45 Although they were posed as technologies for animating playspaces, these evental forms, techniques, and technologies for media choreography were designed from the very first as palpable experiments in collectively enacting life-affirming occasions of sense-making alternative to both the object-oriented logics of deterministic algorithms and the equally meaningless stochastic distributions of random parameters.
A key feature of these technologies is that they condition but do not determine events; indeed, they can reproduce experiential conditions to any possible degree of precision, yet every occasion is nonetheless ad hoc and unique. One can arrange every piece of furniture, every pen and paper, every object in a conference room identically for a series of diplomatic meetings, yet the course of negotiation in each occasion can unfold uniquely and even radically differently depending on contingent differentiations of sense.

HAPTICALITY AND ONTOGENESIS

How could such technologies for the joint articulation of sense in open-ended occasions enable the play of ethico-aesthetic and political relations, especially among differential fields—more precisely spaces of differential operators—over non-congruent, even disproportionate frames of reference? Bordeleau summarises Giorgio Agamben’s concept of destituent power as ‘a political practice that calls out the contingent dimension and arbitrariness of


government actions,’ a practice requiring hapticality—the capacity to feel the world through others. Citing the Invisible Committee, Bordeleau elaborates that this ‘haptic or processual mode of perception’ is “a capacity for ‘perceiving a world peopled not with things but with forces, not with subjects but with powers, not with bodies but with bonds’.”46 This turn from things, subjects, and bodies to forces, powers, and bonds motivates the approach we have taken in this chapter to indeterminacy and ontogenesis. Rather than pre-state an a priori schema of bodies, subjects, and things, to which we can apply indifferent norms distinguishing good entities from bad entities, good actions from bad actions, if we focus on differentials of relational bonds, powers, and forces, we can make more palpable certain propensities or enabling constraints that condition but do not determine ontogenetic process, the sine qua non of life. Every dynamical field of forces co-constitutes the entities whose differentiated heterogenesis constitutes the field. In this sense one can not only feel the world through others, but also feel others through the world: a dynamical, more-than-human and differently humane hapticality.

NOTES

1. Erik Bordeleau, ‘Abstracting the Commons?’ Common Conflict (2016): note 6.
2. Invisible Committee, To Our Friends, translated by Robert Hurley (Cambridge, MA: Semiotext(e)/MIT Press, 2015), 33–34.
3. Kurt Gödel proved in 1931 the incompleteness of any consistent axiomatic theory—in particular any instantiation of a mechanical algorithmic (computational) procedure—that contains at minimum the arithmetic of integer addition and multiplication. This means that in any mathematical theory containing arithmetic there are statements that can neither be proven nor disproven in the theory. This incompleteness is itself a theorem.
(Kurt Gödel, ‘On Formally Undecidable Propositions of Principia Mathematica and Related Systems I,’ in From Frege to Gödel: A Source Book in Mathematical Logic, 1879–1931, edited by Jean van Heijenoort [Cambridge, MA: Harvard University Press, 1967 (1931)], 596–616). Mathematics and meta-mathematics are sub-sets of all human thought. Therefore, incompleteness is inextricably part of human thought, in fact at the core of even the most “purified” systems of reasoning—mathematical proof, which is a very particular part of the diversity of human thought.
4. This takes the middle way between correlationism and anti-correlationism. Giuseppe Longo, Ontogenesis Lectures, 2017 Synthesis ASU, video, https://vimeo.com/showcase/4500874.
5. Synthesis @ ASU, https://vimeo.com/synthesiscenter/demo.
6. Giorgio Agamben, ‘What is a Destituent Power?,’ translated by Stephanie Wakefield, Environment and Planning D: Society and Space, 32 (2014): 65–74, in particular 71.


7. I thank Xiang Fan for this formulation, and recall A.N. Whitehead’s discussion of presentational immediacy in the more general setting of Process and Reality: For the organic theory, the most primitive perception is ‘feeling the body as functioning.’ This is a feeling of the world in the past; it is the inheritance of the world as a complex of feeling; namely, it is the feeling of derived feelings. The later, sophisticated perception is ‘feeling the contemporary world.’ Even this presentational immediacy begins with sense-presentation of the contemporary body. The body, however, is only a peculiarly intimate bit of the world. Just as Descartes said, ‘this body is mine’; so he should have said, ‘this actual world is mine.’ My process of ‘being myself’ is my origination from my possession of the world. (Alfred North Whitehead, Process and Reality [New York: Free Press, 1978], 81)
8. For related points, see Müürsepp’s chapter in this volume.
9. Space precludes extended recapitulation of digital computer representation on which all algorithmic technology acts. But it is vital to recall that in Claude Shannon’s urtext in which he defines information for discrete signals—the ‘bit’—he states: ‘The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem’ (Claude E. Shannon, ‘A Mathematical Theory of Communication,’ Bell System Technical Journal, 27 [July and October 1948]: 379). Shannon saw that his mathematical definition of digital information explicitly excluded what many computer scientists and media theorists later presumed could be represented digitally: meaning.
10. Daniela Voss, ‘Deleuze’s Rethinking of the Notion of Sense,’ Deleuze Studies, 7, no.
1 (2013): 17. 11. Ibid, 18. 12. Ibid, 22. 13. This in fact reflects a structural dichotomy that appears in any differentiable dynamical system in general: a sub-manifold in the space of trajectories passing through an arbitrary initial point in which trajectories with nearby initial data tend to diverge, and another disjoint sub-manifold in which trajectories with nearby initial data tend to converge. 14. Peter Grassberger, ‘On the Hausdorff Dimension of Fractal Attractors,’ Journal of Statistical Physics, 1 (1981); Steven H. Strogatz, Nonlinear Dynamics and Chaos, second edition (Boulder, CO: Westview Press, 2015). 15. One of the most important theorems in probability concerning random variables is the Central Limit Theorem: Suppose Xn is a sequence of independent and identically distributed random variables with mean c and variance s². Then as n approaches infinity, the standardised sample mean √n(X̄n − c)/s converges in distribution to a standard normal distribution. What is extraordinary is that this holds regardless of the distribution of those random variables, and thus independently of whatever process generated them. Patrick Billingsley, Probability and Measure, third edition (New York: Wiley, 1995), 357.
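The distribution-independence claimed in note 15 can be checked numerically. The following is a minimal simulation sketch (an editorial illustration, not part of the original apparatus; the sample sizes and the Bernoulli source are arbitrary choices), using only the Python standard library:

```python
import random
import statistics

def standardized_means(draw, n, trials, mean, sd, seed=1):
    """Return sqrt(n) * (sample_mean - mean) / sd for `trials`
    independent samples of size n, each drawn with `draw`."""
    rng = random.Random(seed)
    zs = []
    for _ in range(trials):
        xbar = sum(draw(rng) for _ in range(n)) / n
        zs.append((xbar - mean) * (n ** 0.5) / sd)
    return zs

# A markedly non-normal source: fair coin flips (Bernoulli 0.5),
# with mean 0.5 and standard deviation 0.5.
z = standardized_means(lambda r: float(r.random() < 0.5),
                       n=400, trials=2000, mean=0.5, sd=0.5)

# If the CLT holds, the standardized means should centre on zero and
# roughly 68% of them should fall within one unit of zero, as for a
# standard normal distribution.
centre = statistics.mean(z)
share_within_one = sum(abs(v) <= 1.0 for v in z) / len(z)
```

Swapping the lambda for any other distribution with finite variance (and adjusting `mean` and `sd` accordingly) leaves the shape of the result essentially unchanged, which is the point of the theorem.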

Adjacent Possibles


16. Stuart Kauffman, ‘Beyond Physics: The Emergence and Evolution of Life,’ Institute for Systems Biology, slides for talk at CERN, 29 March 2017. (Personal communication, August 2020.) 17. Ibid. 18. Stuart A. Kauffman develops the ‘adjacent possible’ extensively in Reinventing the Sacred (New York: Basic Books, 2008), especially 64–65. 19. Ibid. 20. Ibid, slide 41. 21. Stephen Jay Gould and R.C. Lewontin, ‘The Spandrels of San Marco and the Panglossian Paradigm: A Critique of the Adaptationist Programme,’ Proceedings of the Royal Society of London, Series B, Biological Sciences, 205, no. 1161 (1979): 584. 22. Anthony Dunne and Fiona Raby, Speculative Everything: Design, Fiction, and Social Dreaming (Cambridge, MA: The MIT Press, 2013), 4. 23. Natasha Jen, ‘Design Thinking is Bullsh*t,’ 99U Conference (2017), https://99u.adobe.com/videos/55967/natasha-jen-design-thinking-is-bullshit. 24. Gilles Deleuze, Difference and Repetition (London: Continuum, 2004), 209. 25. Ibid. 26. Ibid, 206. 27. Ibid, 244. 28. Ibid, 207, emphasis added. 29. Michael Douma, ‘What is Refraction?’ (2008), http://www.webexhibits.org/causesofcolor/13A.html. 30. For a mathematical source on Riemannian manifolds, and more generally on topology and differential geometry, see B.A. Dubrovin, A.T. Fomenko, and S.P. Novikov, Modern Geometry—Methods and Applications (New York: Springer-Verlag, 1984). For a philosophical approach, see Arkady Plotnitsky, ‘Manifolds: On the Concept of Space in Riemann and Deleuze,’ in Virtual Mathematics: The Logic of Difference, edited by Simon Duffy (Clinamen Press, 2006), 187–208. 31. Throughout this chapter, I will use the non-standard spelling of differenciation to align mathematical notions of differentiation to Deleuze’s concept, which has been translated into English spelled with a ‘c’ to distinguish it from differentiation, which is an operator of the virtual. 32. 
A semi-colloquial description of classical Riemannian geometry: A differentiable manifold M of dimension n is a set that is locally diffeomorphic to a domain in a Euclidean space R^n. It has local geometry as defined by metric curvature, which is not necessarily globally uniform. A vector field on the manifold M is an assignment to each point of M of a vector in a given vector space V. In the canonical case, V is the tangent space, with dimension n. Informally one can think of V as the linear approximation to M. A more supple approach is to think of V as the span of differential operators corresponding to partial differenciation along a maximal net of independent directions in the manifold, defined at each point. It is a theorem that these two formulations are equivalent. 33. Isabelle Stengers, ‘A Constructivist Reading of Process and Reality,’ Theory, Culture & Society, 25, no. 4 (2008): 96.
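The two formulations in note 32 can be set side by side. The following compact rendering is an editorial sketch in standard differential-geometric notation (the symbols and coordinate conventions are the usual textbook ones, not taken from the text itself):

```latex
% A vector field as a section of the tangent bundle:
X : M \to TM, \qquad X(p) \in T_pM \quad \text{for each } p \in M.

% The same vector field as a derivation, written in local
% coordinates (x^1, \dots, x^n) around p:
X = \sum_{i=1}^{n} a^i \, \frac{\partial}{\partial x^i},
\qquad
(Xf)(p) = \sum_{i=1}^{n} a^i(p) \, \frac{\partial f}{\partial x^i}(p).
```

The equivalence amounts to identifying the tangent vector X(p) with the directional-derivative operator it induces on smooth functions at p.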


Chapter 15

34. A notation is more than a representation, but also not identical with material—energetic | affective—configurations that it is used to articulate. The key here is that a notation is used by a living intention to shape, in-form, material. Think of how a musical score is used by performers, or a recipe is used by cooks, or algebraic symbols are used by mathematicians. See chapter 2 in Sha, Poiesis and Enchantment in Topological Matter (2013). 35. Daniela Voss elaborates: “Chronos is defined as the time of the present which encompasses past and future as horizons relative to the present. According to Deleuze, Chronos designates the empirical or physical aspect of time, insofar as Chronos captures the physical changes in things, their interactions and mixtures. Aion, on the contrary, is defined as a ‘virtual time’ that slips away from the present by extending indefinitely into the past and the future. Aion is the time of pure events” (Voss, ‘Deleuze’s Sense,’ 15). 36. Alessandro Sarti, Giovanna Citti, and David Piotrowski, ‘Differential Heterogenesis and the Emergence of Semiotic Function.’ 37. Gilles Deleuze and Felix Guattari, ‘1440: The Smooth and the Striated,’ in A Thousand Plateaus: Capitalism and Schizophrenia, translated by Brian Massumi (Minneapolis: University of Minnesota Press, 1987), 474–500; Jean Petitot, ‘Morphological Eidetics for Phenomenology of Perception,’ in Naturalizing Phenomenology: Issues in Contemporary Phenomenology and Cognitive Science, edited by Jean Petitot, Francisco J. Varela, Jean-Michel Roy, and Bernard Pachoud (Stanford: Stanford University Press, 1999), 330–71; Michael Epperson and Elias Zafiris, Foundations of Relational Realism. 38. Muriel Combes, Gilbert Simondon and the Philosophy of the Transindividual (Cambridge, MA: MIT Press, 2013), 2. 39. Gilles Deleuze, Difference and Repetition (London: Continuum, 2004), 50. 40. 
Alessandro Sarti, Giovanna Citti, and David Piotrowski, ‘Differential Heterogenesis and the Emergence of Semiotic Function,’ Semiotica, no. 230 (2019): 9. 41. Compactness is a topological property of a manifold defined by the requirement that every open cover has a finite subcover. This is a very powerful property. Given some minimal additional structure (in Euclidean space R^n, for instance), the Heine–Borel theorem shows that compactness is equivalent to being closed and bounded. Beyond its prolific precise mathematical uses, the concept of compactness distills an essential capacity: to be able to derive statements about regions or even wholes from statements, observations, facts, or qualities about purely individual loci. In Genesis, Michel Serres wrote: ‘The arithmetic of whole numbers remains a secret foundation of our understanding: we’re all Pythagoreans. We think only in monadologies. Nevertheless, we are as little sure of the one as of the multiple’ (Michel Serres, Genesis [Ann Arbor: University of Michigan Press, 1995], 3). The topological concept of compactness equips the beginning of a response to Serres’s challenge. 42. Indeterminate means something profoundly different from infinite. The set of integers is infinite and ordered. More fundamentally, the membership of a set is not contingent: the truth of ‘p is a member of set S’ is independent of when I ask this question, or of how I feel about it, or my relation to you, or whether I have asked this question before. However, in his talks, Stuart Kauffman likes to ask of his audience:



‘Think of all the uses of a hammer,’ pause, and then ask ‘Can you feel your mind going blank?’ As we put this question to more people, we obtain more different responses, and more types of responses, with no a priori limit. The range of possible uses of a hammer is not a set: identifying one use does not constrain what another use could be. There is no definitive rule by which we can generate all uses. And there is no successor function such that, given one use, we can determine a successor use. By a fundamental theorem of set theory (equivalent to the axiom of choice), any set whatsoever can be well-ordered. A well-ordered set has a successor function, and a successor function for a set induces an ordering. Therefore if the uses of a hammer have no successor function, then they do not form a set. 43. See videos of the TML research-creation: vimeo.com/tml/research2007 and examples of some techniques in synthesiscenter.net/techniques. 44. Synthesis, video of experiential experiments as improvisatory events in responsive environments: vimeo.com/synthesiscenter/demo. 45. Sha Xin Wei, ‘Theater without Organs: Co-Articulating Gesture and Substrate in Responsive Environments,’ in Living Architecture Systems Group White Papers, edited by Philip Beesley and Ala Roushan (Waterloo, Ontario: Riverside Architectural Press, 2016), 276–91; Brandon Mechtley, Julian Stein, Connor Rawls, and Sha Xin Wei, ‘SC: A Modular Software Suite for Composing Continuously-Evolving Responsive Environments,’ in Living Architecture Systems Group White Papers, edited by Philip Beesley, Sascha Hastings, and Sarah Bonnemaison (Kitchener, Ontario: Riverside Architectural Press, 2019), 197–206; Synthesis state play test, https://vimeo.com/synthesiscenter/stateengine. 46. Erik Bordeleau, ‘Abstracting the Commons?’ Common Conflict (2016): 4.

Epilogue

Schrödinger’s Spider in the African Bush: Coping with Indeterminacy in the Framing of Questions to Mambila Spider Divination

David Zeitlyn

The background to this epilogue is a large and developing literature on uncertainty. Mary Douglas was a key figure in giving the concept purchase in qualitative studies. As she argued, ‘the splendid thing about indeterminacy for anthropology is that our arcane problems about other people’s thought suddenly become common to us all as human beings.’1 For Douglas, we are all ‘creatures that live in uncertainty,’ while ‘we cope with uncertainty as best we can, we go on seeking certainty. We create institutions that protect our valued ideas. We use analogies to build them up like a house of cards . . . with a few central ideas holding them in place like a roof.’2 The currency of such approaches can be seen in Sandra Calkins’s more recent use of them: uncertainty is ‘a lived experience, an unease about acting in view of an unpredictable future . . . a rendering of realities, which can lead to innovations and creative solutions, but also can debilitate people through fear or unease.’3 While uncertainty is ‘an element of all action, because outcomes are always unknown,’ it is not ‘evenly distributed across time and space’; it is not ‘a uniform property of action.’4 When framing questions for Mambila spider divination (ŋgam dù: a form of oracle) in Cameroon, both clients and diviners may come close to determinacy but by and large avoid it, except when talking in the abstract to impertinent anthropologists or similar. In ŋgam dù divination, binary questions


are posed to a spider (or crab) in a hole in the ground. The alternatives are associated with a stone and a stick placed on either side of the hole, which is covered by some marked leaf cards. After asking the questions, the area near the hole is covered with a pot and those present wait for the spider to emerge. When the spider comes out it disturbs the cards and the pattern of these cards relative to stone and stick is interpreted to answer the question posed. Mambila divinatory practice includes repeated reformulations of questions as problems and possible solutions are considered and refined in the light of other results in the same session. Hence divination provides a means to deal with or accommodate real world indeterminacy, providing paths to action, resolving some of the aporias of the instant with sanctioned advice: the spider says ‘do this!’ It acts as a boundary object, something that diviner and client can concentrate on while discussing important and sometimes contentious decisions beset with great uncertainty. Life histories of sample individuals show that even some diviners either do not consult or do not follow the advice given. This puts a different interpretative frame on the determinacy that can be easily elicited. Mambila people talk determinacy while walking cautiously, acting to implicitly maintain indeterminacy, and to keep futures open. There are several large and conflicted literatures on determinism, others on uncertainty, and still others on time and futurology. Just on determinism there is a literature in theology, a literature in the philosophy of science, and another one in the philosophy of history. Writing as an anthropologist, my starting point is a second order one, as alluded to in the opening quotations. In other words, my starting point is in discussions of how we should think about how different or other people think about determinism and indeterminacy, rather than about the topics themselves. 
That difference can be a fine one but is important nonetheless. So my starting point is with work such as that of Rebecca Bryant and Daniel Knight5 writing on the anthropology of the future, which includes a chapter on fate. This sort of semi-detached discussion forms a setting for the following. As a way of helping readers consider the issues from a somewhat different vantage point, let me start by briefly mentioning two common metaphors for determinism: that fate has been written in stone, or simply that ‘it is written.’ As responses to these metaphors, I note first that what is written in stone may not be immutable. For all that something has been chiselled into a stone slab, nonetheless the words may change. The narcoterrorist Pablo Escobar illustrates this. He started his criminal career as a grave robber; more accurately, he started as a gravestone robber: stolen gravestones could be sanded down to erase the inscription, ready to be resold as new and to receive a new inscription. Moral: what is ‘written in stone’ may not be forever. The second metaphor was ‘It is written.’ It may be written—but what is written may be


rewritten, it may be unintelligible or a palimpsest. Consider some fragments of writing from two thousand years ago (Figure E.1).

IT IS WRITTEN (MORE THAN TWO THOUSAND YEARS)

‘It’ may well be written but what we have to read may be unintelligible: think of the Linear A script (still undeciphered in 2022, mainly for lack of material to analyse). Alternatively, it may resemble the example in Figure E.1, where the writing we want to read is the scratch marks of different episodes of writing in a Roman wax tablet case or frame (sometimes the writer pressed too hard and scratched the wood beneath the wax). When people say ‘it is written,’ I don’t think they are thinking of such cases. Nonetheless it may be an accurate metaphor for how writing captures but yet does not capture the future: our futures (with an emphasis on the plural) may be written but that writing may be unintelligible (strictly: illegible). We may always be dealing with a palimpsest (especially if the medium on which ‘it’ is written may have been a hastily erased gravestone). This is to unsettle and add some indeterminacy to conventional thinking about determinate futures or about present knowledge of determinate futures.

Figure E.1. Stylus tablet 836, one of the most complete examples excavated at Vindolanda. From Melissa Terras, ‘Towards a Reading of the Vindolanda Stylus Tablets: Engineering Science and the Papyrologist,’ Human IT: Journal for Information Technology Studies as a Human Science 4, no. 2–3 (2000), http:​//​www​.hb​.se​/bhs​/ith​/23​ -00​/mt​.htm. Used with kind permission of The Vindolanda Trust.


There is an epistemological question and a separate ontological question about the future: what we know, or can know, about future things (futures) at the ‘time of knowing’ may be of varying status, and those statuses may shift over time. This is different from whether the entire history of the universe has always been decided, laid out or charted as it were, with events following their ineluctable course. Some followers of some Semitic religions hold such ontological views, with which I will leave those in theology to engage. My concerns are more parochial and down to earth: how ordinary, non-expert people go about making everyday but, for them, important decisions, and how in the course of such decision-making they sometimes consult divination, oracles, and other forms of occult technologies.

AMBLING INTO DIVINATION (OR: AFTER THE PREAMBLE COMES THE AMBLE)

I study divination as practised by Mambila people in Cameroon.6 Of the many different forms of divination available to them, they consider most reliable a form of divination known as spider divination (ŋgam dù). Variants of this type of divination are found among a wide range of ethnic groups in southern Cameroon and also in eastern Nigeria. In ŋgam dù divination binary questions are posed to a spider (or crab) that lives in a hole in the ground. The binary alternatives are associated with a stone and a stick placed near the spider hole, which is covered by marked leaf cards. After covering the area with a pot and waiting for the spider to emerge, the pattern of the cards relative to stone and stick is interpreted to answer the question posed.

Figure E.2 Palm tree cards: positive and negative (approximately actual size), 2022. Copyright David Zeitlyn.


The cards have symbols cut into them, but these are only rarely mentioned in the Mambila form of spider divination (in the spider divination practised among neighbours of Mambila, the meanings of the cards seem more important). The divination leaf-cards used are obtained from a shrub7 whose leaves are doubled over, pressed flat, and then stored over a fire. A template is used to cut the outline shape which is common to all the cards, then ideograms are incised into the cards with a razorblade. Mambila divination cards are similar to those illustrated by the missionary Paul Gebauer who worked with the neighbouring Yamba people (then known as Kaka).8 The cards are all of the same form9 and are similar to those found further south in Cameroon (as discussed by Paul Gebauer, and others cited in the next endnote). Each ideogram occupies two cards, appearing on one card only on the left of the central rib (negative) and on the other on both sides of the rib (positive). The ideograms are called ŋgə̀ə, which also means ‘symbol.’ For instance, a card with one palm tree symbol refers to a problem associated with palms (such as the danger of falling while harvesting palm-nuts from a tall tree), while the card with two palm tree symbols means that something

Figure E.3 Divination setup, 2022. Copyright David Zeitlyn.


positive such as a good harvest is to be expected from palm trees. A diagrammatic version of an example pair of cards is shown in Figure E.2. Several sets of cards were collected and others documented in the field. Of the eight sets fully documented, none had more than thirty-eight different ideograms, which means they are less varied than sets from other groups in Cameroon, discussed and comprehensively illustrated elsewhere.10 Table E.1 lists the names and meanings of a typical set of divination cards. In addition to these cards, each set has a blank card at top and bottom. However, as already noted, the meanings of the cards are only rarely referred to in the course of Mambila ŋgam dù divinatory practice, so they are not further discussed here. Cards are kept in holders (kup ŋgam) made from raffia pith. These are also similar to those illustrated in Gebauer’s book. Often several sets of cards are kept in one holder and used in the simultaneous consultation of different spiders. The standard interpretations of the placing of the cards once the spider has emerged can be seen in Figure E.4 and the photos of actual instances (Figure E.5). The question for divination is posed aloud: a small stone held in the diviner’s right hand is tapped on the pot following the rhythm of his speech, which is often muttered. I was told that actual vocalisation is unnecessary. Moreover, when I stumbled over the phrases in Jù Bà (the Mambila language) I was told that I could speak English, and divination would understand. Sometimes the Fulfulde language (a regional lingua franca) is used in consultations by non-Mambila clients. Whatever question is at issue, and whatever language is used, questions follow a fixed schema: a binary choice allows two possible responses, one associated with the stick and one with the stone. The general form of a question is as follows:

My divination, you shape-changer, you witch, if XX then take the stick, my divination.
No, it is not that, not-XX/YY/divine further, then take/bite the stone, my divination.

Mambila text: Ŋgam mò, wò fum, wò sar ‘XX,’ wò sie tuú, ŋgam mò. Sam ŋgwə, ‘XX’ -sam/‘YY’/mbɔ̀ mbɔ̀, wò sie/numa ta, ŋgam mò.

The choice given is between one option (XX) and either its direct negation (not-XX) or an element from its contrast set (YY), which may be more or less precisely specified. However, we should note that it is common to offer the vague alternative ‘mbɔ̀ mbɔ̀’ (divine further), which always has a negative connotation. In this context further divination is about a ‘problem,’ something bad.

Table E.1 A typical set of Mambila divination card names and their meanings.

Animal: Referring either to hunting or to animal husbandry.
Belly (pregnancy): Pregnancy, and sometimes its cause, often provide reasons to divine.
Chicken feather: As well as referring to chickens, which are vulnerable to predation and disease, this can refer to the use of chickens in ritual.
Chief: The chief presides over the village court and symbolises authority; by extension this card can refer to authorities outside the village. (The negative version is sometimes called Marenjo: the senior ranking woman.)
Crossroads: Classically symbolising choice. Some rituals are performed at crossroads outside the village, where the roads are exhorted to carry malign influences away.
Crying eyes: Sadness and worry.
Divination leaf: This has a strip of another divination card leaf inserted. Self-referential, it is taken to confirm or refute the results of the divination session in which it figures.
Fufu preparation: Fufu (maize porridge) is the stereotype of food. Food preparation is regarded as women’s work.
Hands (that give): Gifts such as those accompanying a marriage.
Hilltop: This may refer to climbing up the escarpment to Nigeria, or to Adamaoua.
Journey: Journeys are dangerous and often divined about.
Maize: This is now the stereotypical farming crop, so may be taken to refer to all farm work.
Medicine cooking stones: Most traditional remedies have special fireplaces reserved for them. These cards can refer to any traditional treatment.
Moon (evening): Contrasting with the card for sun, or used to refer to months rather than days.
Palm tree: Referring either to the general state of the palm harvest or to the dangers of harvesting palm nuts by climbing tall trees.
River crossing: Moving from one territory to another. Rivers formed traditional frontiers.
Sister’s son or women’s clothing: From the point of view of male diviners both refer to affinal relations: closely related but somewhat distanced.
Sorghum: Once the stereotypical farming crop (now replaced by maize). Now grown only for use in rituals.
Spider (divination): Referring to other types of divination, as well as spider divination.
Suàgà stick: This stick is used in sùàgà blessing rituals performed following adultery accusations.
Sun (day): Contrasting with the card for moon, or referring to days rather than months.


Figure E.4 Basic result patterns, 2022. Copyright David Zeitlyn.

The opening phrase can be extended to include other sorts of witches and idioms for witchcraft, thus becoming a list of possible sources of danger. The spider is sometimes described as being a witch: ‘how could it otherwise know about witchcraft?’ (In other words it is held that ‘it takes one to know

Figure E.5 Two unusual results in which the cards are propped up on each other, 2022. Copyright David Zeitlyn.


one.’ When I asked about this, I was reminded that people who have inherited witchcraft have ‘open eyes,’ and can detect witches without necessarily practising witchcraft themselves.) Once the question has been put, the lid is replaced and the diviner(s) wait for ten or fifteen minutes at some distance (e.g., under shade ten or twenty metres away) to allow the spider to emerge and disturb the cards, thus giving its answer. Often another pot is inspected and further questions are put while the answer from the first pot is awaited, so a set of parallel questions may be operated. This provides a consistency check on the veracity of the divination. A new line of questioning is marked by the diviner breaking a twig or a piece of grass and throwing away the fragments as he states that he will ask a fresh question, and that divination is to follow that (rather than remaining focused on previous questions).

LOGGING QUESTIONS AND ANSWERS

As we have seen, Mambila divinatory practice in spider divination consists of repeated reformulations of questions as problems and possible solutions are considered and refined. In this way divination provides a means to deal with or accommodate real world indeterminacy, by providing paths to action, resolving some of the aporias of the instant with sanctioned advice: the spider says ‘do this!’ Life histories of some sample individuals show that even some diviners either do not consult or do not follow the advice given. This puts a different interpretative frame on the determinacy that can be easily elicited. As I have said, Mambila people tend to talk determinacy while walking cautiously, acting in ways that implicitly maintain indeterminacy; in other words: keeping the future open. With the help of a family of diviners who kept a record of the questions asked by their clients over a period of years, I have compiled a dataset of more than six hundred cases of Mambila divinatory consultation.
This allows us to summarise the pattern of Mambila concerns, the distribution of question types revealing a sociology of troubles and uncertainties. It also allows us to examine the boundary between anticipation and prediction, between people’s attitudes to the near and the far future. Mambila divination clients tend to bring questions about immediate problems, things that can be clearly anticipated, rather than longer-term predictive concerns. For example, a client from a neighbouring village asked (on August 29, 2012) about taking professional examinations and was told to wait until the following year. Someone asking about building a house was also told (on July 5, 2013) to wait a year (house building usually only takes place during the dry season):


471 Q: xxx came to ask about building a house. He wants to do it this year or should he wait? A: He should wait until next year.

However, this temporal distinction risks masking the way that short-term issues can become persistent. In 1985 one of the first divination sessions I was allowed to record concerned the ill health of the client’s daughter. She recovered from the malaria that was then the immediate issue, but sadly in 2022 her chronic eczema persists. With hindsight (retrodiction) we can see that what were then immediate short-term problems have now become long-term ones. The question that is never asked is the general one: will I resolve this problem? A negative answer would deny hope. There is always hope, so the question is not asked. A senior diviner discussed these issues with me (in the pauses during his joint consultation with another diviner while we waited for the spiders to emerge and provide their answers). He was clear that when divining about an illness one does not ask ‘will this person die?’ The flow of our conversation turned him into a determinist (a case of what Baruch Fischhoff and Ruth Beyth11 would call ‘creeping determinism’: retrospectively ascribing inevitability to events). For him ‘divination does not lie,’ so if it says ‘someone will die’ then they will die.12 But he is not a hard-nosed determinist (except when discussion with an anthropologist makes him so), holding that if you do not ask the question then the possibility of recovery remains. On that view, the divinatory process seems to fix the outcome: to force an issue rules out options which would have been left open by avoidance of the question. In the example of this illness, recovery remained an option as long as it was not ruled out by asking about it.
In other words, implicit in the question is the ‘horizon of expectation’ (Reinhart Koselleck13), which encompasses possible answers: the options put to divination enable us to sketch the bounds of possibility—the sense of what is possible that is demonstrably shared by the speakers, since the alternatives they propose are accepted as viable—they are literally unquestioned and hence come to form the backdrop to the consultation, its ‘horizon of expectation.’ Some of the range of possibilities considered can be seen in the responses to missing livestock:

630 (05/12/2015) Q: Her pig is lost. Will she find it or not? A: Yes she will find it.
631 (15/12/2015) Q: His pig is lost. Will he find it or not? A: No he won’t.
632 (15/12/2015) Q: Her goat is lost. Will she find it or not? A: Yes she will find it, it will come back by itself.


The final part of the response to 632 is likely to be the diviner making sense of otherwise contradictory responses: if the goat returns by itself then in one sense she will not actively find it, but in another sense, of course, she will. The overall pattern of topics recorded for me over more than five years14 demonstrates the key recurrent uncertainties that lead people in rural Cameroon to consult a diviner. By and large these issues will not surprise those familiar with the literature on West and Central African societies. Table E.2, showing the issues addressed by clients’ questions, enables us to see what a Mambila set of emphases may look like. The answers recorded also show how Mambila people in the early twenty-first century commonly respond to trouble (most frequently by administering the sùàgà sacrificial oath). Table E.2 categorises the issues about which more than ten questions were posed. It includes 645 out of a total of 671 documented cases; the remainder concerned issues raised fewer than ten times. Part of the reason that illness and health are so dominant is that in such cases an important issue in the local cultural tradition is to establish whether a particular case of illness is ‘from God’ (in other words has ‘just happened’ or is ‘naturally occurring’) or whether it is ‘from a person’ (in other words if the ultimate cause is the malign actions of a witch). In the latter case there is little point treating the patient using Western medicine since it would be treating the symptoms but not the cause. Only once the witchcraft has been addressed is it worth addressing the physical symptoms. There is a caveat that must be noted at this point: although to ask whether treatment should be sought in a dispensary/hospital or by a traditional healer is essentially to ask whether the illness was caused by witchcraft, there is another important factor at play: that of cost.
The cost of treatment often influences treatment decisions.15 Traditional healers tend to be not only cheaper than dispensaries but also more understanding about late payments than are the local dispensaries, let alone hospitals that have provided expensive surgical procedures. So a question about where to seek treatment may reflect concerns about its cost, rather than a belief that the illness was caused by witchcraft. However, sixteen of the recorded questions were explicitly about types of causation, so financial concern was clearly not those clients’ first concern.

Table E.2 Subjects of questions asked of one diviner, 2011–2016.

Category            n     %
Money               16    2%
Travel              16    2%
Farming             18    3%
Housing             27    4%
General problems    43    6%
Theft               47    7%
Work                66    10%
Ritual planning     103   15%
Marriage/family     149   22%
Illness/health      160   24%
Total               645   95%

Returning to the avoidance of asking whether someone might die, this aspect of question framing suggests a Mambila version of Schrödinger’s cat, in which the cat sits in a limbo of superposed possibilities, being both dead and alive, until the act of inspection ‘collapses the wave-packet,’ bringing about one state or the other. In the Mambila version the superposed state would end not when the box was opened but when the possibility of death was asked of and answered by spider divination. A Mambila diviner might say in such a case that the spider had opened the box behind the scenes and reported what it found. A Mambila version of the thought experiment might well be called Schrödinger’s spider. The notion that divination affects outcomes has interesting and perplexing implications. It allows Mambila clients some say in how far their future is fixed. By not divining, or by divining but choosing their questions so as to leave positive outcomes in play, they keep their futures more open: for Mambila clients the play of anticipation as it shades into longer-term prediction is controlled and managed, even if not described as such. I am struck by the variety of opinion in Mambila and by variations in how it is expressed. Although the questions reveal the bounds of possibility being considered it is important to remain alert to the questions not being asked, not because they are unthinkable but because they are unpalatable. The UK Chancellor of the Exchequer was in the news in October 2017 for not making contingency plans for a hard Brexit.
With apologies for a double negative, it was probably not that he could not anticipate that outcome but more likely that he did not want to make that possibility seem more real by planning for it (drafting this little more than two years later, the irony was already apparent). We make our futures not only by the choices we make but, before then, by the outcomes we contemplate, by the patterns of our multiple anticipations. For this reason the way questions are framed, the outcomes they include or exclude, and the background circumstances they take for granted or reject out of hand all deserve further study. There is an increasing amount of scholarship on differing conceptualisations of fate and destiny. This includes work by Alice Elliot16 and Laura Menin17 on how Moroccan women think about love and marital choice in the light of Islamic revivalist critiques of Western modernity (see also their joint work on destiny, which introduces the idea of ‘malleable fixity’18). As already mentioned, Rebecca Bryant and Daniel Knight’s book19 on the anthropology of the future includes a chapter on fate or destiny. Many of the examples discussed in the


book are about forms of ‘stuckness’: peace will come to Cyprus ‘one day,’ but no one knows when; austerity in Greece will end as surely as the United Kingdom will achieve Brexit, but there is no consensus about when and how this will be.20 Living in and with uncertainty makes futures uncanny (as Bryant and Knight put it), and if ‘the future is written,’ then it is not clear who can read the text or whether it can be revised (see my opening comments). If diviners are seen as those who can read ‘what is written,’ then how can clients be sure they have read correctly, let alone understood what they have read? And the text that they read may in fact be a palimpsest, so a ‘reading’ may not help those who must decide how to act without interpretative work by all involved. As Samuli Schielke puts it, “this ‘malleable fixity’ (Elliot and Menin 2018) has made destiny an extraordinarily helpful idea for humans to find their way in a life that they live but do not own.”21

The study of divination can provide privileged insights into the way people consider how to deal with the multiple and intertwining problems posed by everyday life. The questions asked, and the ways in which alternatives are framed, reveal the bounds of conceptual possibility in terms of which people reason when deciding what their actions should be. The clients may not have elaborated some of these ideas as much as the diviners, who are often the local intellectuals (in other words, there may not be a common ontology connecting the participants; see the following discussion). But when they meet in a divinatory consultation, diviner and client need to connect somehow or other. This may be through a (somewhat) shared understanding or by the divinatory praxis acting as a boundary object, which the diviners and clients use to talk across or through.

DIVINATION: BOUNDARY OBJECT IN THE TRADING ZONE

As has just been said, a divinatory consultation acts as a boundary object connecting diviners and their clients.
The term was introduced by Susan Leigh Star.22 On her account ‘boundary objects are those objects that are plastic enough to be adaptable across multiple viewpoints, yet maintain continuity of identity.’23 Another helpful definition of a boundary object is ‘an entity shared by several different communities but viewed or used differently by each of them.’24 Boundary objects link different groups of actors who are engaged in common or interconnected tasks. They may be used by different groups in different ways: for example, Star contrasts the radically different uses to which ‘the same’ map may be put. The utility of boundary objects follows from their not being at the forefront of attention. Boundary objects are part of the infrastructure, taken for granted and invisible: we see through them rather


than focusing on them, at least when they are working well. Boundary objects connect and simultaneously disconnect: they hold connections at a remove, in abeyance. In other words, the divinatory procedures (cards of many forms, tea leaves, hexagrams, charts, Mambila spiders, etc.) act as connectors, linking clients, often with immediate, pressing concerns, with the diviners and the ontologies associated with the type of divination being used. So clients can consult divination without necessarily being party to any ontologies that the diviners may associate with it. The procedures and technicalities of divinatory praxis may feature in divinatory consultations as boundary objects, having very different resonances for clients and diviners, who nonetheless are able to continue their interactions, in part through their mutual orientation to the boundary object (divination) being undertaken.

Such an approach works very well with Peter Galison’s idea of a ‘trading zone’25 to account for the complex development of science despite internal incommensurability between sub-groups, who manage to get along (more or less) despite their differences. As Galison put it, ‘trading partners can hammer out a local coordination despite vast global differences.’26 This reminds us that agreement, mutuality, and other factors are not necessary for two or more groups to be able to live together and coordinate actions, even if they do not agree on their understandings of those actions. On his account, trading zones can flourish without common beliefs, ontologies, or full translation, as long as the groups concerned can manage coordination without them. A minimal agreement that these divinatory actions will provide answers is sufficient.
As Galison might say, differences of opinion and lack of consensus can be managed by the use of boundary objects in the trading zone, where technologies such as divination allow people to meet and act together without going into detail or discussing whether they share an ontology.

CONCLUSIONS

A final thought on the role of the divinatory process is that, as well as playing a role as boundary object, it can be seen as a technology of ‘uncertainty absorption.’ Niklas Luhmann uses this idea when discussing processes of organisational decision-making: the way that the background to one decision is effaced and made irrelevant when considering subsequent decisions.27 As for organisations, so too for ordinary people. By providing an external warrant for a decision, divination gives people a way to draw a conceptual line in the sand and move on without endlessly going over the decision. Whatever form of divination is undertaken, the process of consultation acts as a boundary object bridging the different parties who meet (sometimes in a trading zone) to discuss the problem at issue. A form of this approach may


be found in Tobias Kelly’s afterword to the collection Of Doubt and Proof: Ritual and Legal Practices of Judgment.28 Kelly builds on Hannah Arendt’s much earlier work The Human Condition29 (originally published 1958). As Kelly says, ‘For Arendt, social life is a gamble, whose outcomes are always unknown, but we have no choice but to play the odds. Predicting the future is notoriously difficult, as soothsayers, economists, weather forecasters and other prophets know only too well, but we are under pressure to do so nevertheless.’30 The pressured encounters that take place within the terms of that ‘nevertheless’ include the sorts of divinatory consultations I have been considering here. The divinatory results that are obtained, using the process as a boundary object, enable clients to leave the consultation feeling that their issues have been (more or less) addressed; otherwise they will not return. For too long the primary divinatory encounter for academics such as anthropologists has been that between researcher and diviner (and between diviner and divination procedure). It is time to recognise that the encounter that matters is that between diviner and client.

ACKNOWLEDGMENTS

Parts of this have appeared in other work at greater length.31 Some of the text on boundary objects is from a section of a book that will appear late in 2022.32 My thanks to the convenors of the original conference for their encouragement to write this up.

NOTES

1. Mary Douglas, ‘Dealing with Uncertainty,’ Ethical Perspectives, 8, no. 3 (2001): 148.
2. Ibid.
3. Sandra Calkins, Who Knows Tomorrow? Uncertainty in North-Eastern Sudan (Oxford: Berghahn, 2016), 2.
4. Ibid.
5. Rebecca Bryant and Daniel M. Knight, The Anthropology of the Future (Cambridge: Cambridge University Press, 2019).
6. See David Zeitlyn, Mambila Divination: Framing Questions, Constructing Answers (London: Routledge, 2020).
7. The shrub is called mvu ŋgam (Dacryodes sp.).
Yamba leaves were cut from Dacryodes edulis (which is the reclassification of Pachylobus edulis given by Gebauer) (Paul Gebauer, Spider Divination in the Cameroons [Milwaukee: Milwaukee Public Museum, 1964], 35). Leiderer identified the leaves used by the Bafia as coming from the tree Oddoniodendron micranthum (Rosemarie Leiderer, Le Médicine


Traditionelle chez les Bekpak (Bafia) du Cameroun [Berlin: D. Reimer, 1982], I.125). The Wuli use only three cards cut from the Euphorbicae Bridelia sp. (tsətsə in Wuli) according to Baeke (p.c.).
8. Gebauer, Spider Divination in the Cameroons.
9. The set of 161 cards from the Wiya tribe donated to the Pitt Rivers Museum, Oxford, by M.D.W. Jeffreys are similar to Gebauer’s Yamba ones, even in the detailed iconography. Some of these cards have a bell-like outline but otherwise they are all cut to the same pattern. The iconography of these examples is different from, and more complex than, that used on Mambila cards.
10. Especially in Gebauer, Spider Divination in the Cameroons; Leiderer, Le Médicine Traditionelle chez les Bekpak (Bafia) du Cameroun, volume 1, chapter 4; Idelette Dugast, Monographie de la Tribu des Ndiki (Banen du Cameroun). 2: Vie Sociale et Familiale, volume LXIII, Travaux et Mémoires de l’Institute d’Ethnologie (Paris: Institute d’Ethnologie, 1959), 46–52; and Isaac Paré, ‘L’Araignée Divinatrice,’ Etudes Camerounaises (1956): 53–54. Some early sources are James H.H. Pollock, Bum Assessment Report, Rhodes House MSS Afr s 797 (1927); Marguerite Dellenbach, ‘Un Nouveau Jeu de Feuilles Divinatoires (Attirail de Magie Provenant du Cameroun),’ Revue Anthropologique, 42 (1932); André Ménard and Mélanie Ménard-King, ‘Le rituel de divination de l’araignée (le N’Gam) chez les Bafia du Cameroun [Causerie prononcée le 13 décembre 1935 à la Société d’Etudes camerounaises],’ Bulletin Images et Mémoires, 38 (2013), http://www.imagesetmemoires.com/doc/Articles/B38_am_divination_araignee_red.pdf; Mélanie Ménard-King, ‘Le rituel de divination de l’araignée (le N’Gam) chez les Bafia du Cameroun [Causerie prononcée le 13 décembre 1935 à la Société d’Etudes camerounaises],’ Bulletin Images et Mémoires, 32 (2012), http://www.imagesetmemoires.com/doc/Articles/B32collectionmenard.pdf (both the latter refer to material discussed in Yaoundé in 1935); Pierre Cournaire, ‘Notes Sommaires sur les pratiques Divinatoires des Populations de la Circonscription de Yaoundé,’ Journal de la Societé des Africanistes, 6, no. 1 (1936).
11. Baruch Fischhoff and Ruth Beyth, ‘I Knew It Would Happen: Remembered Probabilities of Once-Future Things,’ Organizational Behavior and Human Performance, 13, no. 1 (1975): 1–16.
12. This resembles Martin Holbraad’s account of Ifá divination in Cuba: Martin Holbraad, ‘Truth Beyond Doubt: Ifá Oracles in Havana,’ HAU: Journal of Ethnographic Theory, 2, no. 1 (2012): 81–109. The Mambila twist is that they seem to seek to avoid the fixity described.
13. Reinhart Koselleck, Futures Past: On the Semantics of Historical Time, translated by Keith Tribe (New York: Columbia University Press, 2004 [1985]), 255.
14. Principally 2011 to 2016, but also including some records from 2008, with a gap during 2015.
15. Note that I have classified as financial the question ‘If I tell my child to take me to hospital will they take me?’ since the lack of funds is the reason older people say children give for refusing to take them!
16. Alice Elliot, ‘The Makeup of Destiny: Predestination and the Labor of Hope in a Moroccan Emigrant Town,’ American Ethnologist, 43, no. 3 (2016).


17. Laura Menin, ‘The Impasse of Modernity: Personal Agency, Divine Destiny, and the Unpredictability of Intimate Relationships in Morocco,’ Journal of the Royal Anthropological Institute, 21, no. 4 (2015); Laura Menin, “‘Destiny is Written by God’: Islamic Predestination, Responsibility, and Transcendence in Central Morocco,” Journal of the Royal Anthropological Institute, 26, no. 3 (2020).
18. Alice Elliot and Laura Menin, ‘For an Anthropology of Destiny,’ HAU: Journal of Ethnographic Theory, 8, no. 1–2 (2018).
19. Bryant and Knight, The Anthropology of the Future.
20. This sentence was first drafted in 2019 and has been consciously left unchanged.
21. Samuli Schielke, ‘Destiny as a Relationship,’ HAU: Journal of Ethnographic Theory, 8, no. 1–2 (2018).
22. Susan Leigh Star, ‘The Structure of Ill-Structured Solutions: Boundary Objects and Heterogeneous Distributed Problem Solving,’ in Distributed Artificial Intelligence, edited by Les Gasser and Michael N. Huhns (San Francisco: Morgan Kaufmann, 1989); Susan Leigh Star, ‘This is Not a Boundary Object: Reflections on the Origin of a Concept,’ Science, Technology, & Human Values, 35, no. 5 (2010); as well as Susan Leigh Star and James R. Griesemer, “Institutional Ecology, ‘Translations’ and Boundary Objects: Amateurs and Professionals in Berkeley’s Museum of Vertebrate Zoology, 1907–39,” Social Studies of Science, 19, no. 3 (1989).
23. Star, ‘The Structure of Ill-Structured Solutions,’ 37.
24. Note that this quote is commonly attributed to Star, but I cannot trace an original source, which is odd in the age of the internet.
25. Peter Galison, Image and Logic: A Material Culture of Microphysics (Chicago and London: University of Chicago Press, 1997).
26. Ibid., 783 (his emphasis).
27. Niklas Luhmann, ‘The Paradox of Decision Making,’ in Niklas Luhmann and Organization Studies, edited by David Seidl and Kai Helge Becker (The Hague: Liber, 2005). See also David Seidl and Kai Helge Becker, Advances in Organization Studies, volume 14 (Copenhagen: Copenhagen Business School Press, 2006); and David Seidl and Kai Helge Becker, ‘Organizations as Distinction Generating and Processing Systems: Niklas Luhmann’s Contribution to Organization Studies,’ Organization, 13, no. 1 (2006).
28. Tobias Kelly, ‘Afterword,’ in Of Doubt and Proof: Ritual and Legal Practices of Judgment, edited by Daniela Berti, Anthony Good, and Gilles Tarabout (Farnham: Ashgate Publishing, 2015).
29. Hannah Arendt, The Human Condition (Chicago: University of Chicago Press, 1998 [1958]).
30. Kelly, ‘Afterword,’ 186.
31. Zeitlyn, Mambila Divination: Framing Questions, Constructing Answers; David Zeitlyn, ‘Divination and Ontologies: A Reflection,’ Social Analysis, 65, no. 2 (2021).
32. David Zeitlyn, An Anthropological Toolkit: Sixty Useful Concepts (Oxford: Berghahn, 2022).

Index

abductive logic, 30, 34n25, 263
abstraction, 4, 21, 43, 77–78, 139n30, 210, 236, 248n14, 260; symbolic abstraction, 110–16
acceleration, xii–xiii, 74–75, 81–82, 183
accident, xi–xiv, 90–91, 219–32, 256
accidentals, 144, 147–49
acoustics, 263; evental acoustics, 263–64
actuality, xvii, 102n36
aesthetics, 19–20, 137, 146–48, 155, 223
affordance, 57–58, 91–92, 96, 102n26, 138n12
a-formalism, 19
Agamben, Giorgio, xxv, 244–45, 249n26, 249n31, 252, 264–65, 265n6
agency, xv–xvi, 63, 116, 132, 235; distributed agency, 131, 161n86, 169; agent, xii, xvi, xviii–xiv, 6, 22, 57, 61, 63, 72, 77–78, 91, 137
aggregation, xix, 28, 45, 56; de-aggregation, xix; re-aggregation, xix
aleatorics, 75–76
alethic, 164–71, 177
algorithm, xvi–xviii, xix–xxi, xxii–xxiii, xxv, 14, 20–22, 25, 27–33, 35, 39, 42–44, 46–47, 50n20, 50n21, 53–57, 57–61, 63, 99, 113, 154–55, 160n77, 238, 242, 251–56, 264, 265n3, 266n9; deep-learning algorithms, 28–31; recommendation algorithms, 31
allagmatics, xxiv, 89–105, 177n2
alterity, xxiii, 3–17, 126–31, 136–37, 156, 248n13
Ananke, 10–14, 16, 18n26, 101n11, 163–79
anthropocentric, xxii, 98, 232
anthropogenesis, 90, 183
anthropomorphism, xxii, 164
architectonics, xxiv–xxv, 18n32, 183–99
architecture, xxiv, 19, 89–105, 145–48, 163–79; Brutalist architecture, 19–20, 32; learning architecture/s, xix–xxi, 30; networks architecture, 54–55
Aristotle, xxviin12, xxviin24, 23, n190
art, xiii, 9, 38, 107–10, 113–14, 118–19, 125, 127, 130–35, 136, 143–61, 171, 223; artistic technology, xxiv, 139n30, 143, 144, 150, 152, 155; artistic technique, 143, 175
artificial intelligence (AI), 23–24, 31, 40, 253
automation, xviii, xix, xxii–xxiii, 21–33, 56, 61, 71–85, 235; automaticity, xviii, xxi; automatization, 111; automatism, xxi, 15, 72–74, 82, 112, 174; Automaton, 15, 26, 153; cognitive Automaton, 71–83
A New Kind of Fate, 153–55
A New Kind of Science, 153
A Thousand Plateaus, 93, 102n36, 261
Bachelard, Gaston, xv, 31, 146
Bateson, Gregory, xvi
Baudrillard, Jean, 18n45, 74–75, 80
Beautiful Data, 155
behaviour, xviii–xx, xxi, 21, 27, 36, 39, 59–60, 62, 72, 75, 81–82, 95, 121, 136, 150, 153, 203, 208–9, 214, 231, 241–42, 255, 262
being, xxviiin32, 5, 190, 220, 225, 231
Bell Labs, 127, 131–33
Bergson, Henri, 91, 123n29, 157n12, 184, 195, 241
biopolitics, 235, 247n1; negative biopolitics, 235, 244, 247n1; positive biopolitics, 235, 244, 247n1
Brecht, George, xxiv, 143–61
binarism, 113, 272–76; binary, 20–21, 31, 40, 42, 46, 153, 154, 164, 222, 272, 274, 276
biometrics, 58–59, 157n9
Bitcoin, 54, 57
bits, 9, 36, 42, 45, 170
blockchain, xxii, xxiii, 48n2, 53–70
Boltzmann, Ludwig, 10, 189–95, 198n42, 203–26, 215n11
Bohr, Niels, xvii, 49n8, 85n18, 144, 157n7, 211
Borst, Arno, 111–14, 123n19
Bratton, Benjamin, xxv, 89, 235–49
Bride Stripped Bare by her Bachelors, Even (The Large Glass), 147–48
Bryant, Rebecca, 272, 282, 283
Cage, John, xvii, xxiv, 38, 49n11, 125–41, 150, 163n73
Calculating the Universe, 153–54
calculation, xiv, 4, 11, 20, 21, 27, 29–30, 36, 42, 49n15, 50n17, 110–11, 114, 115, 211
Cameroon, 271–87
capitalism, xiv, 14, 20, 71, 75, 135, 183, 226, 248n13; financial capitalism, 76–77; techno-capitalism, 113, 242
catastrophe, xi, xii, xxii, 3–4, 82–83, 226
Ce qui arrive, xiii
cellular automata, 144, 153–55
Chaitin, Gregory, xix, 44
chance, xi, xiv, xvi, 11, 15, 16, 18n36, 94, 118, 130, 143, 148, 149, 155, 163, 204, 211; chance operations, xiv, 147–48, 154
change, xv, xix, xxiv, 13, 14, 21, 25, 36, 39, 42, 61, 63, 72, 73, 79, 82–83, 94, 108, 119, 129, 136, 149, 153, 155, 156, 164, 167, 168, 171, 175–76, 177, 183, 187, 188, 190, 193, 201, 204, 207, 208, 209, 210, 212, 213, 223, 226, 230, 235, 239–40, 242, 253, 255, 268n35, 272, 276; microtemporal change, xv
chaos, xvii, 21, 37, 47, 71–85, 205–6, 208–12; sur-chaos, xvii
Chun, Wendy Hui Kyong, 55, 62, 64, 95, 237, 242–43
cognition, 27–28, 32, 81, 184, 201, 208, 212, 242; non-conscious cognition, 26–30
Colebrook, Claire, 90
collapse, 76, 82, 84, 208–9, 282
colonisation, 20
commensurability, xiv
complexity, xiii, xviii, xxii, 14, 19, 27, 37, 39, 40–42, 49n13, 55, 72, 77, 81, 83–84, 136, 154, 156, 160n67, 167, 183, 213, 257
compression, 20–21, 25, 32; algorithmic compression, 20
computation, xvi–xviii, xix–xx, xxi, xxiii, xxiv, 6, 19–34, 35–51, 81, 100, 107–24, 154, 253–55, 264, 265n3; computational, 19–34, 35–51, 107–24, 154, 264, 265n3; computational turn, 107; computing system, xix; computus, 111, 114; material computation, 35–51; pre-modern computation, 113
communication, xvii, xx, 22–28, 77, 91, 127, 131, 150, 212, 266n9; theory of communication, 3–18, 266n9
concealment, 23–24
concretisation, 111; re-concretisation, xxiii, 72, 77
consolidation, 82–83
constraints, 96, 144, 163–79
consciousness, xxiv, 13, 25, 62, 80, 81–84, 99, 112, 118, 157n12
contagion, xix; network contagion, xix–xxii
contingency, xii, xiii, xiv, xv–xix, xx, 11, 14, 16, 31, 32, 35–51, 165–68, 228, 231, 236, 247, 257, 282; contingent, xi–xii, xv–xix, xxi, xxiii, xxiv, 14, 15, 32, 37, 39, 41, 42, 43–44, 48, 97, 116, 128, 130–35, 136, 152, 164, 173, 188, 227, 258, 264–65, 268n42
control, xii, xvii, xviii, xxiii, 10, 22, 26, 29–30, 35, 45, 53–56, 57–65, 68n61, 82–83, 112, 127–28, 130, 131, 134, 231
Corbusier, Le, 145, 154, 156
correlation, 30, 31, 32, 95, 153, 172, 207, 209, 265n4
correlationism, xxviiin42
COVID-19, xxiii, xxv, 72, 77–79, 235–50
crisis, xii, xxiii, 20, 84, 85n20, 238, 242, 248n13
Critique of Pure Reason, 184
cryptocurrency, 54, 61
cryptography, 53–54, 57, 63
cybernetics, xix, 3, 10–15, 16, 18n43, 22–25, 26, 27, 30, 40, 43, 131, 133–34, 155, 163
Cyberpunk 2077, 40–45


Darwin, Charles, 74, 94, 164, 191–95, 203, 244
data, 3, 20–21, 22, 23, 24, 27–32, 39, 42, 47, 48, 53–54, 56, 58, 59, 61, 63, 71–72, 75, 80, 81, 95, 99, 100, 113, 155, 157n9, 169, 202, 213–14, 235, 243, 247, 249n26, 252, 266n13, 279; dataism, xviii, 100; #datapolitik, xxixn50, 57–61, 63; data entropy, xix–xx, xxi
dataveillance, 56–57, 235
Deacon, Terrence, 167–68
decentralisation, 53–54, 64, 66n20; decentralised autonomous organisations (DAOs), 55, 60–61, 64
decisionism, xxiii, 21, 33, 101n23, 105n81; technological decisionism, 21, 105n81
Deleuze, Gilles, xvi, xvii, xxv, 25–27, 56, 92–94, 97, 101n12, 102n36, 113, 134, 163, 168, 171, 223, 237, 241, 252, 254, 257–59, 261–62, 267n31, 268n35
Delirious New York, 97–98
dematerialisation (also de-materialisation), 107–24
demiurge, 3, 6
Derrida, Jacques, xxv, xxvin7, 95, 151, 152, 196n16, 219–32
Descartes, René, 4, 265n7
Desargues, Girard, 147–48
determinacy, 3, 11, 37, 251, 271–72, 279
determination, xxiii, xxvi, 22, 30, 44, 71, 73–74, 76, 78, 80, 82, 84, 151, 172, 177, 188, 190, 191, 255, 258; determination-indetermination, deterministic, xxiii, xxv, 10–11, 15, 36–37, 42, 56, 71, 76, 80, 255–56, 264; over-determination, xvii, xxii, xxvi, 39
determinism, xviii, xxiii, 10, 11, 35, 36–37, 48n2, 53, 72–77, 81, 91, 153, 163, 165, 206, 208, 272, 280; engendered determinism, xviii
deviation, xiv
differentiation, 258, 260, 264, 267n31, 267n32
Difference and Repetition, 249n19, 258, 262
digital, xvii–xviii, xxii–xxiii, xxiv, xxv–xxvi, 3, 20, 28, 35–51, 53–70, 107, 109, 112–14, 116, 118, 120, 137, 138n8, 144, 154, 201–17, 253, 255, 266n9; digital age, 201–17; digital apologia, 35; meta-digital, 20–22, 30
disintermediation, 54–55, 58, 63
disorder, xiv, xvii, 11–13, 203, 205
displacement, 35, 189; entropic displacement, 189, 191, 193–95
disruption, xxiii, 13, 72, 78, 83–84, 151
divination, xxvi, 271–87; Mambila divination, xxvi, 271–87; spider divination, xxvi, 271–87
dogma, xxiv; socio-scientific dogma, xxiv
doxa, 25–26
Duchamp, Marcel, xvii, xxiv, 139n30, 143–61
dynamics, xvii, 10, 11, 29, 37, 74, 77, 80, 81, 82, 144, 153, 156, 206, 210, 212; non-linear dynamics, xvii, xxviiin45
Ecology, 92, 96, 100, 220; algorithmic ecologies, xx–xxi
ego, 5–7, 99
Eigenpraxis, xvi
empiricism, 23, 104n58, 238; empirical, xxi, 22, 51n24, 255, 263, 268n35
energy, 120, 149, 164, 167–70, 174, 176, 177, 183–99; energy dissipation, xxv, 185–89, 191–92
enframe, 108; enframing, 108, 226
Enlightenment, 45, 48, 250n33; post-Enlightenment, 8, 51n26, 250n33
entropy, xix, xxv, 11–13, 19, 119, 165, 167, 170, 183–99, 202–3, 207
environment, xvi, xxiv, 38, 40, 47, 62, 74, 75, 77, 94–95, 96, 100, 102n26, 126, 132, 169, 175, 193, 263
epiphylogenesis, 95
epistemology, xvi, xxii, 5, 23, 25, 27, 33, 91, 95, 100, 242, 243, 249n26; epistemological, xviii, xx, 31, 38, 51n24, 115, 126, 144, 165, 241, 243, 246, 274
error, xv, xx, 28, 37, 90, 99, 108, 203, 255
Essays on Radical Empiricism, 94
essence, 24–25, 129, 219–34
essergy, 193
event score, 150, 152
evolution, 5, 28, 39, 42, 74, 81–82, 90, 94, 144, 153, 164, 169–70, 192, 195, 203, 207–9, 213, 255, 257
exhaustion, 79
experience, xvii, xxv, 7–8, 9, 19, 24, 27, 38, 41, 43, 78, 82, 94, 96, 99, 107, 113, 116, 117–19, 126, 132, 165, 174, 184, 236, 241–43, 251, 260, 263–64, 271
experiment, 28, 207, 258; experimental experience, 263
Experiments in Art and Technology (E.A.T.), 131–33, 140n45
exteriority, xiv, 96, 156
extraction, 20, 33, 228
fallibility, 236, 247
Farocki, Harun, 114, 116–18
Fazi, Beatrice, xv–xvii, xxiii, 36, 43–44, 48
Fichte, Johann Gottlieb, 5, 7–8
finitude, 26, 183, 188, 191
Floridi, Luciano, xxiii, 3–18
Flusser, Vilém, 118, 119–20
Foucault, Michel, xiv, 55, 56, 62, 64, 244, 247n1
Fragment on Machines, 79
Freedom, xxi, 6, 7, 10, 11, 56, 58, 71, 73, 75, 81, 123n29, 190–91, 195, 209–12


From Being to Becoming, 204
function, xvi, xxi, 13, 16, 25, 28, 76, 80, 102n36, 113, 144, 151, 169, 174–75, 186, 208–9, 214n5, 252, 255, 268n42; biological function, 256
functionalism, 19
General Data Protection Regulation (GDPR), 58
generativity, 36
Global North, 71, 78–79, 82–83
God, xxii, 3–12, 16, 62, 71–74, 96, 112, 281
Gödel, Kurt, xvi, xviii, 50n19, 251, 265n3
governance, xiii–xiv, xvii, xxiii, 20, 26, 29, 30, 53–70, 235, 243; neoliberal governance, xiii–xiv
governmentality, xxiii, 53–70, 247; algorithmic governmentality, 14, 53–70; blockchain governmentality, 53–70
Guattari, Félix, xxiv, 25–26, 56, 92–98, 102n36, 103n38, 113, 134, 156, 163, 174, 252, 261, 262
habit, xviii, 55, 76, 94, 139n21, 235–50; habitus, 242, 246
hapticality, 252, 264–65
Hayles, Katherine, xix, 27–28
heat death, xxv, 12, 18n32, 165, 176, 183–99
Hegel, Georg Wilhelm Friedrich, xvi, xxv, 152, 219–26, 231–32
Heidegger, Martin, xxv, xxviin17, 4, 22–27, 51n27, 107–10, 125, 155, 157n11, 188, 196n16, 221, 223–24, 225–26, 231, 244
Heisenberg, Werner, 74, 109, 149
Heller-Roazen, Daniel, 115
Helmholtz, Hermann Von, 186–88
heterogenesis, 252, 265; differential heterogenesis, xxv, 252, 261–64
hikikomori, 78
Hui, Yuk, xv–xvii, xix, xxiii, 42, 48, 107–9, 120, 134
humanism, 72–73, 99, 120
Hume, David, 139n21, 235–50
Idea, 184–99, 258–59; regulative ideas, 183–99
idealism, 5, 104n58, 224; German idealism, 5–9; transcendental idealism, 224
ideology, 23, 36, 48, 50n23, 51n27, 64, 74; digital ideology, xxii–xxiii, 45
identity, 38, 43–45, 53–70, 90, 156, 197n40, 224–25, 228, 232, 283; identity management, 58–60; self-sovereign identity (SSI), 53–70
Ihde, Don, 126–29, 132–34, 139n20
Images of the World and the Inscription of War, 117
immanence, 27; immanent, 9, 29, 91, 99, 101n12, 165–66, 169, 176, 258, 268n41
immaterialisation, 107–24
immediation, 92
immutability, 53–54, 58–70
impredicativity, 91
incomputability, xix–xx, 44–45
indeterminacy, xvii, xxii–xxvi, 3, 10–11, 14–16, 23, 29, 30–32, 37, 43–44, 71–82, 89, 49n15, 144, 149, 152, 155–56, 163, 168, 174–76, 183, 206, 247, 251–69, 271–72, 279; indeterministic, 11, 78, 81, 259
indetermination, xix, 10, 14–17, 31, 76, 91, 190–91, 195–96, 236; indeterminate, xi, xvi, xvii–xviii, xx, xxiii, xxvi, 4, 31, 37, 42, 73–74, 82, 89, 108, 31, 176–77, 190–91, 195, 221, 251, 254, 255, 263, 268n42
Inextinguishable Fire, 116, 118
individuation, xvi, 91–95, 98, 101, 156, 166–67, 169–73, 176, 236, 258, 261
inevitability, 11, 36, 220, 280
information, xii, xix–xx, xxii–xxiv, xxv, 3–18, 20–21, 22, 25–26, 27–28, 31–33, 37, 41, 57–58, 60, 76–80, 83, 89, 94, 95–98, 105n74, 119–20, 133–35, 165, 169–71, 174–79, 189, 203–4, 213–14, 238, 251, 266n9; information technologies, xxii, 3, 6, 8–10, 14; information theory, xix, 3–18, 133–35
infosphere, 9, 12, 15, 83
infoware, xviii
infrastructure, 20–22, 243, 283
Infrastructural Brutalism, 20
instability, 40, 48, 72, 77, 79, 136, 147, 204, 205, 208–10, 228
instrument, xxiv, 24, 26–27, 51n27, 113, 115, 123n29; musical instrument, xxiv, 125–41, 150, 175
instrumentalism, xxiii, 45, 47–48, 51n27
instrumentality, xxiii, 19–34, 41, 114, 125
interaction, xiv–xvi, 27, 36, 42, 55, 81, 92, 120, 131, 136, 138n8, 160n67, 164, 168, 209–10, 254; human-computer interaction, 138n8
Interspecies Smalltalk, 128, 135–36
intermedium, 147
intermediary, 55, 63–64, 143
Invisible Committee, 251, 265
irreason, xvii
irreversibility, xxv, 64, 183, 201–17
iteration, 48n2, 155, 174
Jacobi, Friedrich Heinrich, 7–8
James, William, xix, 94
Kant, Immanuel, xvi, xxiv–xxv, 5, 7–8, 10, 17n16, 92, 183, 187, 223–24, 250n33
Kauffman, Stuart, 171, 251, 256–57, 267n18, 268n42
Kirby, Vicki, 113
Klein, Jacob, 110–11, 112–13, 115
Koolhaas, Rem, xxiv, 97, 105n68
Knight, Daniel, 272, 282–83
knowledge, xiii–xiv, xvi, xviii, xx, xxii, 7–9, 10, 20, 22–24, 30, 32, 33, 48, 51, 56, 74, 91, 95, 100, 102n24, 108, 115, 145–47, 153, 155, 165, 184, 206, 210, 213, 246, 273; knowledge base, 32
Knowles, Caroline, 220–21, 226–31
labour, xxiii, 35, 71–85, 103n46, 116, 117, 229, 239n61; cognitive labour, xxiii, 76–79
Lamarck, Jean-Baptiste, 94–95
Laplace, Pierre-Simon, 73, 81, 204, 210
Laruelle, François, 27
Latour, Bruno, xiii, xxvii, 158n30
Lazzarato, Maurizio, xvii
Legba, 156
Leibniz, Gottfried Wilhelm, xxii, 8–10, 15, 18n43, 46, 51n26, 261–62
Les Immatériaux, 114, 119–22
Lewis, George E., 128, 135–37
life, xv, xviii, xxv, 3, 4, 10, 12, 20, 40, 42, 48, 54, 55, 58, 62, 68n61, 71–72, 74, 76–77, 78–79, 82, 90, 93, 97–98, 102n36, 112, 113, 116, 123n29, 125, 130, 146, 155–56, 157n3, 164, 169, 171, 183–99, 205, 213, 220, 227, 230, 236, 238, 240, 241, 242, 246–47, 251, 257, 264, 265, 272, 279, 283, 285
logos, 27, 91, 143–44, 222
Longo, Giuseppe, 183, 251, 265n4
Lotka, Alfred J., 194–95
Luhmann, Niklas, xii, 284
Lyotard, Jean-François, 8–10, 13–15, 18n20, 114, 119–21, 126
Machine, xvi, xvii–xviii, xix, 15, 19–34, 36, 40–43, 46–47, 55, 63, 72, 76, 80, 97, 108, 112, 113, 118, 123n20, 123n29, 128, 145, 175, 187, 224, 227–28, 252–56; machinic, xvi–xviii, xix, xx–xxi, xxii–xxiii, xxv, 71, 93, 102n36, 242, 259; machine learning, xix, 22, 24, 28–30, 32, 113


Malabou, Catherine, xix, xxi, xxv, 22, 24, 28–32, 113, 219–234, 248n13, 248n14 Marx, Karl, 48n2, 76, 79 materiality, xxiii, 30, 43, 114, 120 manifold, xxii, 149, 169, 184, 258–59, 261, 266n13, 267n32, 268n40; Riemannian manifold, 259, 267n30 materialisation, 76, 114; rematerialisation (also re-materialisation), 107, 114, 116–19, 121 measurement, 37, 49n8, 75, 111, 113, 193, 208–9, 211 mediation, 25, 27, 29, 31, 92, 126, 138n5, 143, 145, 153 medium, xx, 28, 98, 126, 148, 150, 154, 242, 273; time-based medium, 28 Meillassoux, Quentin, xvii Metaphysics, 8, 22–23, 24–26, 32–33, 35, 95, 108, 120–21; metaphysical, xxii, 3–6, 9, 16, 23–24, 30, 37, 66, 93, 128, 185, 224 metastability, 92 Minkowski, Hermann, 110–11 Modular, 19, 169 Monad, xxii, 3–18 Monadology, 8 Multiplicity, 54, 60, 97, 157n12, 258–59, 261–62 Nagel, Alexander, 114 necessity xviii, 11, 36, 43, 46, 49n6, 103n46, 123n29, 163–64, 165, 169, 176, 188–90, 194, 211, 225 negentropy, 11–13, 165–69, 171, 177, 193 Neofinalism, 166, 102n36 neoliberalism, 72, 82, 83; neoliberal, xiv, 74–75, 84, 227, 242 network, xvii, xix, xxi–xxii, 26, 28, 29, 37, 53–54, 55, 62–63, 80, 108–109, 111, 127, 135, 143, 145, 243 neuroscience, xviii, xxv, 81, 231 New Materialism, 107, 109, 113, 121

Newton, Isaac, xxv, 112, 183, 201–7, 258; Newtonian mechanics, 10, 36 ŋgam du, 271–87 Nietzsche, Friedrich, xi, 4–5, 94, 165, 187, 244, 248n14, 254 nihilism, 3–18 noise, xix, xx, 13–15, 19, 31, 128, 134–35, 139n33, 154, 258 notation, 110, 121, 129, 130, 131, 139n30, 259, 267n34 Nussbaum, Martha, xiv, xxviin24 n + dimensions, 144, 147, 149 observation, xii, 44, 85n20, 112–13, 118, 144, 195, 256 Oksala, Johanna, 56 Omega, xix–xx, 44 ontology, 35–36, 46, 51n24, 95, 121, 158n26, 231, 283–84 On the Mode of Existence of Technical Objects, 14, 98 Ontology of the Accident, xxv, 221, 225, 231–32 ontogenesis, 91, 144, 172, 251–69 Operaismo, 79; Operaisti, 79 oracles, 274 Order Out of Chaos, 183 Other, 8–9 pandemic, xxv, 72, 77–78, 82, 213, 235–50 Panopticon, 62 Parisi, Luciana, xxiii, 36, 42–45, 48, 49n4, 101n23, 105n81 pattern, xix–xx, 20, 51n25, 84, 154, 272, 274, 279, 281, 286n9 Perrow, Charles, xiv perspectivism, 95 petromodernity, xxv, 221, 226–32 petrochemical industry, 226, 228, 229 phenomenology, xxv, 95, 145, 235–37, 242, 248n14 piano, xxiv, 125–41; prepared piano, xxiv, 125–41

philosophy, xvi, xxv, 3, 4–10, 14, 22–23, 25–26, 30, 32–33, 35–36, 51n24, 57, 81, 107–10, 114–16, 139n20, 163, 183–84, 189, 208, 219, 220–26, 230–32, 236–50, 261, 272; analytic philosophy, 5, 9; continental philosophy, 107; European philosophy, 9; Philosophy of Information (PI), 4, 8 photography, 107, 114–18, 154 photosynthesis, 194, 198n56 Peirce, Charles Sanders, 30 Planck, Max, 204, 207 plastic, xv, xviii, xiv, xxi, xxv, 4, 90, 94, 156, 219–34, 248n14, 283 plasticity, xviii, xxiii–xxiv, xxv, 3, 16–17, 92–94, 143–45, 148–49, 152, 155, 219–34, 236, 248n14 Poincaré, Henri, xvii, 147, 203–4, 209–12 positivism, xxiii, 37, 45–48, 51n24, 99, 247n1 post-truth, 20–24, 26, 27, 32; post-truth doxa, 26 potentiality, xv–xvii power, xiii, xiv, xviii, 6, 20–21, 25, 30, 54–57, 59, 62, 64, 71–72, 78–79, 81–84, 126, 133, 164, 172–73, 185, 187, 195, 227, 231, 261; destituent power, xxv, 251–52, 264–65 precarisation, 41, 78–79, 83, 228–29, 231; precarised, 74 predictability, 16, 74, 205, 208 Prigogine, Ilya, xvii, xxv, 183, 191, 193, 201–17 probability, xvii–xviii, 3, 10–13, 44, 154, 155, 203, 205–10, 256, 266n15; probabilistic, xxiii, 4, 10–12, 16, 105n74, 155, 183, 189–90, 191, 203, 206–7, 255 Process and Reality, 260, 265n7 processing, xx, 9, 22, 24–25, 27, 29, 30, 32–33, 36, 47, 56, 78, 214; meta-processing, 24

programme, xii, xvi, xviii, xix–xxi, 19–20, 22, 27, 29, 42, 43, 47, 53–54, 60–61, 63, 71, 77, 119–20, 121, 131, 134, 136, 154, 242; programming xiii, xxi, 25, 27, 49n15, 60, 121 propagation, 174; back propagation, xix; protention, 93–94, 98, 103n43 Pythagoras, xxiv, 114–116 quantification, 9, 28–30, 189 quantum physics, 207 quasicausality, 90–96 randomness, xix–xx, 11, 15, 19–20, 23–24, 29, 31–32, 44–45, 154, 253; random, xxv, 14, 32, 44, 160n67, 251, 255–256, 264, 266n15 RankBrain, 31–32 ratio, 27, 44, 50n21, 75–76, 115–18, 210 rationalism, xxiii, 45–48; rationalisation, 56, 71, 115 Ravaisson, Félix, 237, 241 ready-made, 143–61 reason, xvii, xx, 7–8, 20–21, 23–26, 29–30, 33, 39, 46–48, 51n26, 73, 75, 77, 83, 115, 143, 166, 168, 170, 184, 208, 237–41, 244–45, 246–47, 283, 286n15; reasoning, 7, 19, 22–23, 25, 27–30, 31–33, 48, 55, 212, 237, 238, 244–45, 249n26, 265n3; instrumental reason, 19, 21–24, 26, 30; principle of abundant reason, 39; principle of sufficient reason, 46; transcendental reason, 7, 19–34, 139n20, 184, 196n16, 224 recombination, 154; recombinant, 74, 76, 80, 153 recursion, xvi; recursive xvi, xix, 25, 27, 33, 132, 134, 144; recursivity, xvi, xxii, 134 remediation, 153–161 representation, 7, 33, 57, 109, 110–11, 113, 153, 266n9, 267n34; representational, 91, 99, 148

resignation, 79; Great Resignation, 78 resonance, 96, 131–32, 209 retention, 9, 93, 95, 103n43 Revenge of the Real, xxv, 235–50 Rouvroy, Antoinette, 56, 95 Ruyer, Raymond, 102n36, 163, 166 Schizoanalytic Cartographies, 93, 96, 98 Schrödinger, Erwin, 192–93, 202, 208, 214n5, 271, 282; Schrödinger’s spider, 271–87 sensible, 8, 41, 64, 83; supersensible, 10 signal, xix–xx, 14, 15, 121, 131 semiosis, xxvin7, 151 semiocapitalism, 75–76, 80 semioproductivity, 76 sense-making, xxv, 251–52, 263, 264 Shannon, Claude, xix, xx, 3, 12–14, 28, 266n9 Simondon, Gilbert, xvi, xix, xxiii, xxiv, 4, 10, 11, 14–17, 31, 90–99, 101n11, 103n45, 104n58, 144, 145–46, 149, 152, 163, 168–79, 191, 261 smart contracts, 54, 57, 60–62 software, xviii, 42–43, 54, 55, 57, 60, 62, 128, 136–37 solutionism, 58 Spinoza, Baruch, 90, 101n12 Stiegler, Bernard, xv, xxiv, 4, 13, 90, 94–95, 98–100, 183 stochastic, xvii, xxv, 74, 251, 256, 264 statistics, 113, 246, 256; statistical, 10–11, 21, 57, 99, 113, 183, 206, 207, 210 Stengers, Isabelle, 90, 183, 260 subject, xxiv, 5, 6, 7, 20, 33, 57, 59, 63–64, 80, 94–95, 97–100, 104n58, 109, 113, 129, 136–37, 174, 214n5, 223, 235, 246 subjection, 78 supply chain, xxiii surveillance, xxiii, 14, 53, 56, 62, 135, 235 swarming, xix

Symbolic Exchange and Death, 74 synapse, 173–76; synaptic, xxi, 93, 163–79 system, xii, xvi, xvii, xxvin7, 4, 7–8, 11, 12, 13–14, 15, 16, 24, 30, 31, 37, 40, 43, 46, 57–58, 60, 62, 68n61, 74–80, 82–84, 93, 94, 95, 111, 118, 131–37, 143, 151, 153, 156, 164, 168, 170–72, 175, 184, 188–91, 193, 201, 203, 205, 208, 210–11, 213, 214n5, 223–24, 231, 252, 253, 255–56, 264, 266n9, 266n13; deterministic system, 15, 252, 255–56; nonequilibrium system, 213 technicity, 25, 30, 90, 92, 93, 97, 100, 168–69, 172–76 technique, xxv, 44, 110, 143, 175 Technics and Time, xv, 98 technocratic, 4, 235 technology, xi–xii, xiii, xiv, xv, xxiii, xxv, 14, 19–21, 22–32, 43, 45, 47–48, 51n27, 55, 57–59, 64, 80, 108–9, 118, 120, 125–37, 139n20, 139n30, 143–45, 149, 150, 151, 153–56, 169, 201, 212–13, 245, 251, 263, 266n9, 284; aural technologies, 87–181; epistemic technologies, 181–251; everyday technologies, xvii, xix, xxii, xxiv, xxv–xxvi, 50n23, 91, 125–41, 171, 188, 205, 235–50, 245–46; social-digital technologies, xxii, 1–87; spatial technologies, 87–181; visual technologies, xxii, 87–181 teleology, xvi, 20 The Accident of Art, xiii The Future of Hegel, xxv, 219–25, 232 The New Wounded, xxv, 221, 223, 225, 228, 232 thermodynamics, xxv–xxvi, 10–12, 91, 119, 149, 167, 183–99, 201–17, 248n14; second law of

thermodynamics, 12, 149, 165, 167–69, 191, 195, 198n56, 202–7 Thomson, William, 185–88 Three Gap Events, 151–52 Three Lamp Events, 152, 154 Three Standard Stoppages, 147–48 time, xi, xv, xvi, xvii–xviii, xx, xxi, xxiv, xxv, xxvi, 9, 15, 18n24, 26, 28, 29, 40, 42, 54, 63, 72–77, 79, 82, 83–84, 91–92, 94, 97, 99, 107–24, 137, 144–48, 150, 153–56, 159n49, 163–66, 173–74, 183, 186, 187, 188, 189, 195, 201–17, 219, 222, 224, 227, 228, 241, 261, 262, 268n35, 271, 272, 274; arrow of time, 202–4; temporal, xv, xix–xxii, xxiii, xxvi, 16, 28, 38, 43, 91, 93, 95, 108, 113, 150, 151, 152, 155, 156, 157n12, 166, 224, 237, 241, 246, 280 tokenomics, 61 tonality, 114 topology, 19, 37, 90, 92, 101n12, 264, 267n30 Topological Media Lab, 263–64 Towards a Philosophy of Photography, 119 trading zones, 284 transduction, 34n13, 91, 176 transformation, xi, xxiv, xxvi, 21–30, 74, 82, 84, 110, 116, 120, 122, 170, 185, 187, 221, 225, 227, 232; technological transformation, xxvi transparency, xxiv, 19, 54, 61, 64, 125–26, 133, 138n7 Treatise on Thermodynamics, 204 Troika, xxiv, 143–61 trustlessness, 54, 55–57, 60–62 Tudor, David, 128, 130–32, 134, 135, 140n45 Turing, Alan, xvii, 43, 50, 253–54; Turing machine, xvii, 253–54

Two Durations, 150 Tyche, 10–14, 16, 18n36, 163 typology, 90, 134, 230 undecidability, xviii, 43–44, 50n19, 251; undecidable, xii, xix uncertainty, xxvi, 11, 29, 40, 74, 82, 176, 191, 201–17, 271–72, 283, 284, 285 unknowability, xi; unknowable, xix, xxi, 43 universality, xiv, 24 un-concealment, 23, 24, 26 Updating to Remain the Same, 242 Variations II, 127, 130 Variations VII, 127–28, 130, 132, 134–36 Virilio, Paul, xiii Virno, Paolo, 79 virtuality, 252, 255, 258 Vismann, Cornelia, xv visualisation, xii, 146, 155 Voss, Daniela, 254, 268n35 Voyager, 128, 135–37 war, 19, 30, 75, 116–17, 126, 133, 135, 144, 220, 226, 230, 239; Russian-Ukrainian war, 56, 77, 79, 82 What Should We Do with Our Brain?, xxv, 221, 225 Weaver, Warren, 3, 154, 160n67 Wheeler, John Archibald, 3 Wiener, Norbert, xix, 3, 9, 10–13, 18n43, 163 Whitehead, Alfred North, xvi, 260, 265–66n7 Wolfram, Stephen, 153–54 Zuse, Konrad, 3

About the Authors

Franco Berardi Bifo founded the magazine A/traverso (1975–1981) and was part of the staff of Radio Alice. He fled to Paris, where he worked with Félix Guattari. Publications include Mutazione e Cyberpunk: Immaginazione e Tecnologia (1993), The Soul at Work (2009), Heroes: Mass Murder and Suicide (2015), And: Phenomenology of the End (2015), Futurability: The Age of Impotence and the Horizon of Possibility (2017), Breathing: Chaos and Poetry (2018), and The Third Unconscious: The Psychosphere in the Viral Age (2021). In the last ten years, he has been lecturing at many universities around the globe. Iain Campbell is a Teaching Fellow in Aesthetics at Edinburgh College of Art and a Research Associate at Duncan of Jordanstone College of Art and Design, University of Dundee, where he is working on the project The Future of Indeterminacy: Datafication, Memory, Bio-Politics. He has written on topics across philosophy, music, sound studies, and art theory for publications including parallax, Contemporary Music Review, Sound Studies, and Continental Philosophy Review. His current research focuses on experimentation and on the differences and continuities between conceptualisations of this notion in philosophy, art, music, and science. He is co-editor, with Natasha Lushetich, of Distributed Perception: Resonances and Axiologies (2021). Stephen Dougherty is Professor of American Literature at Agder University in Kristiansand, Norway. He has published articles and essays on diverse topics, including nineteenth- and twentieth-century US and British literature, psychoanalytic theory, cognitive science, and science fiction. His work has appeared in Configurations, Cultural Critique, Diacritics, Mosaic, Psychoanalytic Quarterly, and elsewhere.

Aden Evens is Associate Professor in the Department of English and Creative Writing at Dartmouth College in New Hampshire. At liberty following a doctorate in Deleuze Studies, he pursued research in music, sound, and technology, generating two records of electroacoustic music on the Constellation label under the band name re:, and publishing a book in 2005: Sound Ideas: Music, Machines, and Experience. In the subsequent fifteen years, Aden has focused on critique of the digital, relying on his early training in software engineering to forge a research methodology that sidesteps the usual cultural theoretic approach and instead considers the digital from within, on the basis of its underlying technological principles. A preliminary account of that research, The Logic of the Digital, was released in 2015, and Aden hopes to publish the second and final volume, The Digital and Its Discontents, in the next year or so. Stavros Kousoulas is Assistant Professor of Architecture Philosophy and Theory at the Faculty of Architecture of TU Delft. He studied architecture at the National Technical University of Athens and at TU Delft. He received his PhD cum laude from IUAV Venice. He has published and lectured in Europe and abroad, and is a member of the editorial board of Footprint Delft Architecture Theory Journal. He is the author of the book Architectural Technicities (2022) and the edited volumes Architectures of Life and Death with Andrej Radman (2021) and Design Commons with Gerhard Bruyns (2022). Natasha Lushetich is Professor of Contemporary Art & Theory at the University of Dundee and Arts and Humanities Research Council Leadership Fellow. Her research is interdisciplinary and focuses on intermedia and critical mediality, global art, the status of sensory experience in cultural knowledge, and biopolitics and performativity.
Her books include Fluxus: The Practice of Non-Duality (2014),  Interdisciplinary Performance (2016), The Aesthetics of Necropolitics (2018), Beyond Mind, Symbolism, an International Annual of Critical Aesthetics (2019), Big Data—A New Medium? (2020), and  Distributed Perception: Resonances and Axiologies (co-edited with I. Campbell, 2021). Peeter Müürsepp is Associate Professor at Tallinn University of Technology and frequent Visiting Professor at Al-Farabi Kazakh National University. He is a corresponding member of the International Academy of the History of Science and chairperson of the Estonian Association of the History and Philosophy of Science. He is editor-in-chief of two Scopus indexed journals: Acta Baltica Historiae et Philosophiae Scientiarum and ICON. He has been visiting researcher or professor at several well-known academic centres,
including London School of Economics and Political Science, Helsinki Collegium for Advanced Studies, and Shanghai University. His research and publications are in philosophy and history of science and technology. Luciana Parisi’s research is a philosophical investigation of technology in culture, aesthetics, and politics. She is a Professor at the Program in Literature and Computational Media Art and Culture at Duke University. She was a member of the Cybernetic Culture Research Unit and is currently a co-founding member of the Critical Computation Bureau. She is the author of Abstract Sex: Philosophy, Biotechnology and the Mutations of Desire (2004) and Contagious Architecture: Computation, Aesthetics and Space (2013). She is completing a monograph on alien epistemologies and the transformation of logical thinking in computation. Andrej Radman has been teaching design and theory courses at the TU Delft Faculty of Architecture since 2004. A graduate of the Zagreb School of Architecture in Croatia, he is a licensed architect and recipient of the Croatian Architects Association Annual Award for Housing Architecture in 2002. Radman received his master’s and doctoral degrees from TU Delft and joined the Architecture Philosophy and Theory Group as assistant professor in 2008. He is an editor of the peer-reviewed architecture theory journal Footprint. His research focuses on new-materialist ecologies and radical empiricism. Radman’s latest publication is Ecologies of Architecture: Essays on Territorialisation (2021). Alesha Serada is currently a PhD candidate at the University of Vaasa, Finland. Their background is in visual culture and game studies, publishing on such topics as ‘Red Comrades Save the Galaxy: Early Russian Adventure Games and the Tradition of Anecdote’ (2022) in Video Games and Comedy and ‘Death and the Plague in The Story of Wanderings’ (2021) in Mortality.
Alesha’s research interests revolve around exploitation, violence, horror, deception, and other evils in visual media, as well as the real-life experience of being Belarusian. Alesha’s dissertation, supported by the Nissi Foundation, discusses the construction of value in games on blockchain.
design to critical studies and philosophy of technology. Sha’s publications include Poiesis and Enchantment in Topological Matter (2013). Dominic Smith is Senior Lecturer in Philosophy at the University of Dundee, where he researches philosophy of technology/media. Dominic is interested in bringing the continental tradition in philosophy (e.g., phenomenology, critical theory, poststructuralism, new forms of realism, and materialism) to bear on philosophy of technology and media. He is a member of the Scottish Centre for Continental Philosophy: http://scot-cont-phil.org/. Dominic’s latest book is Exceptional Technologies: A Continental Philosophy of Technology. His current project involves thinking about how philosophy of technology can be broadened to speak to issues in philosophy of education, design, and creativity, with a focus on the work of Walter Benjamin. Oswaldo Emiddio Vasquez Hadjilyra is a Cypriot-Dominican artist and musician, with a formal training in mathematics and philosophy. He is currently working on a PhD in Media Arts and Sciences at Arizona State University, in which he considers the aesthetic and political implications of treating digitality and computation within a materialist framework.

Joel White completed his MA in Contemporary European Philosophy at CRMEP and Paris VIII and his PhD in French Philosophy at King’s College London. His current research is situated in the emerging field of continental philosophy of science and technology. He is currently a Research Affiliate at the Research Network for Philosophy and Technology, as well as executive editor of Technophany, the Network’s journal. Ashley Woodward is a Senior Lecturer in Philosophy at the University of Dundee. He was a founding member of the Melbourne School of Continental Philosophy, and is an editor of Parrhesia: A Journal of Critical Philosophy. His research interests include existential meaning, Nietzsche, twentieth- and twenty-first-century French philosophy, aesthetics, and philosophy of information. His publications include the books Nihilism in Postmodernity (2009), Lyotard and the Inhuman Condition (2016), and the co-edited volume Gilbert Simondon: Being and Technology (2012). He is currently translating Raymond Ruyer’s writings on cybernetics, and working on a project titled Transforming Information. David Zeitlyn is Professor of Social Anthropology at the University of Oxford. He has been working with Mambila people in Cameroon since 1985.
His research covers religion, sociolinguistics, and vernacular photography. Spider divination has been a recurrent topic, which he has tackled in several different ways, as reflected in his 2020 book Mambila Divination: Framing Questions, Constructing Answers. A Java-based simulation (which does not run in some modern browsers) is online at https://era.anthropology.ac.uk/Divination/Spider/. It has been validated by diviners in Cameroon.