Design and Digital Interfaces: Designing with Aesthetic and Ethical Awareness 9781350068308, 9781350068285

Are digital interfaces controlling more than we realise? Can designers take responsibility, and should they?


English Pages [178] Year 2021


Table of contents:
Cover
Half-title
Copyright
Title
Contents
List of figures
List of contributors
Preface
Introduction: What are digital interfaces?
Technological interfaces
Cultural interfaces
Historical interfaces
What does an interface designer do?
Theoretical perspectives and frameworks
Chapter 1. Complexity and fragmentation
Fragmented distribution
Fragmented devices
Fragmented attention
Technological approaches
Design approaches
Research methods
Chapter 2. Social interfaces
Design for social impact
Soft interfaces: healthcare and loneliness
Accessibility: democratization of tools
Collaborative interfaces: beyond western-centrism
Interfaces for sociality
Constructing social identities
Chapter 3. Legal and political interfaces
Political interfaces
Entangled interfaces
The political action of interfaces
A history of critical practice
Openness and access
Inscrutability and opacity
Critical interfaces
Chapter 4. Ethical interfaces
Design as exploitation
Unforeseen consequences
Legislation
Ethical legibility
Ethical design cultures
Futuring ethical principles
Ethical designers
Chapter 5. Aesthetic interfaces
Aesthetics and the senses
Cultural aesthetics and meaning
Design patterns and behaviours
Aesthetics for use
Aesthetics for empathy
Chapter 6. Uncertainty, deviance and futures
Embracing uncertainty
Science fiction and design
Design fiction
Design imaginaries
Deviant interfaces
Interviews
Anab Jain
Dan Lockton
Mushon Zer-Aviv
Sarah Gold
Glossary
References
Acknowledgements
Index


9781350068278_txt_app.indd 1

08/03/2021 11:13

BLOOMSBURY VISUAL ARTS Bloomsbury Publishing Plc 50 Bedford Square, London, WC1B 3DP, UK 1385 Broadway, New York, NY 10018, USA 29 Earlsfort Terrace, Dublin 2, Ireland BLOOMSBURY, BLOOMSBURY VISUAL ARTS and the Diana logo are trademarks of Bloomsbury Publishing Plc

Library of Congress Cataloging-in-Publication Data
Names: Fass, John, author. | Revell, Tobias, author. | Stopher, Ben, author. | Verhoeven, Eva, author.
Title: Design and digital interfaces : designing with aesthetic and ethical awareness / John Fass, Tobias Revell, Ben Stopher, Eva Verhoeven.
Description: London, UK ; New York, NY, USA : Bloomsbury Visual Arts, Bloomsbury Publishing Plc, 2021. | Includes bibliographical references and index.
Identifiers: LCCN 2020056629 (print) | LCCN 2020056630 (ebook) | ISBN 9781350068278 (PB) | ISBN 9781350068292 (eBook) | ISBN 9781350068285 (ePDF)
Subjects: LCSH: Human-computer interaction. | Computer software--Social aspects.

First published in Great Britain 2021 Copyright © John Fass, Tobias Revell, Benjamin Stopher and Eva Verhoeven, 2021 John Fass, Tobias Revell, Benjamin Stopher and Eva Verhoeven have asserted their rights under the Copyright, Designs and Patents Act, 1988, to be identified as Authors of this work. For legal purposes the Acknowledgements on p.165 constitute an extension of this copyright page. Cover and book design Conor Rigby All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage or retrieval system, without prior permission in writing from the publishers. Bloomsbury Publishing Plc does not have any control over, or responsibility for, any third-party websites referred to or in this book. All internet addresses given in this book were correct at the time of going to press. The author and publisher regret any inconvenience caused if addresses have changed or sites have ceased to exist, but can accept no responsibility for any such changes.

Classification: LCC QA76.9.H85 F37 2021 (print) LCC QA76.9.H85 (ebook) DDC 004.01/9--dc23 LC record available at https://lccn.loc.gov/2020056629 LC ebook record available at https://lccn.loc.gov/2020056630 ISBN PB: 978-1-350068-27-8 ePDF: 978-1-350068-28-5 eBook: 978-1-350068-29-2

To find out more about our authors and books visit www.bloomsbury.com and sign up for our newsletters.

A catalogue record for this book is available from the British Library.


John Fass Tobias Revell Benjamin Stopher Eva Verhoeven

Designing with aesthetic and ethical awareness


Contents

6 List of figures
8 List of contributors
10 Preface

16 Introduction: What are digital interfaces?
17 Technological interfaces
18 Cultural interfaces
19 Historical interfaces
21 What does an interface designer do?
23 Theoretical perspectives and frameworks

26 Chapter 1 Complexity and fragmentation
29 Fragmented distribution
31 Fragmented devices
33 Fragmented attention
33 Technological approaches
35 Design approaches
37 Research methods

40 Chapter 2 Social interfaces
42 Design for social impact
44 Soft interfaces: healthcare and loneliness
46 Accessibility: democratization of tools
48 Collaborative interfaces: beyond western-centrism
51 Interfaces for sociality
54 Constructing social identities

58 Chapter 3 Legal and political interfaces
59 Political interfaces
61 Entangled interfaces
63 The political action of interfaces
65 A history of critical practice
67 Openness and access
70 Inscrutability and opacity
72 Critical interfaces

76 Chapter 4 Ethical interfaces
78 Design as exploitation
80 Unforeseen consequences
81 Legislation
82 Ethical legibility
85 Ethical design cultures
86 Futuring ethical principles
88 Ethical designers

92 Chapter 5 Aesthetic interfaces
94 Aesthetics and the senses
98 Cultural aesthetics and meaning
101 Design patterns and behaviours
105 Aesthetics for use
106 Aesthetics for empathy

108 Chapter 6 Uncertainty, deviance and futures
111 Embracing uncertainty
112 Science fiction and design
115 Design fiction
118 Design imaginaries
121 Deviant interfaces

124 Interviews
125 Anab Jain
129 Dan Lockton
136 Mushon Zer-Aviv
141 Sarah Gold

148 Glossary
154 References
165 Acknowledgements
166 Index

List of figures

22 Figure 0.1 Netflix home screen. Bloomberg / Getty Images.
28 Figure 1.1 A foetal ultrasound image branded with the General Electric logo. Copyright nikkigomez, Creative Commons 2.0 generic licence.
31 Figure 1.2 Netflix home screen. Bloomberg / Getty Images.
47 Figure 2.1 OLPC’s Sugar interface
50 Figure 2.2 Ushahidi
53 Figure 2.3 Diaspora
57 Figure 2.4 Estonian e-Residency kit
66 Figure 3.1 Superstudio, The Continuous Monument. The Museum of Modern Art (MoMA), New York. Copyright Photo SCALA, Florence.
67 Figure 3.2 Technological Dream Series: Robots. Copyright Dunne and Raby, 2007.
68 Figure 3.3 An Athens Wireless Metropolitan Network node. Copyright Vaggelis Koutroumpas, 2014.
70 Figure 3.4 War-chalking. Copyright Martin Haase, Creative Commons 2.5 generic licence.
73 Figure 3.5 IF’s data licences
75 Figure 3.6 Dronestagram. Copyright James Bridle, 2012.
79 Figure 4.1 PRISM Slide. Public domain.
84 Figure 4.2 The Fairphone 2. Copyright Fairphone, Creative Commons 2.0 generic licence.
89 Figure 4.3 ‘Ethical Things’ by Simone Rebaudengo
90 Figure 4.4 Still from Choy Ka Fai’s ‘Prospectus for a Future Body’ (2011).
94 Figure 5.1 GOV.UK landing page
96 Figure 5.2 Apple AirPod headphones. Bloomberg / Getty Images.
100 Figure 5.3 Blinx7 by artist Rosa Menkman. Copyright Rosa Menkman, Creative Commons 2.0 generic licence.
102 Figure 5.4 Bootstrap
103 Figure 5.5 Twitter’s pull-to-refresh patent.
104 Figure 5.6 Dark patterns
107 Figure 5.7 Personas
110 Figure 6.1 ‘Postcards from the Future’, Jean Marc Cote (1899). Copyright Jean-Marc Cote, public domain.
110 Figure 6.2 The Futures Cone.
112 Figure 6.3 Google’s Project Ara
113 Figure 6.4 Film still from Minority Report (2002). Copyright Steven Spielberg.
116 Figure 6.5 Still from Microsoft’s Future Productivity Vision (2015). Copyright Microsoft in Business.
117 Figure 6.6 Still from Curious Rituals (2014). Copyright Near Future Laboratory.
120 Figure 6.7 Superflux’s Uninvited Guests (2015). Copyright Superflux.
126 Figure 7.1 Superflux’s Song of The Machine (2011). Copyright Superflux.

List of contributors

Dr. John Fass

John is a designer, researcher and teacher, and is the Course Leader of the MA User Experience Design at London College of Communication, University of the Arts London. He has been working as an interaction and interface designer for ten years, with clients including the Wikimedia Foundation, Universal Music, Exxon, Global eHealth Foundation and a range of Silicon Valley start-ups including Scanadu, TED and Index. John has lived and worked as a designer and art director in London, Berlin, Milan and Brussels, and has exhibited his work at Moderna Museet (Stockholm), Bauhaus (Dessau), FACT (Liverpool), Transmediale (Berlin) and Bozar (Brussels). He has presented research at conferences internationally, including CHI, NordiCHI, DIS and INCITI Recife, and sits on the Program Committee for Research Through Design. As a teacher, John also lectures at the Royal College of Art on the Information Experience Design MA.

Tobias Revell

Tobias is an artist and designer. Spanning different disciplines and media, his work addresses the urgent need for critical engagement with material reality through design, art and technology. Some of his recent work has looked at the idea of technology as a territory, expectations of the future, rendering software, and the occult and the supernatural in pop culture discussions of technology. Tobias is a Programme Director at University of the Arts London. He is a co-founder of the research consultancy Strange Telemetry, a founding member of Supra Systems Studio and one half of the research and curatorial project Haunted Machines, which curated the Impakt festival in 2017. He lectures and exhibits internationally, and has recently appeared at Improving Reality, FutureEverything, IMPAKT Utrecht, Web Directions Sydney, Transmediale Berlin, ThingsCon and Lift Geneva. He is a PhD candidate in design at Goldsmiths. He takes it all very seriously.


Benjamin Stopher

Ben is a designer, educator and researcher with extensive experience as a practitioner, creative consultant and curriculum designer in the broad areas of communication and digital design. He is the founding Dean of the University of the Arts London Creative Computing Institute. As an academic researcher, he explores models of distributed, collaborative and augmented ideation practice through speculative prototypes and user research. Ben was the UAL academic lead for the EU-funded INTERACT mobility project (INTERACTive Studios & Innovative Networks for Future Design Careers). Project partners include: University of the Arts London (UAL), Danish School of Media and Journalism (DMJX), Royal Melbourne Institute of Technology (RMIT) and Queensland University of Technology (QUT). As part of the organizing team in 2013 and 2014, Ben was responsible for London’s first Maker Faire, and he chaired a panel at the Victoria & Albert Museum drawing on this experience and exploring design education and maker culture.

Dr. Eva Verhoeven

Eva is an artist, a designer, a researcher and the Programme Director for Interaction Design and Visual Communication at London College of Communication, University of the Arts London. Eva is interested in the consequences of technological development, its relays into society and culture, and the question of the role of the designer within it. Eva has co-organized a number of events and conferences, including speculative live-coding events at the Piksel Festival (an event for those working with free/libre and open source audio-visual software, hardware and art) in Bergen, Norway. She has exhibited and presented her work and research nationally and internationally (Los Angeles, New York, Istanbul, Norway and the Victoria & Albert Museum, London). Together with Luke Pendrell and Ben Branagan, she published Doggerland, a small-scale independent publication of a bastard archaeology of damaged, unwanted and unverifiable artefacts. She was also a co-organizer of London’s first Maker Faire at LCC in 2013, and chaired a panel on the Politics of Making at the V&A.


Preface


This book is not a ‘how to’ guide for designing digital interfaces. Rather, this book reflects the fact that we, the authors, care deeply about our lived experience – one that is increasingly mediated through digital interfaces; we are using this book as an opportunity to examine the critical aspects of our lives with them. We also believe that the ways in which designers configure these experiences require a refined understanding of their responsibilities, within the context of the accelerated fragmentation, and complexity, of technological infrastructure as it plays out across our daily lives.

Digital interfaces shape how we live, work, fall in love, are cared for, educated and remembered. Digital interface design is the creative discipline that shapes how we use the computer systems around us, whether they are on our wrists or in the cloud. This book takes the view that the practice of designing digital interfaces calls for an expanded aesthetic, critical and ethical awareness on the part of designers, who should be willing to act with sensitivity and understanding towards the people they design for and with.

For large swathes of people, digital interfaces are ubiquitous in everyday life. As digital technology grows in complexity, it becomes harder to understand. Similarly, as digital technologies spread to more and more people around the world, they increasingly influence how people live, feel and behave. This means interface designers have a responsibility to ensure that what they produce is not misleading, coercive or exploitative. Acting responsibly as an interface designer means considering the ethical implications of a design, including how it shapes opinions and actions, while of course making room for pleasure, delight and surprise.
We, the four co-authors whose voices weave through the text and intermingle with other voices in the form of interviews, examples of practice and theoretical positions, are writing at a particular point in time in which uncertainty is perhaps felt more acutely than before. The book is situated in a particular technological, cultural, social, political and economic context. As designers, artists, researchers and educators we


are thoroughly involved in this context, and this situatedness permeates the text. It is a time when the inventor of the World Wide Web, Tim Berners-Lee, warns of the negative consequences of powerful digital gatekeepers (Facebook, Google, etc.) increasing political polarization, supported by so-called ‘filter bubbles’ that direct attention to similarities rather than differences via algorithmic control, as well as the proliferation of fake news. And while Berners-Lee ‘imagined the Web as an open platform that would allow everyone, everywhere to share information, access opportunities and collaborate across geographic and cultural boundaries’ (Berners-Lee, 2017), right-wing populism is propagating a fear of the other, an idea of a strong national identity, and the pulling up of drawbridges (and the building of walls). The role that design and interface design can and must play within this context is therefore of heightened interest.

How to read

We have written this book to be read in a number of different ways. A sequential reading, starting with Chapter 1, will lead you through an argument populated with examples and images, which builds on a definition of a digital interface, based on its effects on the self, on society, and its essential characteristics, and ends with a look at the near- and medium-term future of living with, and designing for, digital interfaces. The chapters are organized around different areas of exploration, presenting focal points on the aesthetic, social, political and ethical aspects of living with, and designing for, digital interfaces.

We have also written this book to enable a more modular way of reading. Each chapter presents its own section of the argument and its own examples. These can be taken as coherent units on their own and can be used to inform teaching, professional practice or simply the development and extrapolation of adjacent ideas. Furthermore, the book can be read through the interviews alone. These sit at the end of the book and complement the themes explored. Reading just the interviews will give readers a unique insight into the thoughts and practices of some key practitioners and thinkers in the field. Finally, like any good book, you can read the introduction, look at the pictures, check the index and get a pretty good overview of what we think is important to say on the topic of living with digital interfaces.

Chapter 1 deals with the reciprocal topics of complexity and fragmentation as they play out across digital culture. We use these


terms as framing concepts for the characteristics and effects of digital interfaces, and we explore their meanings through digital interfaces for bodies, selves and societies. The politics and social dynamics of fragmentation as mediated by digital interfaces are discussed with reference to historical and contemporary examples.

Chapter 2 investigates the social consequences of digital interfaces and how they mediate social action. This includes the topic of design for social impact, social media and online communities, social robots and the need for a more consciously ethical approach to social interfaces. This chapter considers the social effects of interfaces – how they permit or restrict access to certain user groups, reinforce or disrupt social hierarchies and constructs, or seek to direct and influence behaviours.

Chapter 3 examines the legal and political implications of living with digital interfaces: how they function to channel behaviours and as forms of control, as well as to shape and deform information. We explore the power dynamics at play in the way digital interfaces are used, the structural logic of digital interfaces as technical artefacts, and the background of critical design used to analyse them.

Chapter 4 highlights the complex ethical responsibilities of interface design. It examines how information is gathered, shared and released to the user, and how privacy, control and agency form important points for consideration in interfacing with systems and services. The chapter discusses these complex ethical responsibilities through concepts of openness, modes of transparency and their relationships to the opaque.

Chapter 5 explores the operational aesthetic at work in digital interfaces and how it is both configured and consumed by designers. This involves an analysis of how digital interfaces communicate, and how they engage human senses in increasingly tangled ways. The difficulty of observing a unified set of aesthetic criteria for digital interfaces is explored from the perspective of hidden effects, and we suggest that an understanding of the aesthetics formed through our experience of digital interfaces is a key area of expertise for the digital interface designer.

Chapter 6 accounts for the uncertain and unstable future of a life with digital interfaces. We show how science fiction has influenced the interfaces we end up with, and suggest ways of shaping an alternative future, one informed by wider politics of interaction, and inclusive of more diverse possibilities.

It is important to point out that we understand the complex interrelationship between these areas, but the separation into chapters is a means to explore examples of interface design through a particular lens. These lenses are purely used as tools to focus on particular


aspects of cultural practice. It is an interesting and exciting time to be engaged in the creative practice of digital culture, with a huge range of opportunities for bold and experimental innovations. Our ambition for this book is to provide a companion to those engaged in these innovations, and to help them critically consider the interfaces they would like to see in the world.


Introduction: What are digital interfaces?


The term ‘interface’ originates from the natural sciences and describes the boundaries between different things as ‘a plane surface regarded as the common boundary of two bodies’ (Chambers, 2000: 476). This definition emphasizes the closed aspect of these material states or bodies, and points to them as opaque black boxes, accessible only through their common boundary. Each black-box body is closed to the next state or body, hinting at complexity, while the interface creates a double bind: it enables access while at the same time obscuring or hiding the internal workings. This matters because the states being interfaced are black boxes whose inner workings are complex and inaccessible – the interface plays the role of translating, but also of obscuring and fragmenting, these systems. The concepts of complexity and fragmentation are applied in this book as lenses through which to explore contemporary computational culture, and will be discussed in more detail in Chapter 1.

Technological interfaces

More specifically, this is a book about digital interfaces. Drawing on the definition above, digital interfaces are the boundaries between two components of a digital system where one symbolic framework is translated into another, in order to aid communication between these different components. Digital interfaces exist at all levels of digital systems: interfaces that connect hardware components to system software (voltages to binary code), interfaces that enable communication between different software manifestations, for example system software and programming software or application software, and the human-computer interfaces that enable users to access systems via input/output devices (keyboards, mice, speakers, etc.). All of these interfaces have been designed and engineered to support the communication between different constituent parts of a digital infrastructure.


Here, we explicitly explore digital interfaces at the human-perceptible level, looking at their design and how they mediate daily life. As a consequence, in this book we focus on digital interfaces as the locus of user interactions between humans and computers. This involves manipulating visual symbolic representations such as buttons, menus and sliders, but also features a wide variety of sensory modalities such as sound, movement and touch. We also consider the functional characteristics of a digital interface: its dimensions (both lateral and vertical), its click depth, the degree to which it affords multiple possibilities for interaction and how well it performs its stated purpose. Functional characteristics can be dramatically transformed between devices, and are therefore not fixed but inconstant, responding to the specific circumstances under which they are accessed. The technical characteristics of digital interfaces can depend on how they are delivered, i.e. via a smart watch or a television, and on more hidden aspects such as how optimized the content is for fluctuating connection speeds, the degree of latency between action and response and the resolution of any given device. Digital interfaces must be programmed to respond to input and to present their content as required, and so depend on a range of computer languages such as HTML5, JavaScript and PHP, each of which performs a layer of the interface function. We explore the results of this layering on the digital experience.
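The layering described above can be sketched in miniature. The following JavaScript fragment is a hypothetical illustration (not drawn from the book): a data layer holds the underlying state, a behaviour layer translates user input into state changes, and a presentation layer renders that state as the markup a user perceives. All names here are invented for the example.

```javascript
// A minimal sketch of interface layering (hypothetical example).

// Data layer: the underlying state the interface mediates.
const state = { count: 0 };

// Behaviour layer: maps a user action onto a state change.
function handleInput(action) {
  if (action === 'increment') state.count += 1;
  if (action === 'reset') state.count = 0;
}

// Presentation layer: renders the state as markup for display.
function render() {
  return `<button>Clicked ${state.count} times</button>`;
}

handleInput('increment');
handleInput('increment');
console.log(render()); // "<button>Clicked 2 times</button>"
```

Even in this toy version, the double bind noted in the introduction is visible: the rendered button gives the user clear access to the system while hiding the state and logic beneath it.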

Cultural interfaces

Digital interfaces are not neutral technological artefacts. By design, all interfaces are avatars, in that they are an aesthetic representation of the aims, ideology and structure of the cultural data and system they are an interface for. In this book, while we pay attention to the technical characteristics, we are more concerned with the cultural characteristics of digital interfaces. This includes how they may exclude certain user groups, or how they may enforce implicit dynamics of control, such as can be seen in the hidden data collection methods used by dating apps. We propose that digital interfaces can be considered as sites of cultural meaning and are the point where much contemporary human life is played out, from health and well-being to sexual life and work. The human-computer interface as a human-computer-culture interface was posited by media theorist Lev Manovich in 2001, when he described the move to a predominantly computer-based distribution


of cultural data (pictures, video, text, music, etc.) that we access via digital interfaces; we are ‘interfacing’ culture as data (Manovich, 2001: 69). Furthermore, we consider that digital interfaces are themselves an important form of culture, but one which has not received enough critical attention from a design perspective. We should care about how digital interfaces are designed, what kind of decisions are necessary to create them and how these decisions can be made in a way that is consistent with the values of openness, equality and ethical transparency.

Historical interfaces

When I first heard about computers, I understood, from my radar experience, that if these machines can show you information on punch cards and printouts on paper, they could write or draw that information on a screen. When I saw the connection between a cathode-ray screen, an information processor, and a medium for representing symbols to a person, it all tumbled together in about half an hour.
Douglas C. Engelbart (Rheingold, 2000)

In the early 1960s, Douglas Engelbart set up the Augmentation Research Center at the Stanford Research Institute (SRI) in order to work on what they called the oN-Line System (NLS), a computer system that used a mouse-driven cursor as an input device, and video screens as output devices that displayed multiple windows and worked on hypertext. It was an innovative approach to computing, and turned out to be a significant advancement for the development of the personal computer as we know it today. The NLS was funded by the Defense Advanced Research Projects Agency (DARPA), NASA and the U.S. Air Force. The reason DARPA and the U.S. Air Force were interested in funding such a project lies in its close link to military defence and the anti-aircraft tracking and fire control systems that had played an important part in the British air defence against Nazi Germany’s Luftwaffe, for example. The innovation of the oN-Line System didn’t lie in the input and output devices – all of these had already been used and tested in the context of military advancement: screens used for radar, for example, and the light pen and joystick (which later became the mouse), initially used to guide missiles and later to ‘move a cursor in a direction relative to the current


position on an absolute plane. The necessary technical conditions for the guidance of cursors on cathode ray tubes become clear through an examination of the history of anti-aircraft artillery systems’ (Roch, 1996). The innovation can instead be located in the Stanford Research Institute’s approach of bringing the different devices together in a new context of screen-oriented computer programming, where ‘targeting the enemy was reborn in the form of a mouse on an ordinary computer [while] the cursor used in air defence [can be regarded as] the return of the searchlight on a tactical command level: the computer screen’ (Roch, 1996). While today’s screens do not make use of cathode ray tube technology, and the mouse is being rapidly replaced by track pads and touch screens, the NLS laid the path for today’s input and output interfaces.

In its development of the oN-Line System, the SRI was influenced by contemporary theories about the relationship between humans and machines. For example, Joseph Carl Robnett Licklider’s thinking about man-computer symbiosis was instrumental, because it suggested a much more interactive relationship between humans and machines, which up to this point was not dialogical but required forward planning of code on punch cards (Licklider, 1960). These cards were input into computers, which in turn did the calculations. This more immediate interaction was then realized in Sketchpad, a system developed by Ivan Sutherland at MIT, which enabled the user to manipulate shapes directly on a screen using a stylus. In the 1970s, a team at Xerox PARC, which included Alan Kay, developed the first computer – the Alto – that used a graphical user interface, including a desktop metaphor and WYSIWYG (what you see is what you get) word processing. The Alto remained a prototype, but an influential one, because it was from this prototype that the personal computer, now ubiquitous in the fat (more is plenty) world, was developed.

From these pioneers came many of the graphical user interface (GUI) elements that persist today, such as WYSIWYG, direct manipulation of screen icons, windows and folders, colour graphics and bitmapped displays. While the GUI revolutionized computing and arguably democratized the computer by reducing complexity, the GUI as a layer is a double-edged sword: it creates visibility and clarity while at the same time hiding the underlying processes from human perception. Since our everyday lives are more and more regulated and monitored through algorithmic processes, it is ever more important to consider this false transparency that renders the process opaque.


What does an interface designer do?

Interface designers are commonly concerned with how a digital interface is presented. They decide what goes where on the screen, how interactions appear to the user and how the interface signals what it is for. There are many other interlocking disciplines involved in the design of a digital interface, such as interaction design, user experience design and information architecture, but we are not unduly worried here about the subtle distinctions between design domains. Instead, we will look at interfaces as networked artefacts of socio-technological infrastructure, and we will consider how interface design is undertaken, by whom, under what conditions and to what end within the complexity of that socio-technological infrastructure.

Methods of interface design include typographic composition, page layout, the design of transitions between states, and ensuring consistency of presentation across devices and software systems. This means having a working knowledge of basic graphic design skills. Increasingly, however, interface design is done computationally by designers operating at code level rather than using design software, which requires a working knowledge of technological and functional opportunities and limitations. This has profound implications for how interface design is understood and what interface designers learn when they are training. It is not just the visual and coding skills that are important – interface designers also increasingly need to be able to understand the grammar and semantics of the contemporary cultural hegemony that is structured around computational systems. This requires an expanded understanding of the enfolded nature of complex systems of interaction and the interfaces at different levels, and asks interface designers to question what the embedded political assumptions of their interface designs are. Recently, interface design tasks have been delegated to algorithmic intelligence.
For example, the banner advertisements on the Chinese commerce site Alibaba are devised without the involvement of human design input, by responding automatically to user choices about colour, placement and size. Video streaming service Netflix uses a similar process to automate the images that appear as covers for film and TV series (Chandrashekar, Amat, Basilico and Jebara, 2017). This departure from how design has traditionally been understood (i.e. as a craft-based, highly skilled specialist profession) means digital interface design must also be considered as an automated activity, untouched by human hand and as mechanized as a modern car factory. The implications of computational interface design are not confined only to a new set of skills for designers to learn, but extend to considerations of scale and how manipulating the interface code base can implement changes on
a massive scale. This is especially true of global corporations such as Google, which is constantly engaged in user testing to optimize its interface for each individual user. These innovations have been extensively discussed from the perspective of efficiency and productivity, often with a kind of breathless amazement and admiration, or perplexity and surprise. What has been missing from the debate in design is a critical voice asking how we should respond to these and other changes, and what the implications are for the discipline of digital interface design and for the way digital interfaces shape and influence human behaviour. We propose here that interface designers also bear an ethical responsibility to their users that involves being aware of the constraints of a system, of what it may do that is not immediately obvious, of who can use it and of the implicit assumptions it may embody. For example, an interface designed to work solely for Apple’s iOS automatically presupposes that users of other systems are excluded.

Figure 0.1 Netflix home screen. Bloomberg / Getty Images. Video streaming service Netflix uses a process of automation to generate cover art for films and TV shows based on what users have previously interacted with. It’s a good example of how automated processes increasingly intersect with what has traditionally been the realm of the designer.


Theoretical perspectives and frameworks

This book is intended to reflect on the contemporary moment from the perspective of digital interfaces, drawing on current theories and the practice of interface design. Lev Manovich positions interfaces as the key aspect of how software mobilizes its ability to bring existing media together in new arrangements. Commenting on early innovations, Manovich argues that digital interfaces do much more than remediate existing technologies into digital forms (Manovich, 2001). They do new things like enabling dynamic search of text documents, or allowing multiple levels of magnification of digital images. The early GUIs were described by their inventors as a ‘metamedium’, with completely new interactive and cognitive properties, going far beyond the affordances of pen and paper or analogue film. In this book, we develop this concept of digital interfaces as being more than arrangements of symbolic representations, or as places where new features are possible.

Alexander Galloway describes how digital interfaces are seen as ‘thresholds’ or ‘doorways’ (Galloway, 2012). This idea holds that digital interfaces are less objects with specific design characteristics than they are points of transition between different tiers in a nested series of mediatic layers. The outside – the visible parts of an interface, including its icons, menus and windows – is evoked so that the inside, i.e. the deeper content of the system, including videos, text, images and sound, may occur. For Galloway, an interface is not a thing, it is an effect. The effect of an interface is seen in how it processes or translates choices – it is not a form of media, but a site of mediation. Galloway uses the example of the interface from the video game World of Warcraft. He distinguishes between the diegetic and non-diegetic elements of the system. The diegetic realm is spatial, a carefully rendered environment populated by characters, objects and buildings.
This spatial view is overlaid with a non-diegetic layer of information showing tools, statistics, interactions and game information. Building on the symbolic and representative bases of digital interfaces and the metamedium of Manovich, Galloway argues that interfaces are more than simply doorways to content. They are both experiential objects and symbolic systems that present as a simultaneous integration and separation of distinct modes of communication. Galloway calls this the ‘intraface’ (Galloway, 2012). In this book, we employ this view of digital interfaces as complex and multilayered object/system arrangements.

Ash highlights two ways in which digital interfaces have been theorized that are relevant to this book. The first he calls ‘interface as practice’ (Ash, 2015), which attempts to transcend a reading of digital interfaces as sets of computational devices such as servers, screens, routers, software protocols and so on. Drawing on Farman,
this approach proposes that devices such as cell phones are not interfaces in themselves, but instead are part of a set of sociotechnical relations that include how they are used by people, how they communicate with other devices and technologies, and therefore how they are both symbolic and discursive (Farman, 2012). The second approach highlighted by Ash, a development of the first, is what he terms ‘interfaces as carriers of cultural logics and ideologies’ (Ash, 2015: 20). Digital interfaces encode specific views of the world – how it may be ordered, what may be searched for, who can access information. Interface designers, whether consciously or not, conform to these logics and reproduce them in forms of increasing speed and responsiveness. While digital interfaces do not appear to be explicitly ideological, the paradigms of efficiency, production, ranking and transmission that they assume can be described as requiring a political response. In this book, we adopt an analysis of digital interfaces as more than simply technical artefacts, and explore their potential for new cultural and political meanings, an enhanced ethical awareness and the surprise and delight that great design can accomplish. This becomes even clearer when looking at interfaces at a megastructural level. Interfaces in no way stand alone as designed artefacts, they are networked. Not only on the internet or other comparable computational networks, but also in their placement within ‘the social-cum-technological milieu that at once enables the fulfilment of human experience and enforces constraints on that experience’ (Liu, 2016). This milieu is more commonly referred to as infrastructures. Infrastructures – from the internet itself, to GPS, to the server stacks of the advertising companies and the fibre optic cables of the telecom companies – shape and are shaped by the interfaces, and these in turn shape our culture. 
In this context, it’s easiest to visualize the interface as the point of human interaction with the infrastructure. Recently, theorists have begun to conceptualize this infrastructure as a global computational network rather than separate technological systems. Through various protocols, standards and operating systems, computational networks share and store data, translate information and compute. Benjamin Bratton refers to this as ‘The Stack’ – an ‘accidental megastructure’ that comprises everything from oil drilling and refining for plastics for parts, to government surveillance systems, all tied together in a system of planetary computation where the Earth itself is a computer (Bratton, 2016). It is this infrastructural complexity and its resulting fragmentation of experiences that we are exploring in this book. The combination of these theories describes what we mean when we refer to interfaces: they are surfaces between systems and bodies, both human and non-human; they exist between the worlds
these systems and bodies inhabit, and they are entangled in cultural and technical infrastructures which they affect and which they are also affected by.


Chapter 1 Complexity and fragmentation


In this chapter, we explore the impact of complexity and fragmentation on digital interface design. We discuss how increasing complexity can lead to consequences for people and society far beyond the interface. Fragmentation is explored from several directions. First, within any individual interface there are many possible routes and directions, often delivered in tiny pieces of information. Secondly, the ubiquity of digital interfaces means there are potentially millions to choose from. Finally, users and systems are fragmented, resulting in the harsh seams between systems and users that break us up into individual consumers of digital content.

Digital technologies shape and influence how we experience our lives from before we are born until after we die. For example, the ultrasound technician in an antenatal clinic works by dividing their attention between the live signal on a computer screen and the human body in front of them. The screen interface is designed for a highly constrained set of needs and provides visual feedback of the ultrasound signal. The aim is to measure the size and shape of the foetus, listen to its heartbeat and compare the data to previously established standards of foetal development. The interface allows for visual measurement by superimposing lines and ellipses over the image of the foetus. The unborn child is thus an unwitting subject in this technical arrangement, its most intimate physical characteristics captured, registered, recorded, printed.

An article by Mark Wilson recounts his horror at seeing the General Electric logo on his unborn child’s ultrasound image. He considers his child to be tainted by what he calls ‘primal branding’, the ability for large corporations to instil subtle cues into life events and create a halo effect around all that they touch (Wilson, 2013).
New parents have long been targeted by fast-moving consumer goods (FMCG) companies because of their assumed emotional vulnerability to branding messages and the huge amount of new goods they are expected to buy. In the case of Mark Wilson’s lived experience of the ultrasound interface, the interface contained an intrusion for marketing purposes, consciously included by the designer.


Figure 1.1 A foetal ultrasound image branded with the General Electric logo. Copyright nikkigomez, Creative Commons 2.0 generic licence. For journalist Mark Wilson, this image represented ‘primal branding’ – an intrusion into the life of his child before it was even born.

Similarly, a digital autopsy is a non-invasive way of looking inside the bodies of the deceased, used to determine the cause of an unexpected death. The body is enclosed in a CT scanner, which produces a detailed three-dimensional image. This model can then be inspected in fine detail by an autopsy technician, using a digital interface that allows for panning and zooming the image. Individual body parts can be measured and assessed in the same way as in an antenatal ultrasound, using system tools for isolating specific parts of the image. The process is diagnostic, forensic, expository – designed to produce digital data that reveals otherwise hidden human physical attributes.

Digital interfaces are thus integrated into human experience from cradle to grave. They constitute a network of relations that takes in mediated bodies, corporate and public institutions, and laws of protection and coercion. We expand on this aspect in Chapter 4. Formal and semi-formal work practices, physical objects and representations of personhood are also included in this. As you can see, this complexity is increasingly fragmented across devices, software, territories and time.


Fragmentation from the technical perspective of computer systems is a term that describes the process which occurs when a program uses random access memory (RAM) to carry out its operations. These processes are requested at interface level when performing operations such as rotating an image or playing a video. In performing these operations, the program continuously uses multiple blocks of memory. The spaces that the memory occupies become smaller and smaller over time. Internal fragmentation happens as more memory is allocated to a program operation than is needed, due to the constraints of fixed partitions. External fragmentation happens as the memory spaces available are smaller than the program needs, leading to inefficiency of allocation. The end result of this fragmentation is a gradual degradation in software performance. Given this, fragmentation is a result of algorithmic disorganization, and of the electro-mechanical limitations of computer processors. Digital experiences are built between devices involving changes of context and the introduction of noise. Fragmentation reaches from the hardware to the cognitive dimensions of digital interfaces.
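The internal fragmentation described above can be illustrated with a toy allocation model. This is a deliberately simplified sketch, not how any real memory allocator works; the partition size and request sizes are invented for illustration.

```python
# Toy model of fixed-partition memory allocation, illustrating internal
# fragmentation: each request is rounded up to whole fixed partitions,
# and the rounded-up remainder is wasted.

PARTITION_SIZE = 64  # bytes per fixed partition (an arbitrary choice)

def allocate(requests):
    """Return (partitions_used, internal_fragmentation_in_bytes)."""
    used, wasted = 0, 0
    for size in requests:
        parts = -(-size // PARTITION_SIZE)  # ceiling division
        used += parts
        wasted += parts * PARTITION_SIZE - size
    return used, wasted

if __name__ == "__main__":
    # Three program operations requesting 100, 30 and 64 bytes.
    used, wasted = allocate([100, 30, 64])
    print(used, wasted)  # 4 partitions used, 62 bytes lost to rounding
```

The 100-byte request occupies two partitions (128 bytes), wasting 28; the 30-byte request wastes 34; the 64-byte request fits exactly. The waste is invisible at interface level but accumulates across every operation.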

Fragmented distribution

With the growth of cloud computing as the dominant paradigm of distributed network computing, fragmentation has become invisible at interface level. Cloud computing allows for separate files of a single system, for example all the images on an individual Facebook feed, to be stored in multiple global locations, but be presented at interface level as a single scroll or slideshow of personal photos. The ability to retrieve these images, or download them all, depends on the willingness of the provider, and the availability of a legal framework to do so. Cloud computing also allows separate parts of the same file to be stored in different locations, and for different processes to be carried out by different cloud-based servers. Living with the digital interfaces enabled by cloud computing thus involves the fragmented affordances of mobile devices, geolocation technologies and software as a service.

Seb Franklin positions cloud computing as a cultural object, describing the distributed storage of digital data as symptomatic of the age of ‘ubiquitous informatics’. As Franklin puts it, ‘“Cloud computing” is used to describe the relocation of computational resources from individual local machines to a distributed network’ (Franklin, 2012: 447). He articulates the contradiction between technical materiality (the integrated circuits, logic gates and processors that determine computing
performance) and the conceptual immateriality of the cloud (invisible, metaphorical and intangible). The network effects evident in Web interfaces that provide clear signifiers of connectivity – online or offline – are, through cloud computing, transformed into a paradigm of permanent capture and perpetual connectivity. This is primarily fragmentation of a spatial kind, since cloud computing enables people to be untethered from servers and storage media. In turn, the interfaces enabled by cloud computing include those that deliver the experiences of social media, collaborative working and streaming media content. With processing power delivered remotely from distant infrastructure and storage media accessible in infinite capacity at the tap of a smartphone screen, the fragmentation of digital media is concealed by the immaterial and amorphous nature of cloud technologies. As Franklin says, ‘The cloud can be deformed, compressed, expanded, intensified or thinned out to fit any available space and stretch beyond the reach of any earthly material base’ (Franklin, 2012: 451). As the individual operations of digital processing and fragments of data are spread over an ever-wider area, Alexander Galloway warns that the increasing intangibility of networked connectivity means the threshold of perceivability ‘becomes one notch more invisible, one notch more inoperable’ (Galloway, 2012: 25). Distribution of media content is enabled across multiple platforms and interfaces. This fragmented landscape constitutes a complex ecosystem of menus, catalogues and categories. Taking the category of film and television content alone, ‘Netflix’, ‘Amazon Prime’, ‘Hulu’, ‘Acorn’, ‘Now TV’ and ‘Sling TV’ all compete for attention and subscriptions in the same sector. The interface metaphor they deploy is a catalogue. 
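Franklin’s account of cloud storage – separate parts of the same file held in different locations, yet presented at interface level as a single seamless object – can be sketched in miniature. The region names, chunk size and file contents below are invented; real systems use replication, metadata services and far larger chunks.

```python
# A toy sketch of cloud fragmentation: one file is split into chunks
# scattered across storage regions, and the interface reassembles them
# so the fragmentation remains invisible to the user.

CHUNK = 4  # bytes per chunk (unrealistically small, for illustration)
REGIONS = ["eu-west", "us-east", "ap-south"]  # invented region names

def store(data):
    """Scatter a file's chunks across regions, round-robin."""
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    return [(REGIONS[i % len(REGIONS)], c) for i, c in enumerate(chunks)]

def retrieve(placed):
    """What the interface shows: the reassembled, seamless file."""
    return b"".join(chunk for _, chunk in placed)

if __name__ == "__main__":
    photo = b"holiday-photo-bytes"
    placed = store(photo)
    print([region for region, _ in placed])  # chunks live in three regions
    print(retrieve(placed) == photo)         # True: fragmentation is hidden
```

Note that `retrieve` depends on the stored placement record: without the provider’s cooperation, the user has no route back from scattered chunks to a whole file, which is Franklin’s point about the cloud’s immateriality resting on very material infrastructure.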
Every moving-image streaming platform organizes content according to categories, which are themselves infinitely reorganizable depending on who is watching, what is the most popular content, or what the platform wishes to promote. Each platform strives to promote its own content, thus fragmenting the viewing experience across multiple services and interfaces. One phenomenon facilitated by streaming interfaces is binge-watching. Binge-watching is in part driven by the portability and navigability of streaming content interfaces (Steiner and Xu, 2018) and scholars note how binge-watching is disrupting norms of televisual advertising as well as traditions of content production and distribution (Schweidel and Moe, 2016). The ‘Netflix’ interface facilitates this behaviour by automatically playing one episode after another of a TV series, without requiring any interaction from the viewer. The content is displayed against a black backdrop in order to better echo the black screens of smartphones and tablets. Designing these interfaces involves accounting for differences in screen size between devices and between
authoring technologies. In an analysis of streaming interfaces, Molly Lafferty notes how they are constrained by design choices relating to viewing distance, display resolution and the D-Pad that limits navigation to four directions: up-down-left-right, thus enforcing a rectilinear grid layout (Lafferty, 2016). Manovich sees these navigational logics as encoding models for understanding the world, saying: ‘A hierarchical file system assumes that the world can be organized in a multi-level hierarchy’ (Manovich, 2013: 76). Manovich proposes the concept of computers as a metalanguage platform. By this he means that the expressions produced by digital interfaces consist of a fragmented hybrid of visual aesthetics that includes dynamic menus, animated transitions, autoplay videos and nested interactions across multiple pages of a site or screens of a mobile app.
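Lafferty’s point about the D-Pad enforcing a rectilinear grid can be sketched as a focus-movement function: with only four directions available, the selectable items must form rows and columns, and movement simply clamps at the edges. The grid dimensions here are invented.

```python
# A minimal sketch of D-Pad focus movement on a streaming catalogue:
# navigation is limited to four directions, so selectable items must
# form a rectilinear grid.

ROWS, COLS = 4, 6  # e.g. 4 category rows of 6 cover tiles each (invented)

MOVES = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}

def move_focus(pos, direction):
    """Move the focused tile one step, clamping at the grid edges."""
    dr, dc = MOVES[direction]
    row = min(max(pos[0] + dr, 0), ROWS - 1)
    col = min(max(pos[1] + dc, 0), COLS - 1)
    return (row, col)

if __name__ == "__main__":
    pos = (0, 0)
    for d in ["right", "right", "down", "up", "up"]:  # final "up" clamps
        pos = move_focus(pos, d)
    print(pos)  # (0, 2)
```

Everything about the layout follows from this input constraint: diagonal or free-form arrangements would be unreachable, so the interface grammar of streaming platforms converges on the grid.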

Figure 1.2 Netflix home screen. Bloomberg / Getty Images.

Fragmented devices

Digital devices consist of a rapidly increasing typology of modes including watches, wearables, voice interfaces, tablets, phones, VR headsets, laptops and desktops. Each has its own language of interaction and display, bounded by a set of symbolic conventions and constraints. Matthew Fuller and Andrew Goffey describe these interfaces as offering ‘a regime of mazes and boxes’ where users are required ‘to be
enrolled into an economy of minimal movement’ (Fuller and Goffey, 2012: 56). In their view, living with digital interfaces involves the production of the user as a behavioural unit – to be tested and optimized by the underlying technologies of digital interfaces. The landscape of interfaces is thus increasingly fragmented across different devices, each of which enrolls its user into a proprietary set of interface conventions in the form of signifiers and interaction possibilities. The smartphone camera race is a good example of this. No sooner does Samsung introduce low-light image capture technology to its smartphone range, than Google’s new smartphone chipset outdoes it with high-dynamic-range imaging, and so on to infinity. Designing for the many screen sizes and resolutions is an increasingly complex task, with each interface design required to work across multiple devices. Repurposing designs for different devices is, in fact, a whole sub-industry, and it has given rise to a new technical design paradigm – the responsive interface design. Responsive designs adjust automatically to different screen sizes and orientations. Designers thus encode an aesthetic description of living with digital interfaces into devices and operating systems, and the interactions they afford – a concept further explored in Chapter 5. This fragmentation of surfaces through which digital experiences are distributed illustrates a reading of interfaces as ‘both experiential objects and symbolic systems’ (Cramer and Fuller in Ash, 2016: 17). Digital interfaces comprise a set of visual, sonic and haptic symbols such as icons, sliders, chimes and vibrations, but they are also comprised of the underlying code that instructs the system. Objects, such as smart watches, are physical interfaces with a size, colour and weight. The realm of perceived symbols is reached through physical interaction with the object. 
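The responsive paradigm described above can be sketched as a breakpoint function: one interface definition that adapts its layout to the reporting device’s screen width. The breakpoint values and layout names are invented, loosely echoing common practice in CSS media queries.

```python
# A minimal sketch of responsive interface design: the same interface
# definition selects a different layout depending on device width.

BREAKPOINTS = [  # (minimum width in px, layout name, columns) - invented
    (1024, "desktop", 4),
    (600, "tablet", 2),
    (0, "phone", 1),
]

def layout_for(width):
    """Pick the first breakpoint the device width satisfies."""
    for min_width, name, columns in BREAKPOINTS:
        if width >= min_width:
            return name, columns
    raise ValueError("width must be non-negative")

if __name__ == "__main__":
    for w in (1440, 768, 320):
        print(w, layout_for(w))
```

The design decision lives in the breakpoint table, not in per-device artwork: the designer encodes a description of living with many screens once, and the system re-derives the layout everywhere.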
Alexander Galloway emphasizes this multidimensional aspect of digital interfaces with the idea that ‘an interface is not a thing, an interface is always an effect’ (Galloway, 2012: 33). By this he means that ‘any single digital interface quickly gives way to multiple interfaces between a variety of hardware and software’ (Ash, 2016: 18). As digital interfaces get more complex, so do the complexities of the relationship between hardware and software. For example, Amazon’s ‘Echo’ device, which responds to voice commands, is an interface not just with its owner’s voice, but also with voices that it picks up from background television and radio broadcasts. It is an interface between Wi-Fi radio signals, Amazon’s speech recognition algorithm ‘Alexa Voice Services’, electrical power and the sonic properties of whatever spatial environment it sits in. The set of relationships this interface mediates is then fragmented far beyond its ability to read the news when prompted.


Fragmented attention

Our attention is constantly divided between these devices and the contents they both host and support. Linda Stone calls this ‘continuous partial attention’, with reference to the state of constant interruption by alerts, notifications and updates from digital interfaces. She says, ‘We pay continuous partial attention in an effort not to miss anything. It is an always-on, anywhere, anytime, anyplace behaviour that involves an artificial sense of constant crisis’ (Stone, n.d.). This fragmentation of concentration is consciously exploited by digital systems as they compete in the crowded marketplace for attention. This drives the increasing proliferation of immersive digital experiences, ones that exclude all other content from human perception by providing a 360˚ visual and sonic environment.

Tim Wu has charted the history and rise of the attention economy from the invention of advertising and its distribution through mass media, to the phenomenon of viral videos, social media and clickbait (Wu, 2017). The deliberate attempts to drive traffic via digital interfaces, often with the hidden aim of capturing user data, are facilitated by the design of Web browser software. For example, the popularization of tabbed browsing helped dissolve online task boundaries. This means that users of browsing interfaces can switch in fractions of a second between unrelated tasks: reading the news one second, buying shoes the next. The boundaries between tasks are collapsed by the interface. One study of viral media observed a weak correlation between information quality and virality. This was found to be a direct result of what is called ‘poor information discrimination’, as a consequence of information overload and limited attention (Qiu et al., 2017).
This provides an interpretation for the high volume of misinformation observable online, resulting in what Mark Manson has described as ‘the never-ending stream of non sequiturs and self-referential garbage that passes in through our eyes and out of our brains at the speed of a touchscreen’ (Manson, 2014).

Technological approaches

We describe complexity and fragmentation in this chapter in order to recognize that digital interface designers can account for the systemic implications of their work in their design practice. Next, we want to enable a discussion of what we should do about complexity and fragmentation, and what the consequences are of dealing with these issues.


What follows are some practical, theoretical tools and methods for dealing with complexity and fragmentation. This discussion should be seen as a way to start acting in the micro context of the interface, and on the macro issues of complexity and fragmentation in digital experience. One way of handling the kinds of complexity that result from the extreme fragmentation of the social and political sphere, as it is experienced through digital systems, has been via systems theory. Systems theory provides a framework for analysing intersecting phenomena at various scales by paying particular attention to the ways they interact. One characteristic of systems theory is that it crosses disciplinary boundaries and has been used to investigate social systems (Bateson, 1972), psychology (Maturana, 1980) and biology (Hodgkin and Huxley, 1952), often finding a way to describe the interdependencies of these knowledge domains by highlighting the synergistic entanglement of their temporal and spatial boundaries. More recently, systems theory has received renewed theoretical attention in the form of actor network theory (Latour, 2005), which proposes that social and natural phenomena are comprised of networks of relations that change through time, and that nothing can exist outside of these relations. This idea has proved influential in the field of technology, where it has been used to describe the relationships between network infrastructure, electricity, computer code, devices, people and societies. For example, Fuchsberger et al. (2014) use actor network theory as the basis for research into digital, physical and social materiality in a semiconductor factory. Living with digital interfaces involves a complex interplay of attention, interaction, screens, code, markets and network infrastructure. 
Systems theory offers a lens through which to view this fragmented territory, one that takes fragmentation as contributing to complexity and seeks to explain its effects without being reductive or metaphorical.

‘Cybernetics’ refers to a particular direction taken by systems theorists in the information age. It is concerned with how technical and natural systems respond to information. The key terms in cybernetics are control and feedback, referring to the governing mechanisms of technical systems and the ways that systems tend towards self-regulated constraint (the result of a negative feedback loop) or amplification (the result of a positive feedback loop). Dubberley and Pangaro (2015) draw the threads of culture, computing and cybernetics together to tell the story of cybernetic theory’s connection to digital interfaces and design. Key figures in this story include Gordon Pask, Ted Nelson and Norbert Wiener, whose work attempted to integrate the control mechanisms of biological systems with electrical engineering and neuroscience. Driving all this was computing, which, influenced by Shannon’s theory of communication, was seen as the engine of development in multidisciplinary research. The connection with design is
found in unifying design ideas such as Christopher Alexander’s Pattern Language (1978) and Herbert Simon’s The Science of Design (1988). The former finds a distinct echo in Google’s ‘material design’ guidelines; the latter provides a basis for the application of design thinking in business and science. Cybernetics has shown how information in the form of computer code influences systems far beyond the immediate input mechanisms of mouse, keyboard, gesture or internet connection.

Complexity theory deals directly with uncertainty and non-linearity and has paid detailed attention to organizational change, particularly how organizations (and organisms) become more sustainable, adaptive and innovative. A key concept in complexity theory is complex adaptive systems. The classic example is human cells, which have individual functions but feature emergent properties when grouped, and can adapt to their environment. Complexity theory aims to identify the guiding characteristics of such systems. These include the importance of diversity for system resilience (evident in genetic inheritance), decentralized mechanisms of control (shown by systems of local government) and emergence (demonstrated by flocking birds). The World Wide Web is a complex adaptive system as it responds with network reach to the millions of new connections that happen every minute. It consists of a decentralized set of networks and features emergent properties at the technical level when data packets are refused or denied, and viral media at the content level. Living with digital interfaces from the perspective of complexity theory means inhabiting and populating a complex adaptive system, contributing to its complexity through our interactions with it.

As ways of addressing complexity, systems theory, cybernetics and complexity theory all have related but distinct approaches.
They are all characterized by a desire to view technical and natural phenomena as interconnected, and by a refusal to reduce or simplify.
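The cybernetic distinction between negative and positive feedback described above can be illustrated numerically. This is a generic control-loop sketch, not drawn from any of the cited authors; the set point, gains and step counts are arbitrary illustrative values.

```python
# A small numerical sketch of negative feedback (self-regulation towards
# a set point, like a thermostat) versus positive feedback (amplification
# of deviation).

def step(value, setpoint, gain):
    """One feedback step: adjust the value by gain times the error."""
    return value + gain * (setpoint - value)

def run(value, setpoint, gain, steps):
    """Iterate the loop and return the final value."""
    for _ in range(steps):
        value = step(value, setpoint, gain)
    return value

if __name__ == "__main__":
    # Negative feedback: a corrective gain damps the error each step.
    print(round(run(0.0, 10.0, 0.5, 20), 3))  # 10.0 (converges)
    # Positive feedback: a gain that pushes away from the set point
    # multiplies the deviation instead of damping it.
    print(run(9.0, 10.0, -0.5, 20))  # diverges far from 10.0
```

With gain 0.5 the error halves every step (self-regulated constraint); with gain −0.5 the deviation grows by half every step (amplification), which is the structure behind runaway dynamics such as virality.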

Design approaches

Digital interfaces encourage a form of individuation that has a fragmentary effect. For example, the early understanding of websites as a kind of publishing platform has resulted in the current fragmented news landscape. As mainstream news organizations took themselves online, and digital-first news sites emerged, people looking for news content were forced to visit many separate websites. In contrast, news magazine sites such as ‘Feedly’ or ‘Metacritic’ produce no content of their own; they simply gather stories from elsewhere on the Web and package them into a new interface. This method of synthesizing the
complexity arising from the overwhelming multitude of digital interfaces and systems is aggregation. Aggregation offers a way of collecting disparate data sources and devising a new delivery method for them. Designing aggregating interfaces involves a technical scraping function that groups chosen content, and an interaction and interface function that delivers this content to people. Using an aggregation interface means navigating news stories that are consumed separately from their editorial context. The particular ‘voice’ of a news source – which arises from its political stance, its editorial decision-making and its individual writers – recedes. While complexity may decrease, a new kind of fragmentation comes into play. By removing news stories from their original context and presenting them adjacent to competing or contrasting content, the aggregation site itself embodies a fragmented news-scape, one that both encourages and prevents a consistent news delivery mechanism.

Deriving categories of analysis by grouping, clustering and merging is the process of synthesis. The effect of digital interfaces on everyday life is characterized by a procession of syntheses. For example, Facebook synthesizes all the activity in your personal feed down to those posts and comments that its proprietary algorithms consider most likely to lead to more clicks and thus more advertising revenue, via a process of further synthesis by which the system gains knowledge about your habits, attitudes and opinions. These sequences of layered and nested syntheses are products of the ways that designers handle the complexity of unfolding interactions through time. Living with digital interfaces involves living with synthesis.

Jon Kolko has defined synthesis in design as making sense of chaos – what he calls abductive reasoning. He says, ‘Synthesis reveals a cohesion and sense of continuity; synthesis indicates a push towards organization, reduction and clarity’ (Kolko, 2010).
Daniel Fallman describes a similar process: ‘Fieldwork, theory and evaluation data provide systematic input ... but do not by themselves provide the necessary whole. For the latter, there is only design’ (Fallman, 2003). Abductive reasoning is positioned as distinct from deductive or inductive thinking, because it holds the possibility of leading to new knowledge based on a type of inference enriched by personal experience and skill. ‘It is the hypothesis that makes the most sense given observed phenomenon or data and based on prior experience’ (Kolko, 2010). Pragmatist philosopher Charles Peirce describes abduction as ‘the idea of putting together what we had never before dreamed of putting together, which flashes the new suggestion before our contemplation’ (Peirce, 1988). The form of synthesis found in design involves a gathering of images, evidence and responses – a field of impressions that coalesce through the application of various techniques such as prioritization, framing and connecting.
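The aggregation interface described earlier, a scraping function that groups chosen content plus a delivery function that presents it, can be sketched in miniature. The feed items and topic keywords below are invented for illustration; no real aggregator works exactly this way.

```python
from collections import defaultdict

# Invented sample items standing in for scraped feeds from several outlets.
ITEMS = [
    {"source": "Outlet A", "title": "Council approves new cycle lanes"},
    {"source": "Outlet B", "title": "Cycle lanes divide local businesses"},
    {"source": "Outlet C", "title": "Election results expected tonight"},
]

TOPICS = {"cycling": ["cycle", "bike"], "politics": ["election", "council"]}

def group_by_topic(items):
    """Scraping/grouping step: bucket items under every topic they match."""
    grouped = defaultdict(list)
    for item in items:
        for topic, keywords in TOPICS.items():
            if any(k in item["title"].lower() for k in keywords):
                grouped[topic].append(item)
    return grouped

def render(grouped):
    """Delivery step: flatten the groups into one feed, stripped of each
    outlet's original editorial context (the fragmentation described above)."""
    lines = []
    for topic, items in sorted(grouped.items()):
        lines.append(topic.upper())
        lines.extend(f"  {i['title']} ({i['source']})" for i in items)
    return "\n".join(lines)

print(render(group_by_topic(ITEMS)))
```

Note how the first story lands in both buckets: grouping by keyword places stories from different sources side by side, detached from the ‘voice’ of their outlets.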


Designers set constraints as a way of defining the limits of what is possible or desirable in a design situation. Constraints can be determined by resources of time, skill or material; they allow designers to act in a determined space of opportunity and achievability. The idea of constraints in design can be traced back to cybernetics, in particular to Ross Ashby, who defined constraints in terms of the reduction of variety found in a system (Ashby, 1956). In design, it is the designer who imposes constraints on what Horst Rittel calls the ‘design space’ (Rittel and Webber, 1973): constraints are design options that are excluded from consideration. Two types of constraint are present in design. The first limits the extent of a design process or the extent of the components needed to design a product. The second is an agreed limit to the conceptual territories available to designers. Onarheim and Wiltschnig (2010) find a dual capacity of constraints in design: to simultaneously enable and limit what designers can do. Digital interfaces are limited by many different types of constraint, including screen dimensions, software operability, network reliability, processor speed and battery life. Designers of digital interfaces must work within these constraints and look for opportunity in the ensuing design space for practical, aesthetic and equitable outcomes.
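Ashby's link between constraint and variety can be made concrete with a toy enumeration. The design options and constraint rules below are invented purely for illustration:

```python
from itertools import product

# Invented design space: every combination of these options.
layouts = ["list", "grid", "carousel"]
themes = ["light", "dark"]
targets = ["phone", "desktop"]

design_space = list(product(layouts, themes, targets))

# Constraints exclude options from consideration (Ashby: reduced variety).
constraints = [
    lambda d: not (d[0] == "carousel" and d[2] == "desktop"),  # resource limit
    lambda d: d[1] == "dark" or d[2] == "phone",               # brief requirement
]

feasible = [d for d in design_space if all(c(d) for c in constraints)]
print(f"variety before: {len(design_space)}, after: {len(feasible)}")
```

Each constraint is a predicate that rules options out; the ‘variety’ of the design space is simply the number of combinations that survive, which is the space of opportunity the designer then works within.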

Research methods

The complexity of human behaviour where it intersects with similarly complex and fragmented digital technologies has led to a range of methods that are useful for analysing and designing for this complexity. Methods of inquiry in the physical sciences tend to prioritize hypothesis testing and statistical analysis. For example, the gold standard of medical research is the randomized controlled trial, in which medicines are tested for their effect with a sample of patients, half of whom receive a placebo. The emphasis is on the gathering of evidence in the form of numbers to support a claim, and the subsequent mathematical analysis of that evidence. In digital interface design, these types of methods are used to evaluate existing systems. For example, eye-tracking technologies capture where on a screen people tend to look most often, or for longest. That area then becomes the location for important information. A/B testing – where people are exposed to two different versions of a design and inference is drawn from their reactions as to which is the most effective – is used to develop and change designs. In both cases, the direction of research is towards quantitative measurement in the form of statistical generalizability – if most people click the green button, let’s make the buttons green.
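The A/B test described above reduces to a comparison of click-through proportions. A minimal sketch follows: the counts are invented, and the 1.96 threshold corresponds to a conventional 95 per cent two-sided significance level in a two-proportion z-test; real experimentation platforms add much more (power analysis, sequential testing, and so on).

```python
import math

def ab_result(clicks_a, views_a, clicks_b, views_b):
    """Compare two variants' click-through rates with a two-proportion z-test."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    return {"rate_a": p_a, "rate_b": p_b, "z": z, "significant": abs(z) > 1.96}

# Invented example: the green button (B) outperforms the blue one (A).
print(ab_result(clicks_a=120, views_a=2000, clicks_b=165, views_b=2000))
```

The point of the statistical machinery is exactly the generalizability the text mentions: a raw difference in clicks only licenses the ‘make the buttons green’ conclusion if it is unlikely to be noise.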


In social science, research aims are often to uncover the intricacies and complexities of human behaviour, social structure, thoughts, cultures and histories. The way that social scientists do this involves a range of techniques, some specific to individual disciplines. Units of analysis in political science, anthropology or history may vary, but they often focus on interpretive examination of documents. The positivist tradition in social science seeks to establish empirical evidence using quantitative methods. In contrast, the interpretive approach is more interested in the qualities of human experience as revealed by descriptive analysis. This is the qualitative research paradigm. The aim of qualitative research is, in Geertz’s famous phrase, ‘thick description’ (Geertz, 2008). That is, methods in qualitative research are intended to account for the messiness and entangled nature of everyday life. To that end, qualitative researchers often start with observation, including note-taking, video and photographic observation. They may then move to the elicitation of spoken responses in the form of interviews, conversations, and even songs and poems. In archaeology, physical artefacts are important, and so careful examination of the symbolic and social functions of objects is an established technique alongside more quantitative methods such as radiocarbon dating. An important aspect of qualitative research is that it is unavoidably political. Researchers pay deliberate attention to the positions of power they may unwittingly adopt with regard to their participants and respondents in a way that researchers in, say, physics are not usually required to do. This makes qualitative research explicitly a moral and ethical enterprise.

Design research as an academic discipline values a foundational document – Research in Art and Design – in which the author Christopher Frayling outlines three approaches, labelled research into, through and for design (Frayling, 1993).
Research into design involves the study of existing designs, while research for design involves the invention of new materials and forms. Design has long resisted being classed as a science in order to make room for practice – that is, making things, generating forms and artefacts in the process of doing research: Frayling’s research through design. This has led to an enriching methodological diversity through which designers act in ways familiar to both quantitative and qualitative researchers, but also in ways distinctive to design. One of these is participative or co-design (Sanders and Stappers, 2014), in which designers work with people to create new things. This may be a part of the product design process, where possible future users of an object or interface are consulted throughout the design process so as to refine its end state to one validated by those who may use it. Another distinctive approach used by design researchers is metadesign, where the design work involves creating a metaphorical (and sometimes physical) space within which


non-designers can be creatively engaged. Designers are profoundly involved with materials. It is through engagement with materials that designers seek to decode and analyse the entangled complexity and fragmentation of socio-technical relations.

In summary, the technological and cognitive fragmentation that arises from living with digital interfaces leads to an increasingly complex and entangled socio-technical world. People interact with individual devices dependent on technical and political systems that embody specific values at personal, local and global scales. The effects of this fragmentation include a loss of attention, interpersonal ability, ownership and empowerment. The increase in complexity makes it difficult for people to understand what is going on in the background of the software and hardware they use every day, and how it shapes their views, relationships and actions. Traditional ways of countering fragmentation include theory and synthesis. Countering the effects of complexity has been done by thinking in systems. Designers have done this through the application of constraints, a reflective awareness of process, and an insistence on material engagement. In the end, perhaps we need to recognize as designers that our views of the world around us can only ever be fragmented and conditional, but that we should strive to act in ways that respect all living things. We should recognize that we can only ever see whatever the specific set of social and technical circumstances we inhabit allows us to see. We must share our skills and abilities widely, and remember that there are many forms of knowledge with which to enrich our understanding.


Chapter 2 Social interfaces


As discussed in the introduction, digital interfaces are often understood as objects of translation from one form of representation (i.e. machine readable) to another (i.e. human readable). Interfaces exist at every moment where one form of representation encounters another: hardware components (via voltages) to system software, system software to application software, and so on. This reading of the interface denotes it as an apparatus or a well-defined device with a well-defined purpose (following Friedrich Kittler) and privileges the interface as an object over its effects. Alexander Galloway uses Gilles Deleuze’s essay ‘What is a dispositif?’ (1992) to emphasize that ‘one should not focus so much on devices or apparatuses as such and more on the physical systems of power they mobilize’ (Deleuze in Galloway, 2012: 18). Galloway suggests that interfaces should be studied and understood as components of larger social systems, rather than as simple devices or technical components. Galloway’s critique of Kittler’s studies of technical media is that a strong focus on technical media in itself cannot account for the complexities of digital media. He proposes a shift in thinking from media to mediation: ‘A philosophy of mediation will tend to proliferate multiplicity; a philosophy of media will tend to agglomerate difference into reified objects’ (Galloway, 2012: 17). This distinction between media and mediation is valuable when considering and designing with digital objects, because it positions them as being inside and part of social interactions. This chapter explores interfaces that enable, act on, determine, build and manipulate social interactions and social relations, both intentionally and unintentionally, as they are experienced through digital interfaces. The social here is played out in our interactions with each other, with technologies and with natures.
It is not a given, predetermined entity (a society), but rather a diverse entanglement in which non-human and human agencies all play a role in continuously constructing and reconstructing our daily social lives. We are considering the social effects of interfaces: how they permit or restrict access to certain users, reinforce or disrupt social hierarchies


and construct identities, seek to direct and influence behaviours, and invoke and perform norms. This underlines the idea that societies are as much a product of the individuals within them as individuals are products of their societies. It’s easy, in discussions about technology and society, to fall into technological determinism, which suggests that technology shapes society. Here, we are exploring how social interactions shape and enable the development of digital interfaces, and how interfaces simultaneously shape social relations. This chapter concludes with suggestions of methods for designers to develop critical awareness of the social implications of their work, and to design critically with and through the tools at their disposal.

Design for social impact

Information technologies have been harnessed for social impact throughout a wide range of fields, from healthcare and education to environment and sustainability. The recognition of the ‘power of design to address pressing global challenges’ (World Design Summit, 2017) was emphasized by the World Design Summit Montreal in 2017. If one couples this with wider trans-global aims such as ending poverty, protecting the planet and ensuring prosperity for all, as proposed by the UN Sustainable Development Goals, then interface design plays an important part in manifesting and making accessible various projects and initiatives, some of which will be discussed in this chapter. There are many examples of initiatives that leverage the ‘power of the crowd’ via digital platforms. The term ‘power of the crowd’ has arisen in the context of network culture. By some, it is seen as a means to democratize the tools of production, consumption and dissemination, while others highlight the opportunities (and problems) of digital work done through methods such as crowdsourcing and telepresence. The idea of crowdsourcing has been heavily criticized because it can atomize and alienate workers, thereby diminishing their voices. Additionally, the digital labour market is mostly unregulated. Ride-sharing company Uber is discussed as an example of digital labour in more detail in Chapter 3. In this chapter, we will focus on the opportunities for social impact, and the social consequences, afforded by interfaces. GoodGym is a project that uses the power of the crowd without being linked to a financial exchange. The project grew out of frustration with ‘normal gyms being a waste of energy and human potential’ (GoodGym, n.d.). It was first submitted as a small-scale project to start-up accelerator Social Innovation Camp in 2008, and has grown


to operate in forty areas within the UK; GoodGym is now on track to support almost 30,000 people. GoodGym works by marrying staying fit through running with neglected tasks within a community, such as battling loneliness and isolation amongst an older population. For example, runs could involve a one-off task for a member of the community who might not otherwise be able to do the task themselves, like clearing a garden. Or the runs could incorporate more regular meetings with people within the community, to address isolation (GoodGym, n.d.). GoodGym can be understood as a new type of digital community service in the wider context of austerity within the UK, which has resulted in significant cuts to council budgets, real-term spending cuts to public services and the National Health Service, coupled with a government agenda of ‘Big Society’. The ‘Big Society’ is a political concept that became a flagship policy of the Conservative general election manifesto in 2010, aiming to empower local communities and people through devolution and volunteerism, all of which coincided with large cuts to public spending.

While GoodGym is a great solution to addressing the gap left by the withdrawal of public services, it also provides a useful example through which we might question how we conceive of and address the problems that digital interfaces often attempt to solve. To address the root of the problem that GoodGym is attempting to solve, one must ask why some older people feel isolated in one of the world’s wealthiest and most well-connected countries. GoodGym can be seen as a solution to a pressing issue, but also as a symptom of the changing relationship between government, technology and society. The further problem that arises when projects like GoodGym are seen in the context of the ‘Big Society’ is that it shifts the responsibility of care from the state, which has a duty to the whole territory, to individuals, small groups and businesses.
Smaller organizations do not have the same resources and access to territory, thereby fragmenting the services provided, based on additional contributing factors for digital interfaces such as accessibility and network coverage. Looking at the map of GoodGym’s coverage, it becomes clear that it is predominantly urban. Of course, population density is partially responsible, but this coverage also highlights the differing quality of connectivity between rural and urban areas. There is still an inequality in digital skills acquisition, which plays out most sharply across generations. It was never within GoodGym’s remit to address the digital divide; however, as a digital project it functions within it. Mariana Mazzucato deconstructs what she calls the ‘myth of the public vs private sectors’. In her book The Entrepreneurial State, she points out that the risks inherent in innovation have often been


carried by the state, not by the private sector, and that, in the past, public agencies were most often the places where innovation happened. This in turn benefitted the private sector, a reversal of how the relationship is currently conceived (Mazzucato, 2018). The examples that Mazzucato gives include what later became the internet (developed through DARPA funding) and the BBC’s learning programme on computer code in the 1980s. The increasing responsibility of designers and technologists to develop solutions to social issues, without the support of the state and at great risk, often puts the burden of that risk onto those whom the designers and technologists are trying to help. If GoodGym fails to finance its operation, for whatever reason, then the people it supports will also be failed, with no recourse to state services. In this way, the emphasis on social good and social care finds itself at odds with the inherent risk involved in start-ups and financing.

Soft interfaces: healthcare and loneliness

In 2018, after a twelve-year pause, Sony released the next generation of Aibo, a robotic pet and ‘smart’ companion. This iconic pet was first introduced as a consumer product in 1999 and by 2006, when Sony decided to discontinue Aibo, it had been added to the Robot Hall of Fame at Carnegie Mellon University. In Japan, Aibo robot pets are given much the same farewell as humans, attesting to the emotional bonds that had been created between owners and their robotic pets. The newest version promises to be able to form emotional bonds with its owners by using artificial intelligence to learn owners’ behavioural patterns and respond accordingly, working towards simulating the relationships that people develop with their pets. Aibo’s face mimics facial expressions, working with eye contact and ‘conveying emotions intuitively’ (Sony, n.d.). Aibo acts as a mirror to ourselves in that the material of the device does not actually respond to emotions or learned social behaviours, but is designed to accommodate human expectations of how living, animate beings should behave. The project speaks to a need to anthropomorphize the objects we surround ourselves with. Anthropomorphism is the process by which inanimate objects and devices are designed to resemble living or even human forms. It is an example of the interface as surface and mirror, obscuring the complexity of Bratton’s ‘stack’, and the interface effects, by conveying them as somehow alive and animate. Aibo conceals the fragmentation


and complexity of our digital lives by creating a reflective surface between user and object – a mirror in which users seemingly recognize living forms. The ‘conveyance of intuitive emotions’ through expressive eyes and gestures is an interface design choice, not the emergent behaviour of an actual living thing.

This is not the only example of robotics projects that address our human need for companionship. PARO is a therapeutic seal robot developed by AIST in Japan. It is used in healthcare settings to bring animal-assisted therapy (AAT) into situations where live animals would be inappropriate. PARO is also used as a companion in homes for elderly people, and for dementia care. At a time when many wealthy countries are struggling with an increasing older population and ever larger healthcare bills, ‘carebots’ like PARO seem an obvious and affordable solution to combat loneliness and the need for companionship. There are, however, ethical considerations to be made: Does it matter that some dementia patients won’t know or understand that they are interacting with a robotic carebot rather than a real seal? And more importantly, what does the need for such carebots say about the relationship between human carers and their patients? How does PARO address the often-difficult working conditions of carers? How does it address the messy lives that we all inevitably live, in which stress, working conditions and interhuman relationships play an important part? As demonstrated in the example of GoodGym, PARO is also deployed as a technology which is used to solve a symptom of a much wider social issue in which the underlying causes remain unaddressed. PARO and, to a certain extent, Aibo could be described as ‘socializing’ interfaces. They are designed to create social interactions between themselves and their users, where the interface itself is the subject of social interaction.
This makes it even more urgent to question the ethical implications and consequences of these artefacts – especially with a view to what their impact might be on the human relationships we develop. Both PARO and Aibo are intentionally developed socializing interfaces designed to address specific needs, but the increasing fidelity of other types of interfaces – such as smart assistants – has also begun to position them as socializing interfaces. For example, over a million people asked Amazon’s Alexa to marry them in 2017 (Leskin, 2018), and though most of these requests were probably jokes or provocations, there is an increasing body of research that suggests children in particular find it difficult to differentiate between technological interfaces and living beings. Sherry Turkle’s The Second Self includes a comprehensive study of children’s interactions with computers, and how they struggle to reconcile their knowledge of a


computer as a technical object with their feelings that it is somehow alive (Turkle, 2005). This dichotomy is further entrenched in an age of talking, responsive and apparently autonomous interfaces such as Amazon’s Alexa and Apple’s Siri. This is especially problematic when users, whether children or parents, may not conceive of these assistants as interfaces as defined by Bratton’s Stack, and where ‘thinking of these “friends” and “mentors” as subordinates may obscure the fact that many of them will effectively serve as spies’ for targeted advertising or surveillance (Shulevitz, 2014). The ways that these ‘living’ interfaces may reshape social interactions remain to be seen, but their potential impacts need to be considered by the designer. This is especially important when we consider how these socializing interfaces could use their fidelity and sociality for exploitation or deceit, or, more prosaically, how the socialization of commercial activities is normalized from the earliest ages of childhood.

Accessibility: democratization of tools

Sugar, the graphical user interface (GUI) for the One Laptop per Child (OLPC) project, is another example of an interface design aiming for a wide social impact. The OLPC project was set up in 2005, working across the world to ‘empower the world’s poorest children through education’ (onelaptopperchild.org) by providing free laptops to school-age children in developing countries. OLPC develops hardware, content and software for ‘collaborative, joyful, and self-empowered learning’ (onelaptopperchild.org, n.d.). It was founded by Nicholas Negroponte, founder and chairman of the Massachusetts Institute of Technology’s (MIT) Media Lab, and draws on pedagogy pioneered by (amongst others) Seymour Papert, who also worked at the MIT Media Lab on constructionism, mathematics and artificial intelligence. Constructionism emphasizes experience-based learning and suggests that people construct knowledge for themselves in interaction with their environment. It takes constructivism one step further by exploring ‘learning through making’. In order to achieve the OLPC’s aim, the NGO developed the OLPC XO laptop and custom-made open source software. The laptop features an ultra-low-power screen, flash memory and wireless mesh networking between machines. The XO’s Sugar interface does not have a desktop, folders or windows – it doesn’t follow the pervasive desktop metaphor of most classic graphical user interfaces, as discussed in our introduction. Instead, it is arranged around a set of individual activities and programs


that can only be used one at a time, such as browsing, calculating or chat. A journaling facility automatically saves every session. The idea is that children can work on clearly defined tasks without the system producing distractions or interruptions. The distractions designed into most contemporary interfaces through common features – such as notifications and bottomless scrolling – finance an ‘attention economy’ based on views, clicks and interactions. These principles are part of a timely discussion amongst design ethicists like Tristan Harris, the former Google design ethicist and co-founder of the Center for Humane Technology (Harris, 2016). While the Sugar interface was developed at least a decade earlier, it is interesting to read its interface design as a prediction of this debate. We discuss the interface design elements that are used to draw our attention in more detail in Chapter 5.

Figure 2.1

OLPC’s Sugar interface. The Sugar interface is a desktop environment that was developed specifically for the OLPC project. It was designed for interactive learning for children. It does not use the common metaphors of desktops, multiple folders and windows. It runs one program at a time, limiting interaction to the task at hand.


The Sugar Zoom interface also enables collaborative and networked working. Its neighbourhood feature displays all the other OLPC devices and activities in a child’s community. It ‘graphically captures their world of fellow learners and teachers as collaborators, emphasizing the connections within the community, among people, and their activities’ (onelaptopperchild.org, n.d.). Furthermore, it does not require literacy in any language, but works purely through the use of pictographs. It was developed from scratch in collaboration with interdisciplinary teams, including teams from design agency Pentagram and open-source catalysts Red Hat, and its aim was to revolutionize computer interfaces and democratize digital infrastructure by making hardware and software more accessible. OLPC has attracted criticism for what has been seen as an overly western view of priorities for developing countries. Furthermore, it has been criticized by some for a centralized and top-down approach to designing the product. The lessons of OLPC raise important challenges for designers in terms of agency and participatory design processes. Working with a bottom-up approach, and making sure that diverse user groups are not only consulted but actively participate in the design process, becomes an ever-increasing requirement. Similar projects, such as India’s $20 ‘Sakshat’ laptop and Argentina’s ‘Conectar Igualdad’, built on Microsoft Windows, but the XO remains a rare example of a digital interface design intended specifically for social impact in the context of education in different countries. It has granted an estimated 2.5 million children access to computer technology.

Collaborative interfaces: beyond western-centrism

The One Laptop per Child (OLPC) project also illustrates an often-held conception that the development of digital technologies takes place in the western world, and through a very particular view of the world. In 2015, Google Photos, a photo sharing and storage service developed by Google, introduced a simple image-recognition feature which would tag uploaded images with keywords based on their content. These systems are often prone to failure, and in a notorious incident, Google Photos tagged two African-Americans as gorillas. The algorithm behind the image-recognition tool for Google Photos had been trained on a limited set of data containing few people with non-white skin to learn from (Hern, 2018). This is an apt demonstration of a common phenomenon in which human biases are translated into machine biases, and it demonstrates how algorithms and other technical systems are never neutral, but often carry the biases of their designers. Building


and designing interfaces therefore requires us to question our own biases and to make sure the design work we do reflects the diversity of our users. Another example is Microsoft’s ‘Tay’, a chatbot deployed to Twitter in 2016 and designed to demonstrate Microsoft’s machine learning capabilities by responding to and conversing with users. However, sixteen hours after launch, Tay was removed, after Twitter users bombarded it with racist and anti-Semitic tweets to the point that it, too, began to tweet racist and anti-Semitic statements. Google similarly removed all tagging that referred to gorillas when its ‘intelligent’ system displayed socially inappropriate behaviours. Removing and turning off these systems prevented further harm, but highlights the inability of the world’s largest technology companies to respond to bias in the process of development, or in the deployment of new interfaces. It is increasingly important to ensure that diversity and ethical principles are built into systems from the outset. Equally important is to be able to hold companies that build digital systems to account, which requires an open and transparent approach to building software and the resulting interfaces.

In 2013, Mark Zuckerberg, founder of Facebook, developed the idea of bringing the internet to places that were struggling to develop the infrastructure. He declared access to data a basic human right. He planned to work together with the telecommunications companies of countries with less developed networks in order to provide limited, but free, data to customers through a free app, developed by Facebook, with limited access to services that were selected by Facebook (and which included Facebook). This was presented as a humanitarian effort to fulfil a human right of access to free data, while helping the economy to grow.
Critics of the proposed project voiced concern, and were especially vocal in India, which was where Facebook planned to roll out the pilot in a bid to capture a large and, as yet, disconnected user base. In India at that time, the idea of net neutrality was being openly debated. Facebook’s project was heavily criticized in this context, because it was seen as building a monopoly around internet access. By February 2016, the Telecom Regulatory Authority of India ruled that net neutrality would be upheld, which put an abrupt stop to Facebook’s project to provide access to the internet across India. Facebook is a profit-driven company, and while it considered its intentions to be humanitarian, its goal was ultimately still market-driven. Beyond that, the project demonstrates an attitude toward innovation in which the global presence and success of Facebook provided a perceived justification for large-scale infrastructure development and governance. Again, as with GoodGym and others, we see the intersection of technical systems deployed on a large scale, where the responsibilities of the state and the aspirations of business are thrown into conflict.
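The training-data failure behind the Google Photos incident discussed above can be reproduced in miniature. The toy ‘classifier’ below is a deliberately simplified nearest-centroid model on invented one-dimensional data, nothing like Google's actual system, but it shows how a class that is barely represented in training data ends up systematically mislabelled:

```python
# Toy 1-D "classifier": assign a sample to the nearest class centroid.
# Training data is skewed: class "b" is barely represented, so its
# centroid is a poor estimate and nearby samples get mislabelled.
train = {"a": [1.0, 1.2, 0.9, 1.1, 1.0, 0.8], "b": [4.0]}  # skewed dataset

centroids = {label: sum(xs) / len(xs) for label, xs in train.items()}

def classify(x):
    return min(centroids, key=lambda label: abs(x - centroids[label]))

# A genuine "b" sample at 2.4: with a representative "b" set (say, centred
# on 3.0) it would classify correctly, but the skewed centroid at 4.0
# leaves it closer to "a", so it is pulled into the majority class.
print(classify(2.4))
```

Because class ‘b’ contributes only a single training example, samples that a representative dataset would classify correctly are absorbed into the majority class: the translation of dataset bias into machine bias that the Google Photos example illustrates.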


There are plenty of examples of innovative developments beyond western centrism. Ushahidi, which translates as ‘testimony’ in Swahili, is one such example. It was developed to map reports of violence in Kenya in the aftermath of the disputed 2007 election. Ushahidi aggregates data from key social media platforms together with its own custom data from surveys via smartphone apps and SMS submissions, in order to generate useful and timely information in territories where state actors may struggle to intervene. It combines citizen journalism with geospatial information to empower communities. This kind of platform can have a significant social impact: it makes communities more resilient and supports social institutions struggling for funding in areas facing economic and environmental stress. The social impact of supporting free and fair elections cannot be overstated, and this is why the UN dedicates such resources to the effort.
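The aggregation logic behind a platform like Ushahidi can be sketched in a few lines: reports arrive from different channels (SMS, app, social media), are tagged with a category and a location, and are binned into coarse map cells so that hotspots become legible. This is a minimal illustrative sketch only; the field names, grid size and channels are assumptions, not Ushahidi’s actual implementation.

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class Report:
    source: str      # e.g. "sms", "app", "twitter" -- illustrative channel names
    category: str    # e.g. "violence", "aid-needed"
    lat: float
    lon: float

def grid_cell(lat: float, lon: float, size: float = 0.5) -> tuple:
    """Snap a coordinate to a coarse grid cell for aggregation."""
    return (round(lat / size) * size, round(lon / size) * size)

def aggregate(reports):
    """Count reports per (cell, category), regardless of source channel."""
    counts = Counter()
    for r in reports:
        counts[(grid_cell(r.lat, r.lon), r.category)] += 1
    return counts

reports = [
    Report("sms", "violence", -1.28, 36.82),
    Report("app", "violence", -1.29, 36.81),
    Report("twitter", "aid-needed", -0.10, 34.75),
]
hotspots = aggregate(reports)
# two reports from different channels land in the same map cell,
# making a cluster visible that neither channel shows on its own
```

The point of the design is that the map, not the channel, is the unit of sense-making: the same cell accumulates evidence from SMS, apps and social media alike.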

Figure 2.2 Ushahidi
Ushahidi is a not-for-profit technology company that uses crowdsourcing to collect data and map responses in situations of crisis and human rights reporting, for example. It gives voice to otherwise marginalized people and decentralizes media reporting.

What platforms like Ushahidi demonstrate is that the design of digital interfaces needs to carefully consider the interests of marginal groups and edge cases, and support the development of norms of good governance. It also needs to consider the longevity of its service and design with a long-term plan in mind. In this context, it is interesting to consider the expectations of transparency that systems such as Ushahidi inculcate amongst their users. Users of the service come to expect legible and timely information regarding elections, and one hopes that these expectations develop, over time, into minimum expectations of state agencies.

Immediately after the earthquake in central Mexico on the 19th of September 2017, a group of activists, coders, journalists and lawyers built a platform to provide reliable information about the location of collapsed buildings and collection centres as a layer on Google Maps. To organize aid better, they added digital postcards with detailed information about what was needed at crisis centres. In that way, resources and aid were dispatched more evenly throughout the area. Verificado19s, as the project is called, resulted in an interactive map and a collective database that was updated continuously and in real time, while incoming information was verified. The platform was developed because people realized that misinformation and rumours were hampering rescue efforts. It quickly became a trusted source of information and was used by officials and the general population alike. This is a very localized example of collaborative interface design and collaborative database maintenance that responded to a real problem. The interface and database here were used as tools to aid a more organized human response, rather than acting as a replacement for it. The project also used the same communication channels that enable rumours to spread easily in order to counteract them.

Verificado19s and Ushahidi are important examples of interfaces and systems that emerge from specific social contexts: primarily the shared need for safety and security in times of crisis. These systems can also set precedents and norms for forms of interaction, responsibility and social consciousness once the crisis has passed.
Here, we see examples of how these home-grown systems work, to great success, with the norms and regulations of the places in which they developed, as opposed to the difficulty that Facebook faced when trying to impose its own model and vision on a context already developing its own aspirations for its technological future.

Interfaces for sociality

A lot can be said about social media in terms of the democratization of tools that enable user-generated content and peer-to-peer communication, rather than broadcasting and centralized communication channels like TV and radio. Video conferencing apps like Skype, as well as Facebook Messenger and WhatsApp, enable people to connect globally and conduct meetings from the comfort of their own homes.


It also enables new groups to form beyond national borders and language boundaries. The Brazilian language school Cultural Norte Americano (CNA) set up the Speaking Exchange, in which Brazilian language students who want to learn English are put in contact with seniors living in retirement homes in the United States. CNA developed a custom-made, purpose-built video conferencing interface that connects language students with native English-speaking seniors. The project has received a lot of attention and has won multiple awards. It works intergenerationally, enabling seniors to play an active part in everyday life through a digital interface. It encourages human interaction across the barriers of age, language and national borders, digitally supporting students who would otherwise not have access to native English speakers, and in doing so offers a more inclusive approach to language learning. The Speaking Exchange is a very specialist example of a social platform, in that it has a specific aim and specific user groups, as opposed to global social network platforms.

The current landscape of social networking sites is diverse and manifold; however, the top end of the landscape is occupied by a few large technology companies that dominate the market due to their coverage and user base. Diaspora is an alternative example. It is a decentralized and distributed social network that is owned by its users. It runs via independent servers that are coupled together to create a network, rather than via central servers owned by a single company. The nodes – or pods, as Diaspora calls them – run the Diaspora software. The tactic of using a distributed network means that data privacy and ownership are configured differently than with, for example, Facebook. The user maintains ownership and full control of their data.
No one else has the right to use the data for revenue, and users can download and delete their data whenever they desire. In addition, users can sign up with pseudonyms, adding a level of anonymity that is not possible with some social networks. Diaspora is built on three key ideas: decentralization, freedom and privacy. These ideas are the framework through which the software is developed and decisions are made. At first sight, the interface resembles a stripped-down version of Facebook, with a newsfeed centre-right, a menu bar at the top and a menu on the left-hand side. On closer inspection it becomes clear that Diaspora has more options for customization, which adds a level of complexity. For example, it is possible to format your status message text using a mark-up system. On the other hand, it does not offer as many features as other social networking sites. Diaspora gives two reasons for this: first, the software project is developed entirely by volunteers; and second, every feature is evaluated against the three key values of freedom, decentralization and privacy.
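The difference between Diaspora’s architecture and a centralized network can be sketched as a toy federation: each pod stores its own users’ data and pushes copies to follower pods, with no central server in the loop. The pod names and methods below are illustrative assumptions, not Diaspora’s actual protocol, which is a far richer federation standard.

```python
class Pod:
    """One independently operated server in a federated social network."""
    def __init__(self, name):
        self.name = name
        self.posts = {}            # author -> list of posts (data lives here)
        self.remote_followers = {} # author -> set of Pods subscribing to them

    def publish(self, author, text):
        # The author's own pod keeps the canonical copy...
        self.posts.setdefault(author, []).append(text)
        # ...and pushes copies out to follower pods; no central server involved.
        for pod in self.remote_followers.get(author, set()):
            pod.receive(author, text)

    def receive(self, author, text):
        self.posts.setdefault(author, []).append(text)

    def delete_account(self, author):
        # The user's pod can erase its own copy on demand...
        self.posts.pop(author, None)

a = Pod("pod-a.example")
b = Pod("pod-b.example")
a.remote_followers["alice"] = {b}
a.publish("alice", "hello, federation")
a.delete_account("alice")
# ...but copies already federated to other pods are beyond anyone's
# central reach -- there is no single switch to pull
```

The structure makes both of Diaspora’s properties visible at once: users control their own data on their own pod, and, for the same reason, no one can block or censor an account network-wide.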

Figure 2.3 Diaspora
Diaspora is a decentralized and distributed social network, run via independent servers that are coupled together to create a network. The Diaspora software is run on a distributed network of nodes.

In comparison to social network sites like Facebook or messaging apps like WhatsApp, Diaspora has significantly fewer users. It is estimated that it has about one million users, which is significant for a decentralized network, but not compared to Facebook’s 2.23 billion users. And therein lies some of the issue. Facebook is so successful that it is difficult to compete with. Recent years have seen a flurry of alternative models to Facebook and others, which receive attention in the wake of scandals like that of Cambridge Analytica, but almost inevitably fail to gain popular traction. It is convincingly arguable that Facebook’s penetration is such that in much of the world it forms a social scaffolding that makes it almost impossible to live without. Diaspora also attracts much more specialized user groups, rather than the family and friends that are the reason most people use social networks. In 2014, it made news when it was reported that Islamic State had moved its social media profile to Diaspora after Twitter had blocked its accounts. Since Diaspora is a distributed network, its community has no control over the data that is shared, and cannot access or block it. Contrarily, the centralized control of a platform like Facebook makes targeting and censoring individual users easier. However, this centralization leads to an opacity and a lack of user agency, which makes exploitation easy. These specific use cases highlight another apparent intersection in the rhetoric of social interfaces: the drive for exposure and spread that makes them exciting and popular, set against the technical and social limitations that confine them to specific groups. Social networks like Gab, which provides a haven for alt-right and right-wing extremists ejected from mainstream social networks such as Twitter and Facebook, have risen under the guise of ‘free speech’ and ‘promot[ing] raw, rational, open, and authentic discourse online’ (Grey Ellis, 2016). The isolation of Gab and its creators has made their ostracization, for their tacit support of right-wing extremism, simpler.

Constructing social identities

Online dating apps have multiplied since the advent of smartphones, and Tinder, a location-based app launched in 2012, is one of the most popular. It was one of the first apps to use the swiping motion as a method of interaction. Users swipe right if they like what they see, or left if they don’t. If both users swipe right, a match is created, which allows them to chat to each other. The information provided is quite minimal: a Tinder profile includes just a profile picture and a short bio. Since 2012, the online dating app market has mushroomed, and dating apps now exist for different sexual orientations, fetishes and sexual desires, enabling connection for users with specific interests. This has also enabled dating for people who might be mostly homebound, or who live in more rural areas where opportunities for socializing are reduced.

Questions about algorithmic control have been raised in relation to dating apps, especially where the data collection is less transparent. With sites like match.com or OkCupid, users have to answer a whole range of questions, the answers to which are used to build a profile and find possible matches. An investigation by Tactical Tech Collective and Joana Moll in 2018 found that most dating apps are owned by the same company, the Match Group, and that profiles and their data are packaged and sold for around $150 per one million profiles (Moll, 2018). Tinder, also owned by the Match Group, does not, on the other hand, require you to answer any questions. Your profile is built from your Facebook profile photo and your brief bio. Tinder uses your location, your swiping preferences and your activity levels to calculate who to present to you. Tinder’s CEO, Sean Rad, explains that your swiping and conversation behaviour determines who Tinder shows you:

We look at your behavior and we optimize who we show you based on who you are saying yes or no to. There are a lot of signals that we take into consideration. If you say no to somebody there are a lot of things about that person that we know — whether you had common friends with them, who the common friends were, how old that person is, on and on, what their interests are. We take all that into consideration when serving better recommendations in the future. Also, when you match with somebody, we look at the depth of the conversations you are having with your various matches. You might have a deeper conversation with one person of a certain characteristic or another person of another different characteristic.
Sean Rad in Yury, 2014
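The core mechanic described above, a match created only when two right-swipes coincide, reduces to a few lines of logic. This is a hedged sketch, not Tinder’s implementation: the real system layers the ranking signals Rad describes on top of it, deciding who is offered for the swipe in the first place.

```python
def swipe(likes, matches, user, target, direction):
    """Record a swipe; a match appears only when a right-swipe is mutual."""
    if direction != "right":
        return                                # left swipes never create a match
    likes.add((user, target))
    if (target, user) in likes:               # the other user already liked back
        matches.add(frozenset((user, target)))

likes, matches = set(), set()
swipe(likes, matches, "ana", "ben", "right")  # one-sided: no match yet
swipe(likes, matches, "ben", "cem", "left")   # rejected: nothing happens
swipe(likes, matches, "ben", "ana", "right")  # mutual: match created, chat unlocked
```

The simplicity is the point: everything contentious about the interface, the location data, the activity tracking, the behavioural profiling, sits in the unseen ranking layer, while the visible interaction stays this minimal.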

A lot of our choices, made unconsciously, will also contain unconscious biases that are perpetuated through Tinder’s algorithmic approach. In a conversation with Arif Kornweitz for the Dutch online radio station Ja Ja Ja Nee Nee Nee, Tamar Shafrir and Füsun Türekten raise the question of the construction of self in the context of the Tinder profile. Because so much emphasis is placed on very few visual cues and minimal text, it gives people the opportunity to construct Tinder identities quite carefully. These are always very reduced snapshots, which are part of the fragmented online identity we build up. This online fragmented self, as Shafrir calls it, presents a certain amount of freedom and control, because you can choose what and who you want to present on which platform: in most cases, your Tinder profile will look different to your LinkedIn profile. In the case of Tinder, it also reduces our self to superficial visual cues (Shafrir and Türekten, 2017).

The reduction of the self to fit the format of social interfaces is a necessary part of constructing technical social frameworks. For instance, your country’s tax authority doesn’t need to know your favourite films or childhood memories in order to process you. Equally, you wouldn’t share your income, outgoings and expenses on Facebook. LinkedIn, which is exclusively geared towards career enhancement and professional networking, invites users to share recent achievements, awards or positions, while Facebook invites memories and experiences. In each of these interactions, we reduce ourselves to fit the normative interactions of the specific network. This reduction predates social networks and state bureaucracy: the person you present to your parents is probably very different from the one you present to your friends.

The idea of social interfaces can be interchanged with that of platforms: we operate on and across different networks as forms of access to certain types of interaction, opportunities and networks. Facebook is a platform for friends and family; LinkedIn is a platform for professional networks and colleagues. However, something like your tax authority or another state organization, which, as mentioned at the beginning of this chapter, often forms the scaffolding through which social relations are enabled, has rarely been a platform. An exception is perhaps the ultra-rich, for whom concepts such as citizenship (and consequently tax responsibility, amongst others) are manipulable with the correct financial leverage. However, the digitization of the state and its organizations has begun to indicate its potential as a platform for fluid identities for more individuals. The Estonian e-Residency card gives the owner a digital identity, issued by the Estonian government, and provides access to a range of government services to help start and manage an EU company while being location independent. The services on offer for e-Residency are limited to those you would need for business transactions, including opening a bank account in Estonia; however, the concept of a digital residency looks different in the current context of increasing nationalism in Europe. Dan Hill argues that the e-Residency card, despite its limitations, asks ‘how we might be able to think more richly of “both/and” in terms of identity, of being part of nations, cities and the world. This implies respect for, and understanding of, both the local and the global’ (Hill, 2017).
This could lead to nations acting as a ‘platform for other identities than its own’ (ibid), which the Estonian e-Residency card at least suggests as a possibility. And, while the actual design of the interface might not yet be fully complete, it is an example of a ‘country as a service’ (ibid), accessible through interfaces and user experiences. The Estonian e-Residency implies a new debate about access to sovereignty and citizenship through social interfaces. While this privilege is currently largely confined to the ultra-rich, it is often spoken of in quasi-utopian terms. Technological cultures, such as those of Silicon Valley, have regularly come into conflict with state institutions, which they often see as restrictive. Much like other aspects of the social roles of interfaces, there are complex, intersecting needs, aspirations and technical limitations to be considered.

Interfaces, in turn, shape our social circumstances. Interfaces like GoodGym and PARO should encourage designers to ask questions about the assumptions made by society as to the responsibility of design in improving the social lives of users. Similarly, the enormous and seemingly unstoppable impact of services such as Facebook needs to be acknowledged for its potential for exploitation, as much as for its positive impact on sustaining individuals and communities. As we move into the future, and more and more social institutions, up to and including the state itself, are turned into platforms, it is more important than ever for interface designers to consider the reach and effect of the screen and surface in shaping society.

Figure 2.4 Estonian e-Residency kit
The e-Residency kit contains the e-Residency card and a card reader.


Chapter 3 Legal and political interfaces


The political dimensions of technology have a direct impact on the way we live with digital interfaces, and on how we use them to live with each other. In this chapter, we will examine the political dimensions of interfaces and how they function to channel behaviours as well as shape information. We will look at practices that challenge these design principles and propose alternatives for how we could interact legally and politically through and with interfaces. The ‘political’ refers to the set of social rules, biases and expectations that define how humans interact and live. A political discussion of design is not concerned with convincing the audience of a certain perspective so much as with highlighting the multiplicity of perspectives and how those perspectives are constructed and maintained.

The idea that design has a role to play in the way we construct and build political ideas is hardly new. Designers in many fields have recognized that they are not objective actors in the media infrastructure. In his seminal work on the role of design in political discourse, Carl DiSalvo addresses the notion, established by John Dewey, that ‘publics’ assemble around ‘things’:

‘How are publics made with things?’ remains unaddressed—but it is exactly this question that also should be asked as the products and processes of design are increasingly politicized and used for political ends.
DiSalvo, 2009: 49

Political interfaces

Uber is a good example of politics manifested in a digital interface and then transposed around the globe. The ride-sharing company is a child of the Silicon Valley ethos, what Richard Barbrook terms ‘The Californian Ideology’. This ideology promotes and materializes a technological utopia based on individualism and free-market principles (Barbrook and Cameron, 1996). However, while Uber found enormous success in the economically liberal circumstances of California, from which it emerged, it has found trouble elsewhere. Cities like London in the UK, with strong social-democratic cultures of both public and private transport (Transport for London and the black cab respectively), have fought legal and political battles with Uber and other organizations like it. In London, Uber has been the subject of numerous legal cases regarding its employment practices and regulation; it has been the subject of repeated protests from the general public; and it has been responsible for increasing congestion in the city (Wood, Parry, Caruthers and Rose, 2017). Uber took for granted the political dimensions around employment, business practices and how public services are valued. So when the company moved to other markets with their own cultures, it rubbed up against their politics and legal practices.

The relationship between technology and the social circumstances in which it is used has been extensively studied by the social sciences. Far from the popular assumption that technology is somehow neutral and devoid of any cultural or social dependencies is the understanding that ‘machines are social before being technical’ (Deleuze, 1988: 39).
The anthropologist Alfred Gell defined technology as an assemblage of tools, knowledge and social necessity: Technology not only consists of the artefacts which are employed as tools, but also includes the sum knowledge which make possible the invention, making and use of tools [and] technology is coterminous with the various networks of social relationships which allow for the transmission of technical knowledge, and provide necessary conditions for cooperation between individuals in a technical activity. Gell, 1988: 6

This is where a political discussion of technology and digital interfaces has particular importance. In the same way that technologies are shaped by their social context, they also go on to reshape that context and create new imaginaries for future technologies, in a continual feedback loop. This relationship between digital interfaces and their imaginative potential is further explored in Chapter 6. For now, the political dimension of technology bears importance because of the influence that technologies have on shaping our society, even as our imagined future technologies are drawn from ‘the set of social and cultural economic conditions that make possible, in the sense of making it possible to imagine, the specific technological inventions’ (Connor, 2017: 12).

Entangled interfaces

The entangled nature of interfaces is key to their politics. This entanglement means that they are networked and connected, technologically and socially; a part of and a product of their context, rather than stand-alone devices. When considering digital interfaces in particular, it is important to acknowledge and understand their networked dependencies, and how the political tendencies of these systems and networks influence interfaces and users. Interfaces do not stand alone as designed artefacts; they are networked, not only on the internet or other computational networks, but also within ‘the social-cum-technological milieu that at once enables the fulfilment of human experience and enforces constraints on that experience’ (Liu, 2016). This milieu is more commonly referred to as infrastructure. Infrastructures, from the internet itself, to GPS, to the server stacks and fibre optic cables of telecoms companies, shape and are shaped by interfaces, and these in turn shape culture. It is impossible to understand the contemporary state of interfaces without acknowledging the infrastructure behind them, yet infrastructure is rarely considered by designers.

For example, smart watches are reliant on pervasive networks able to carry a continual data demand. Embedded in these networks is a complex system of protocols and switches, defined by international treaties and scientific policy, which allows the transfer of information around the world. These networks are also dependent on a reliable system of financing for the purchase of smart watches, as well as subscription to the data services and apps that support them. Smart watches require regular access to power in order to charge at standardized current and voltage, and they also require expansive globalized supply chains with cheap labour and advanced manufacturing and distribution capabilities.
All of these networks are politically dependent; they assume a particular type of user with access to expensive technological artefacts and the networks that support them. Where these networks aren’t present, the use case for the smart watch is severely challenged, as the infrastructures that it normally relies on fail. These assumptions lead to the infrastructural theorist’s favourite axiom that ‘infrastructure is everything you don’t notice – until it fails’ (Bogost, 2018).


Benjamin Bratton refers to the notion of a global computational network, accessed by users at different interfaces, as The Stack: an ‘accidental megastructure’ that comprises everything from oil drilling and refining for plastics to government policy, all tied together in a system of planetary computation where the Earth itself, through the datafication of everything, is a computer (Bratton, 2012). Bratton makes the point that between the user and the address is the interface. The address here refers to the computationally legible device, addressable by other devices and networks. The job of the interface is to translate meaning between machines and humans. This interface is entangled in the politics of both the user and the infrastructure, and in the case of digital interfaces, this infrastructure is most commonly the internet. It is almost impossible now to imagine the internet in any way other than as a series of centralized nodes owned and operated by corporations. Even in the very protocols that govern the flow of information across the internet, politics are present. As Alexander Galloway has explored, decisions made by international treaty and corporate agreement about the naming conventions and addresses of sites reveal the politics of infrastructural arrangements and their protocols ‘as a pseudo-ideological force that has influence over real human lives’ (Galloway, 2004: 81). These institutions have their own interests in controlling infrastructure.

The politics of infrastructure are perhaps most notable in debates over net neutrality, which have brought these embedded politics of digital infrastructure to the fore. Net neutrality is the concept that internet bandwidth should be delivered equally to all clients by internet service providers (ISPs), rather than auctioned to those willing to pay more for extra bandwidth.
In a neutral model, sometimes referred to as ‘common carrier’, smaller services are given as much priority in the delivery of data to the user as larger services. The threat to net neutrality stems from the profit motive of throttling access to certain sites in order to promote access to proprietary content. For example, a large video streaming company might pay an ISP to give users priority access to its site; for the user, this means the site will load faster than its competitors’. In more severe cases, alternative sites might be blocked entirely: an ISP might, for instance, only allow access to its own video streaming service. The political dimension of these intersections is clear: the internet is ostensibly founded on the principle of free and open access for all, with net neutrality a key pillar of that concept. The end of net neutrality gives ISPs control over the information that users are exposed to. This has made net neutrality a freedom of speech issue as much as a market competition issue. The most user-friendly interface is still potentially doomed to failure at the behest of the ISP or, in the case of blacklisting, government policy.


This positions the internet as a political issue in the same way as healthcare is. In many cases, the internet’s pervasiveness and accessibility have similar implications for social good if used well.
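The economics of throttling described above can be made concrete with a toy bandwidth allocator. The 80/20 split and the service names are invented for illustration; real traffic management is negotiated per contract and enforced deep in the network, not in application code.

```python
def allocate(capacity, services, neutral=True, priority=()):
    """Split an ISP's bandwidth (in arbitrary units) across requesting services."""
    if neutral:
        # common carrier: every service gets an equal share of the pipe
        share = capacity / len(services)
        return {s: share for s in services}
    # non-neutral: paying services split most of the pipe,
    # everyone else is throttled into what remains
    fast = [s for s in services if s in priority]
    slow = [s for s in services if s not in priority]
    alloc = {}
    for s in fast:
        alloc[s] = (capacity * 0.8) / len(fast)
    for s in slow:
        alloc[s] = (capacity * 0.2) / len(slow)
    return alloc

services = ["big-streamer", "small-startup", "rival-streamer"]
neutral = allocate(100, services)
tiered = allocate(100, services, neutral=False, priority={"big-streamer"})
# under the tiered model the paying service loads several times faster,
# and the user experiences its competitors as simply "slow"
```

The asymmetry is the political point: in the tiered model the ranking of services is set by the ISP’s commercial arrangements before the user makes any choice at all.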

The political action of interfaces

Interfaces are not objective technical constructs; they are permeable to the politics of everyday life. Jack Stilgoe, in his exploration of the development of the autonomous car, points out how this misunderstanding is the most common failure in the innovation process, particularly when the development environment of the lab fails to conform to the lived experience of the world. Technologies are ‘driven by social processes of goal-selection, machine-making, governance, use and their encounters with the world around them’ (Stilgoe, 2017: 44). In the same way that interfaces are shaped by their political situation, they also shape the political situation they are in. Take, for example, the different ways that Google Maps chooses to name countries in politically contentious environments, with Crimea being presented differently to Russian internet users than to the rest of the world (Taylor, 2016). Google Maps has become the world’s most popular mapping interface, and this challenges a history of cartography that was conducted in service of the state, rather than for a human-centred notion of usability. As a result, it has drawn ire from some states over how Google chooses to represent the physical world. A lot of this comes from the fact that Google, as a corporate entity, is perceived as neutral, serving users before anything else. The history of mapping as a political exercise is well documented; mapping finds its origin as an explicitly political act used to define territory. James C. Scott has demonstrated how this control in turn grants the map-maker or owner the ability to define the world.
Mapping, databasing and data-fying create the conditions for control of the physicality of the world:

The utopian, immanent, and continually frustrated goal of the modern state is to reduce the chaotic, disorderly, constantly changing social reality beneath it to something more closely resembling the administrative grid of its observations.
Scott, 1998: 82

In Scott’s idea of modelling and abstracting the world, institutions, whether states or corporations, build models of the world in order to make it legible; then they decide on actions to enact on the model, but in performing an action on the model, the world is reshaped. Consider stories in the popular press about GPS systems leading drivers through houses, or onto non-existent roads. In the mind of the user, the interface is a carbon copy of the world that defines it; the GPS map is the world. In this sense, the interface connects the users, the network and the world.

The impact of social networks, data collection, modelling and predictive analytics on the institutions of democracy provides a sobering example of political interfaces. Increased attention has been drawn to the effects of digital interfaces on political change since Facebook and Twitter took credit for the Arab Spring (Wolman, 2013). This followed claims to relevance in uprisings and political movements around the world, and a growing sense of political power vested in the major social networks. However, these networked political movements are hard to sustain, tied as they are to the advertising-based profit models of the social networks (Tufekci, 2017). Zeynep Tufekci, an activist and scholar of political movements and social networks, suggests that social networks such as Facebook provide opportunities for self-organization and communication, but only for as long as there is a constituency of users feeding their advertising models. The popular awareness of the Arab Spring drew users from around the world to articles, sites and services related to the movement, resulting in advertising revenue for the social networks; but as the news cycle moved on, the subject was no longer profitable and the algorithms relegated it, in turn putting the movements at risk.
The story of Cambridge Analytica and its attendant organizations using Facebook data to target swing voters in the 2016 US elections and the UK Brexit referendum caused an outcry over the control that these organizations could exert by manipulating the media accessible to targeted voters. The power of social media to control and direct attention toward certain ideas is also exploited by the growth of 'troll farms' – shadowy organizations that manufacture memetic 'fake news' en masse in order to spread misunderstanding and fracture the media landscape (Lee, 2018). Political analysts increasingly signpost social networks as symptoms of, and contributors to, political division. The algorithmic tendencies that optimize the behaviour of users create 'filter bubbles' that reinforce biases and confirm assumptions by directing information at users, which in turn confirms their worldview (Pariser, 2011). Social networks are perhaps the most politically contingent of all forms of interfaces. They are algorithmically augmented simulacra of society, in which people are algorithmically steered, with significant consequences for their political lives. In this context, there is an increasing need for designers as politically conscious actors. Designers should be able to understand and navigate the political dimensions of their work and the embedded infrastructural and social tendencies and assumptions. This critical approach to practising the design of interfaces is vital if we are to comprehend and interrogate the entangled politics embedded in interfaces.
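The 'filter bubble' dynamic described above is, at its core, a feedback loop, and a toy simulation makes the mechanism concrete. The sketch below is a deliberately minimal model: the scoring rule and probabilities are invented for illustration and are not drawn from any real platform's algorithm.

```python
import random

def simulate_feed(rounds=50, topics=("A", "B", "C", "D"), seed=1):
    """Toy feedback loop: the feed always shows the topic with the
    highest engagement score, and a click raises that score, making
    the same topic more likely to be shown again."""
    rng = random.Random(seed)
    scores = {topic: 1.0 for topic in topics}
    for _ in range(rounds):
        shown = max(scores, key=scores.get)   # rank purely by past clicks
        share = scores[shown] / sum(scores.values())
        if rng.random() < 0.5 + 0.1 * share:  # familiarity nudges clicks up
            scores[shown] += 1.0              # the click feeds the ranking
    return scores
```

Because the feed only ever shows the highest-scoring topic, and a click only ever raises that topic's score, a single early click is enough to lock every other topic out of the feed entirely: the bias confirms itself, which is the filter bubble in miniature.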

A history of critical practice

In Europe, an awareness of the political position of design is exemplified in the practice of the Italian radicals of the mid-twentieth century. Studios such as Superstudio worked against the modernist conception of the designer as a neutral actor, claiming that instead of being an objective, rational practice, design in fact embedded political norms in material culture. Superstudio's lead, Adolfo Natalini, said, 'Architecture is merely the codifying of the bourgeois models of ownership and society' (Elfline, 2016: 56). This notion stood in direct contrast to the rationalism and master-planning of the 'New Objectivity' of post-war Europe. These architectural and design movements leant heavily on scientific and pragmatic arguments in their design, distancing themselves from political responsibility while simultaneously claiming that this gave them a unique moral standpoint, away from the corruption of political concerns. This heritage of politically conscious or critical practice continues today, where designers like Dunne and Raby take a more technology-focused approach to critical practice around the politics of design culture. In his 1998 thesis, Hertzian Tales, Anthony Dunne points directly to digital design as a place where the 'post-optimal' object might exist; where norms and preconceptions could be challenged by the changing nature of the interface between users and institutions that digital culture represents (Dunne, 2006). Dunne suggests that electromagnetic technologies, which have invisible components – i.e. the electromagnetic spectrum – open up new conceptual possibilities for the designer, and from this he proposed the use of speculation to critique the politics of design culture. When designers work critically, they use design itself as a language to challenge social and political assumptions.
Carl DiSalvo has analysed the power of design in constructing publics and thus in being able to materialize and bring discussion to new ideas. He suggests that publics 'are constructed in the sense that they are brought together through and around issues' (DiSalvo, 2009: 51). DiSalvo argues that design's power is in the ability to materialize issues to make them feel real, to make arguments that are 'plausible and persuasive' through 'the expert use of design skill' (DiSalvo, 2009: 55). In other words, critical practice works to create political discussion by engaging audiences in their assumed behaviours or expectations, and challenging them. Audiences are forced to ask why things are different: Why don't these robots look like the ones from the movies? Why would anyone want a gigantic glass and steel structure around the entire Earth? It is in this process that the political tendencies of mainstream design are revealed.

Figure 3.1

Superstudio, The Continuous Monument. The Museum of Modern Art (MoMA), New York. © Photo SCALA, Florence. Superstudio's Continuous Monument (1969) was a speculative architectural model for a gigantic structure of steel and glass that encircled the entire Earth. Superstudio used architecture as a way to critique the social and political tendencies of the time. In particular, they were concerned with globalization and the homogenization of the city, and the power of bureaucracy over civic life.


Figure 3.2

Technological Dream Series: Robots. Copyright Dunne and Raby, 2007. Dunne and Raby’s Technological Dream Series: Robots (2007) is an early example of what became known as Speculative and Critical Design. This design approach speculates and provokes rather than proposes. The purpose of the objects is to challenge the audience to consider their assumptions. In this case, a series of ‘robots’ challenge the popularly held assumptions of what robots are and what their purpose is.

Openness and access

The solution-oriented approach to technological innovation means that digital technologies have been elevated to the position of default solution to social problems. The theorist Evgeny Morozov refers to this as 'solutionism' (Morozov, 2013). For example, the effective delivery of services, such as education and welfare, is increasingly seen as a technological and not a political issue. Unquestioned in this techno-solutionist approach is the assumption that everyone has ubiquitous connectivity, despite the fact that even in the UK alone, as of 2014, 16 per cent of households had no access to the internet (Philip, Cottrill, Farrington, Williams and Ashmore, 2017). The assumption is also made that everybody has enough digital literacy to access these solutions. This assumption, based on the politics of one half of the 'digital divide', has made techno-solutionism a default for social and political issues, even at the cost of those who may be excluded by infrastructure. The politics of mainstream design presupposes young, tech-savvy urbanites. Rather than reconsider the material of the interface to better fit the lived experience of people, non-connectedness is perceived as a shortcoming, leading to a policy mindset in which governance and education focus on 'upskilling' users to fit the technology, rather than developing and supporting alternatives, an idea explored further in Chapter 2.

It is these types of challenges that have led to some alternative forms of network infrastructure. Mesh networks, for instance, are a proposed solution to the 'last-mile problem': the point at which, for telecommunications companies, laying down infrastructure for connectivity is more expensive than any profit they would make. Mesh networks make every machine connected to the network a host, rather than relying on a centralized server. The Athens Wireless Metropolitan Network (AWMN), an open-source mesh network, was set up in response to unreliable telecommunications during the height of the anti-austerity protests of 2013 (Kloc, 2013). The AWMN also served as a point for political rallying, and kept people connected who would otherwise have been in an information blackout (Tsimitakis, 2013).
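The resilience that made the AWMN hard to censor comes from its topology: every node is also a relay, so there is no single machine whose removal takes the network down. This can be sketched in a few lines; the five-node topology below is hypothetical, not AWMN's actual layout.

```python
from collections import deque

def reachable(links, start):
    """Breadth-first search over an undirected mesh: the set of nodes
    that can still route to `start` by hopping between neighbours."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in links.get(node, ()):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen

def drop_node(links, dead):
    """The same topology with one node removed (seized or powered off)."""
    return {node: [n for n in neighbours if n != dead]
            for node, neighbours in links.items() if node != dead}

# A small mesh: every node links to two or three peers; none is central.
mesh = {"a": ["b", "c"], "b": ["a", "c", "d"],
        "c": ["a", "b", "e"], "d": ["b", "e"], "e": ["c", "d"]}

# Removing any single node still leaves the survivors fully connected.
for dead in mesh:
    survivors = drop_node(mesh, dead)
    assert reachable(survivors, next(iter(survivors))) == set(survivors)
```

Compare a hub-and-spoke topology, where deleting the central node isolates everyone: that is exactly the property a network with no centrally controlled nodes or servers avoids, and why such a network is difficult to stop or censor.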

Figure 3.3

An Athens Wireless Metropolitan Network node. Copyright Vaggelis Koutroumpas, 2014. The Athens Wireless Metropolitan Network is an example of an alternative form of network infrastructure and interface that is very much a product of its political context. Poor connectivity and censorship by the Greek government in the wake of austerity protests led to the construction of a DIY internet based on a mesh structure, rather than a hierarchy. The network has no centrally controlled nodes or servers, so it is very difficult to stop or censor it. It also means traffic is shared across the whole network, resulting in more reliable connectivity.

These alternative infrastructures inevitably reshape the interface. In the case of the Athens Wireless Metropolitan Network, users were unable to access mainstream Web services like Google, Facebook and others, and had to construct their own interfaces within the AWMN, replicating the form of early internet chat boards and channels. Without the level of production enabled by a mass-market model, the hardware is difficult to set up and maintain, requiring significantly more expertise than a modern commercial router.

More widely, debates around openness in software, information and hardware go hand-in-hand with similar debates around 'net neutrality'. The notion that software and information 'want to be free', as Stewart Brand is credited with saying, emerged with computer culture in the 1980s and forms the basis of many everyday consumer technologies. Tim Berners-Lee's refusal to patent the World Wide Web in the early 1990s is directly responsible for the internet we have today. However, the rise of the personal computer in the 1980s, and the corporations that manufactured and sold them, pushed open-source ideals to the fringes, as companies protected their intellectual property (i.e. source code) in order to preserve revenue. Today, many organizations advocate for open-source technology. The Apache Foundation, the Linux Foundation, the Mozilla Foundation, Libre, Wikimedia and others are prominent in the support and development of open-source platforms and information distribution. The advantage of working in an open-source culture is that products can be driven more by user needs than by commercial goals. The imperative is to preserve a quality product that engages developers and users in the culture of the platform. However, this is often compromised by unreliable processes, mostly as a result of the lack of financial backing and design-by-committee. The proliferation of open-source licences, though useful and powerful, can be confusing and alienating for new users. They are often inscrutable and expect users to have the agency and technical knowledge to interpret the community-defined definitions for use. A consumer-friendly commercial corporation might instead obscure the legal and political dimensions of their product in the interest of consumer satisfaction.

Accessibility, openness and freedom are the utopian founding principles of our contemporary techno-sphere. However, they are rarely encountered or considered by people. The dominant politics of the market, and the financial and legal certainty they bring, make for a more user-friendly experience in most cases. Users' trust in brands rather than communities also draws designers to work with the 'cathedrals' of commercial organizations rather than the 'bazaars' of networked communities, as early open-source pioneer Eric Steven Raymond puts it (Raymond, 2000). The product of the corporatization of technology, however, is an alienation of users from their technology: a political context in which individuals are estranged from their interfaces.

Inscrutability and opacity

The 1980s saw an explosion of interest in computational technology with the introduction of the personal computer. This enormous boom required the introduction of intellectual property safeguards – to protect and guarantee revenue – that were often in direct opposition to the principles of the pioneers of the technology. These pioneers, often emerging from counterculture, imagined a 'global village' in which information would be freely distributed and universally available. Historical artefacts like 'war-chalking' demonstrate points where the necessary protectionism of network technologies rubs up against the open principles of the internet.

Figure 3.4

War-chalking. Copyright Martin Haase, Creative Commons 2.5 generic licence. War-chalking was a guerrilla practice present in the early days of Wi-Fi. War-chalkers would seek out Wi-Fi networks and then use chalk to mark their security details on pavements and walls so that other internet users could access them. At a time when access rights were febrile and undefined, war-chalking was a protest against the increasing tendencies of networks to be exclusionary and secured.


The drive to protect revenue is tied to a phenomenon known as 'blackboxing'. Black boxes are described by Bruno Latour as a paradox in which 'the more science and technology succeed, the more opaque or obscure they become' (Latour, 2000: 303). Often, this notion is paraphrased to indicate that as technologies become more advanced, they become more inscrutable. Jenna Burrell, referring particularly to machine-learning technologies, highlights how this 'unknowableness' is the product of three factors:

(1) opacity as intentional corporate or institutional self-protection and concealment and, along with it, the possibility for knowing deception; (2) opacity stemming from the current state of affairs where writing (and reading) code is a specialist skill; and (3) an opacity that stems from the mismatch between mathematical optimization in high-dimensionality characteristic of machine learning and the demands of human-scale reasoning and styles of semantic interpretation. (Burrell, 2016: 1–2)

Burrell suggests it is, in fact, overly simplistic to blame the market imperative of protecting intellectual property alone for the opacity of digital interfaces; there are two other factors at play. First, their technological complexity, which supports Latour's notion of technology being made invisible by its own success. Secondly, the tasks we demand of digital technologies often mean that they are rendered impenetrable. Contemporary 'computational tools are designed to uncover relationships that defy human intuition [which] explains why the problem [is] particularly pronounced' (Selbst and Barocas, 2018: 14).

To address Burrell's first condition of opacity – intentional secrecy – we need look no further than End User License Agreements (EULAs) and Terms of Service (TOS). These legal agreements, designed to be hurriedly clicked through by users, establish the legal status of the user, the product, the company and the state. These agreements justifiably attract popular ridicule. They are long and complicated, often running to tens of thousands of words of legalese. The inaccessibility of these legal agreements can have consequences for both users and developers. Take, for example, Samsung's Smart TV from 2015. The device could be voice operated, which required an always-on microphone. Buried in the device's privacy policy was a line asking users to 'be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party' (Nguyen, 2017). Beyond being perceived as an invasion of the privacy of the home and the intimate conversations that often occur there, critics highlighted the lax security protocols on devices such as internet-connected televisions, and their ease of exploitation by hackers. Although almost certainly not intended to be malicious, this case highlights how complex legal agreements are used to establish the rights – but also to reveal the potential exploitation – of users.

The Cambridge Analytica scandal of 2015–2018 involved a data analysis company harvesting data from millions of Facebook users through a third-party application. However, written into the terms of service of Facebook, and of the application, was a clause stating that people agreed to have their own data, and that of their friends, collected by the company. This situation was further complicated by the jurisdictions of those involved. Under US law, where Cambridge Analytica operated, this collection was legal; however, many of the users whose data was collected were not US citizens. This data was used to micro-target political adverts in the 2016 US elections and the Brexit referendum, as well as in other elections (Osborne and Parkinson, 2018). Although the effectiveness of this campaign is contestable, it started a public debate about the use of private data and user, corporate and government complicity in user exploitation. We return again to the notion that technology is 'institutional and political as it is machinic' (Chun, 2008: 322). The context in which digital interfaces are developed and deployed shapes their use, and they in turn shape the context in which they are deployed. This question of how things and people are politicized through designed objects intensifies when those things are opaque and impossible to recognize. Questions over opacity and openness have reinvigorated designers' interest in critical practice.
‘Making the invisible visible’ is a rallying cry for politically conscious designers attempting to bridge the gap between people and the interfaces they engage with. Designers also aim to help these users develop political consciousness regarding how the technologies they use may be exploiting them.

Critical interfaces

Critical practice in European design began in the mid-twentieth century and found a resurgence in the context of popularly accessible network technologies in the early 2000s. These practices, exemplified by those of Superstudio and Dunne and Raby respectively, sought to bring practitioners' attention to the political dimensions of design. They suggest that designers are shaped by their political and social contexts and in turn, through their designs, they shape the political context, assumptions and biases of users. A conscientious approach to this reciprocal loop is especially important because, increasingly, the processes and interests of a technology are made either inscrutable by its technical construction, or intentionally secret for reasons of intellectual property protection and exploitation. We explored some of the practices around open-source technologies and information earlier. We noted that they often suffer the disadvantage of being poorly designed, offering little of the user-friendliness that is a key imperative of commercial software, and this can often be alienating for people. The work of design studio IF aims to help organizations and companies respond to an increasing need for trust through transparency and clarity about how data is gathered, used and shared. They work with commercial clients but also on research projects that provoke critical conversation. Their Data Licences project (IF, n.d.) speculates on ways that open licensing might be simplified to fit specific uses by inviting people to answer questions, rather than having licences thrust upon them.

Figure 3.5

IF's data licences. IF's data licences is a speculative design project that examines ways in which licensing might be different. Rather than commercial or open licences defined by companies and communities, it allows users to pick their own settings at the interface, giving them autonomy over, and understanding of, their political and legal rights.


IF's founder, Sarah Gold, has also examined mesh networks as an alternative form of network infrastructure, similar to that of the Athens Wireless Metropolitan Network. Mesh networks are technically complicated and demand a high degree of knowledge and maintenance. Her project, The Alternet (Gold, 2014), uses speculative design to examine use cases for a prefabricated mesh network. Part of the process of its construction is an examination of the political and technical conditions which might precipitate its use.

James Bridle's Dronestagram is an example of an interface deployed for expressly political purposes. Dronestagram was an Instagram account that posted Google Earth images of US military drone strikes in Afghanistan and Yemen as they were released by the Bureau of Investigative Journalism. Bridle attempts to engage in the process of making the invisible visible. He brings drone warfare to the fore by distributing it through Instagram, an interface that depends on an economy of attention. Dronestagram exposes this paradox: it explores where our attention is diverted, and how opacity and secrecy are used to intentionally disguise activity. Designers of digital interfaces need to acknowledge their complicity in being shaped by, and shaping, the political context in which they practise. Acknowledging and responding to this complicity can be done in multiple ways – from activist work in practical and professional behaviour to engaging clients and collaborators in deeper discussion about their products. As revelations of abuse and exploitation attract more media attention, and the political relationships enabled by and co-produced in digital interfaces are established around the world, digital designers will use critical design to expose injustice and secrecy.


Figure 3.6

Dronestagram. Copyright James Bridle, 2012. Dronestagram, an Instagram account by artist James Bridle, posts images of sites of US drone strikes grabbed from Google Earth. It subverts the existing interface of Instagram for expressly political ends.


Chapter 4. Ethical interfaces


In this chapter we explore the ethics of designing and living with digital interfaces: specifically, what the ethical considerations are for designers of digital interfaces as they construct and deploy their practice in the world. The ethical implications for users of digital interfaces are a constitutive part of how they are inhabited, or – echoing the title of this publication – how we live with digital interfaces from an ethical perspective. This implies the responsibility of designers to provide interfaces that are not detrimental or harmful to the health and wellbeing of their users, particularly with regard to exploitation, profiteering or bullying. Living ethically with digital interfaces also concerns how users behave when they are online or acting through digital interfaces generally. As Oksana Zelenko and Emma Felton observe, designers are increasingly asked to explain design's 'potential to instigate meaningful social, cultural and environmental change' (Zelenko, Felton and Vaughan (eds.), 2013: 3). Thinking about issues ethically requires us to think beyond our own needs and include the needs of others. Designing ethically means considering the wider consequences of our actions for society, individuals and the environment. Mike Monteiro puts the case as follows:

Before you are a designer, you are a human being. Like every other human being on the planet, you are part of the social contract. By choosing to be a designer you are choosing to impact the people who come in contact with your work, you can either help or hurt them with your actions. The effect of what you put into the fabric of society should always be a key consideration. When designers do work that depends on a need for income disparity or class distinctions to succeed they are failing in their roles as citizens, and therefore as a designer. (Monteiro, 2017)


Design as exploitation

Recent years have seen increased attention drawn to the complicity of designers in the unethical and, in many cases, illegal activities of technology companies and organizations. Regrettably, these practices often draw on established design principles like design patterns, which are discussed further in Chapter 5, and their inverse: dark patterns. Dark patterns are interactions that are deliberately designed to conceal or mislead users of digital interfaces. They are prevalent across some of the most heavily used interfaces and systems of the Web. For example, Facebook artificially slows down some of its responses to make users feel safe. This is known as an 'artificial waiting pattern', intended 'to construct a facade of slow, hard, thoughtful work' (Wilson, 2016). Other dark patterns include 'misdirection', which involves making controls difficult to see and distracting users, and the 'roach motel', where users are tricked into a situation that is difficult to get out of – such as online subscriptions. The phenomenon of dark patterns has been explored by Saul Greenberg and others (Greenberg et al., 2014) and, in the context of location tracking, by Colin M. Gray and others (Gray et al., 2018), who make the case for the ethical responsibilities of interface designers with regard to critical user experience design and human-computer interaction.

A further dimension of questionable ethical practice in interface design is the issue of passive data capture, whereby the actions of people using a digital interface are captured in the form of data that is subsequently sold for advertising or marketing profit without users' knowledge – what Anders Albrechtslund (2008) calls 'participatory surveillance'. This phenomenon was spectacularly revealed by whistleblower Edward Snowden, who demonstrated the close relationship between corporate providers of digital systems and the global state surveillance regime.
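Part of what makes the 'artificial waiting pattern' so common is how trivially easy it is to implement. The sketch below is a generic illustration, not any platform's actual code; the function name and timings are invented.

```python
import time

def checkup_with_fake_wait(run_checks, minimum_seconds=3.0):
    """Artificial waiting pattern: the real work finishes almost
    instantly, but the response is padded so that the check *feels*
    slow, hard and thoughtful. The padding is theatre, not computation."""
    started = time.monotonic()
    result = run_checks()                      # e.g. a near-instant lookup
    remaining = minimum_seconds - (time.monotonic() - started)
    if remaining > 0:
        time.sleep(remaining)                  # its only purpose: the facade
    return result
```

An honest interface would return the result immediately and explain what was checked; the only design decision here is whether to respect the user's time or to stage-manage their trust.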
Shoshana Zuboff suggests that ‘surveillance capitalism’ involves new forms of value ‘constituted by unexpected and often illegible mechanisms of extraction, commodification, and control that effectively exile persons from their own behavior while producing new markets of behavioral prediction and modification’ (Zuboff, 2015: 75). She describes the business model of Google and other data gathering technology companies as entailing ‘an emergent logic of accumulation’ (ibid: 77) by which personal data is collected and stored on a massive scale, often without the explicit awareness, consent or understanding of users. These massive archives of personal information are easy stepping stones to corporate or government surveillance, making all users effectively complicit in the unethical actions of their designers through participation.


An alternative to digital surveillance capitalism is shown in the 'Indienet' project by Aral Balkan, Laura Kalbag and others. Indienet is an attempt to build a more authentically democratic internet that avoids the concentration of ownership and the data surveillance model of capitalism exemplified by Silicon Valley corporations. Indienet is currently implemented in the Belgian city of Ghent, with the support of city authorities, who state publicly that they 'do not aspire to be a panopticon-like, surveillance-based "smart city"' (Balkan, 2018). Instead, using Indienet, they aim to create the conditions for a 'node in a federated Web-based network where people can communicate with each other, the city, and with the world in general' (ibid).

Corporate surveillance demonstrates intentional acts of unethical behaviour arising from an unhealthy design culture. The contribution of design culture to ethical behaviour is explored further on; but equally destructive to society is when designers fail to engage with the unforeseen consequences of their work, resulting in unethical use.

Figure 4.1

PRISM Slide. Public domain. An image from the cache of documents leaked by Edward Snowden of the PRISM surveillance program in 2013. The slide shows the involvement of several high-profile social networks and telecommunications companies complicit in state surveillance of citizens.


Unforeseen consequences

Digital interfaces have further negative unforeseen consequences for society, to which we are presently struggling to devise ethical and legislative responses. Trolling (making unsolicited negative comments) of public figures online has escalated to the point where death threats are common. The rise in online hate speech has been observed in many different contexts (Siegel, 2018; Banks, 2018) and is partly a consequence of the content-agnostic model that many social media platforms use as a monetization strategy. This method treats all content as essentially neutral or equal insofar as it delivers clicks for the platform, a position usually defended under 'freedom of speech' arguments (Kuchler, 2018). The actual words used do not necessarily have an impact on the 'clickability' of a social media post. The lack of task boundaries online, the cover of anonymity and the ineffectiveness of countermeasures mean that the normative ethical boundaries of interpersonal communication are easy to suspend.

The mass distribution of fake news is another effect of the content-agnostic model, which has proved to be easily manipulable. Fake news has been deployed in elections to sway members of the electorate and create a form of extreme partisanship that has the potential to subvert democratic processes. We saw this in the US election and Brexit referendum of 2016, where misinformation and conspiracy theories were deployed extensively to sway contested votes. It was also revealed that election campaigns deliberately, and in some cases illegally, accessed social media data in order to deliver highly targeted political advertising to voters who had been profiled on the basis of their social media identities. Christian Fuchs has explored this phenomenon from the perspective of social theory, attempting to ground the 'social' in social media more consciously in a critical examination of the power dynamics at play in the context of digital social media.
Ethical approaches to this problem would include making users aware of the data being captured, when it is happening, and what uses it is put to (Fuchs, 2017). Throughout these campaigns, social media platforms alternately deployed arguments of ignorance, defences of 'free speech' or avowals of responsibility (Levin, 2018). In the developing world, social media hate speech has the potential to trigger homicidal events, as in Myanmar in 2017 (Stecklow, 2018) and South Sudan in 2017 (Ojok, 2017). An ethical discussion of digital interfaces does not only concern the ethics of the designers, who, as in the previous section, may intentionally misuse a platform, but also the unethical behaviours enabled by them. In their reluctance to effectively address the ethics of hate speech, fake news or harassment, the ethical frameworks of social media companies are just as inadequate. An ethical view of living with, and designing for, digital interfaces means considering the unforeseen consequences for individual freedom and for the overall resilience of society. This resilience, and the ethical standards that a society is held to, are usually enshrined in legislation. However, digital interfaces, in their transnational, ephemeral nature, throw up operational dilemmas for legislative procedure.

Legislation

One way that ethical behaviour is both encouraged and enforced is through government legislation: the creation of laws explicitly designed to protect people from exploitation and abuse. This increasingly applies to digital interfaces, particularly in the wake of the General Data Protection Regulation (GDPR), a legislative framework implemented across the EU in 2018. GDPR places the onus on digital providers to ensure they are compliant with the detail of the regulations, which lay out a series of requirements designed to protect the online privacy of EU citizens. GDPR applies, crucially, in situations where user data generated from within the EU is exported elsewhere, and thus represents the first significant attempt to address the transglobal corporate information networks that drive much online activity. Any company that gathers or stores personal data such as names, addresses or credit card details must provide appropriate technical and organizational measures to mitigate any negative effects. This means, for example, eliciting informed consent from users when a website uses cookies to store identifying information. GDPR goes a long way towards ensuring that individuals have control over their own data and how it is used.

The recent UK House of Lords report 'AI in the UK: ready, willing and able?' (2018) focuses on computational intelligence, paying particular attention to the potential for loss of livelihood through automation, and the effects of deploying technologies of such complexity that they cannot easily be understood by users. The report proposes that in situations where users may be negatively affected by algorithmic decision-making – such as healthcare, jurisprudence and defence – they should have the right to demand that a human makes the decision, not an algorithm. In addition, it suggests that if an algorithm is not explainable to the people whom it affects, it should be illegal to use it – a concept known as 'explainable AI' (Gunning, n.d.).
This attempt to articulate an ethical framework for the use of artificial intelligence in society must balance the benefits with the potential risks. From the perspective of design, it will be important for designers of digital interfaces to
be involved in the early stages of product development where those products are driven by machine learning or similar technologies. Only if designers are aware of when, how and why AI is being used in a series of digital interactions can they respond ethically.

However, even where regulation exists to prevent exploitation, providers of digital services often deliberately seek to undermine individual rights, either through the nature of the service they offer or by using technical loopholes. For example, Uber, the taxi-hailing company, openly seeks to disrupt the licensed taxi trade by implementing a business model inspired by the piecework employment system. Drivers for Uber are defined by the company as self-employed, and they are therefore not eligible for holiday pay, paid rest breaks or the minimum wage. This allows Uber to maximize profit and minimize costs. In addition, Uber were shown to have used a covert software system called ‘Greyball’ to evade regulatory inspection, denying service to government inspectors (Pasick, 2017).

Sometimes interface ethics are directly implemented by legislation, as with interfaces used in healthcare or education. These are both fields with existing ethical frameworks, and legislation is used to govern how personal rights, such as privacy and consent, are handled, including the right to information and the right to opt out of technical services. Medical ethics and, consequently, regulation are built around the Hippocratic oath of ‘do no harm’. ‘Do no harm’ is often seen as the indivisible counterpart to ‘doing good’, in that one is impossible without the other. Doing no harm as a digital designer in the medical context might imply allowing patients to make their own decisions, and empowering them to do so through accurate information and accessible processes.
This has particular resonance when considering interfaces that are driven by computational intelligence technologies, such as machine learning, where information may be opaque and processes inaccessible.

Ethical legibility

‘Black box’ refers to the impenetrability of many computational systems. Black box systems are not available to general inspection and understanding, partly because they are often closely guarded commercial secrets, but more often because the emphasis of computer engineering and software development has been on efficiency and productivity, further driving complexity. Knowledge of the workings of most software products is hidden from users – they are closed-source. Black boxing is also used by programmers to refer
to the permanently changing parts of a computational system that cannot readily be tested. Bruno Latour explored this idea as part of an examination into how scientific knowledge comes to be validated and established. He presciently described the effects of black box technologies as a paradox in ‘the way scientific and technical work is made invisible by its own success. When a machine runs efficiently, when a matter of fact is settled, one need focus only on its inputs and outputs and not on its internal complexity. Thus, paradoxically, the more science and technology succeed, the more opaque and obscure they become’ (Latour, 1987: 304).

A prominent counter to the tendency for digital interfaces to be concealed inside black boxes is the open-source movement. Open-source software is available to anyone who can access the internet, and it depends on a community of developers who maintain and update the software. Mozilla’s Firefox Web browser and Google’s Android operating system are examples of widely distributed and widely used open-source software products. From an ethical standpoint, users can examine how these products work, contribute to their improvement and point out their flaws in online public forums.

Beyond devices and interfaces, the entire means of production of the interfaces we use every day can often be opaque. To take one example, the dangerous and exploitative working conditions of the Foxconn factory in Shenzhen, where Apple’s iPhones are made, have been covered in press reports for almost a decade (BBC, 2010; Sin, 2016; Merchant, 2017; Elstrom, 2018). These places could also be considered black boxes, protected by corporate policy, poor regulation and legal loopholes, making them somewhat impenetrable. An additional issue is the often toxic and exploitative resource extraction necessary to make digital devices, which commonly contain rare earth metals mined using child labour in areas of widespread poverty and deprivation (Bach et al., 2015).
Designers and others have responded in various ways. For example, Fairphone is an ethical smartphone built using a principled supply chain. The non-profit organization Students and Scholars Against Corporate Misbehaviour (SACOM) investigates multinational supplier factories and works towards improving workers’ rights in such places. However, the cognitive distance between using a smartphone and the prevailing conditions of employment of the people who make it is considerable.

The conditions of design production are explored by John Hartley, who positions what he calls ‘distributed expertise’ (Hartley, 2013: 111) as an explicitly ethical response to the industrial-era, top-down designer-as-expert paradigm. This view suggests that an appropriate ethical practice includes a narrowing of the gap between designer and user, so as ‘to navigate the shift onto the paradigm of mass productivity’
(ibid: 111). This implies going further than user testing or consultation, in order to create meaningful design situations that give creative and active roles to users of digital interfaces. In this sense, living with a digital interface would include the opportunity to consult on, design and configure it from the beginning of its creation, in a way similar to what Fairphone are exploring. Hartley’s proposition makes a set of ethical studio practices necessary. These would include articulating ethical choices at the start of a product development cycle, perhaps with specific champions or advocates of ethical practice deployed throughout the process. Cennydd Bowles echoes this idea, saying: ‘Designers should be active in these conversations, advocating for user needs, identifying areas for deeper research and highlighting ethical concerns even at the risk of short-term unpopularity’ (Bowles, 2018).

Figure 4.2

The Fairphone 2. Copyright Fairphone, Creative Commons 2.0 generic licence. The Fairphone 2 is claimed to be ‘the world’s first ethical, modular smartphone’ (Fairphone, n.d.). Fairphone engage with the entire supply chain of their product, and use open-source software where possible. In doing so, they also expose the often hidden physical and labour dimensions of consumer technology.


Ethical design cultures

The so-called ‘techlash’ – the growing public animosity towards the unethical behaviour of technology companies – is a result of the many large tech corporations that have been caught up in disturbing ethical transgressions (Financial Times, 2018). These include the illegal harvesting of user data (Facebook), sexual harassment in the workplace (Uber), battlefield drone technologies (Google) and the denial of employment rights (Amazon). Such transgressions have resulted in prosecutions and large fines, and they contribute to a wider sense that the industry has established a monopolistic grip on contemporary life.

This has a lot to do with the Silicon Valley attitude to innovation. New products are released at great speed and are often intended to reach the marketplace before they have been thoroughly evaluated for their long-term impact. Taking time to consider the ethical implications of digital interfaces, and the systems they both depend on and influence, represents a significant challenge to the ‘move fast and break things’ (Taplin, 2017) mentality. The UK Design Council suggests an ethics of digital design to counter exploitative and unethical design and technology cultures. They make the point that ‘disruption is Silicon Valley’s current watchword. Start-ups are optimized for shaking up vulnerable industries rather than assessing the resulting social, legal and ethical impact. Progress itself is the yardstick; whether that progress is in a worthwhile direction is sometimes secondary’ (Bowles, 2018). As Taplin puts it, ‘the industry’s revolutionary mindset, combined with the narrow perspective of a homogeneous workforce, means companies sometimes act with questionable ethics’ (Taplin, 2017).
To counter this narrative, the Design Council guidelines propose that designers start with individual responsibility and encourage their co-workers to adopt ethical practices that take account of the potential for their designs to do harm to users. They also suggest a healthy balance between seamless and seamful design – one which avoids black-boxing technologies, so that they remain interrogable rather than opaque. An ethics of digital design should also include a commitment to team diversity, so that the resulting interfaces have a balanced and representative effect. Finally, foregrounding communication between system and user and between designer and legislators, and pointing out unethical and damaging practices, would have the effect of establishing a standard – a default set of expectations for designers of digital interfaces to live up to.

In the field of digital design there are a number of codes of practice intended to frame and guide ethical behaviour. The Association for Computing Machinery (ACM) is the biggest learned society dedicated to computing. The ACM code of ethics is a collection of principles
and guidelines designed to help computing professionals make ethically responsible decisions in professional practice. It translates broad ethical principles into concrete statements about professional conduct. However, even this code is in question in the age of mass surveillance and AI, with some members commenting that ‘rose-coloured glasses are the normal lenses through which we tend to view our work’ and that the ‘computing research community needs to work much harder to address the downsides of our innovations’ (Hecht, 2018).

Ethical considerations have always played a role in design, but the development of scientific knowledge and technology has deepened our awareness of the ethical dimensions of design. As designers incorporate new knowledge of physical and human nature, as well as new forms of technology, into their products, people are increasingly aware of the consequences of design for individuals, societies, cultures and the natural environment. Designers should work not only to develop ethical products, but also ethical cultures in design practice.

Futuring ethical principles

Living with digital interfaces involves adapting and conforming to the behaviours they support. However, new interfaces can mean completely new types of human behaviour, with technological development happening so fast that there is often a lag while the ethics of that behaviour develops. In the relatively recent world of online dating, facilitated not just by apps such as Hinge, Tinder and Grindr but by the algorithmic matching of interests and profiles and the speed of communication, new types of online conduct include ‘ghosting’, ‘benching’ and ‘deep liking’ (Young, 2016). ‘Ghosting’ involves a sudden withdrawal of digital contact as a means to end a relationship; ‘benching’ is a way of stringing potential dates along with minimal contact; while ‘deep liking’ means clicking back through a potential date’s social media feed and liking old posts. Users of these systems consensually develop ethical responses to their online lives over time, but as new technologies appear, so new responses are needed.

For example, the ethical concerns around virtual reality experiences have been explored by Kate Nash in the context of immersive journalism. She examines what she calls ‘improper distance’, referring to the collapse of spatial distinction in VR work that seeks to connect users to distant suffering through immersive environments (Nash, 2018). Similarly, the emergence of a mature industry in sex robots has provoked an ethical response that examines the extremes of objectification embodied in
machines designed for the sexual pleasure of men (Wagner, 2018; Nascimento et al., 2018). The contemporaneous expansion of free online pornography raises its own ethical issues around new behaviours and norms. It provides new opportunities for the exploitation of minors, online blackmail and the phenomenon of revenge porn, and it has increasingly negative consequences for human sexuality.

The normalization of the availability of extreme content is evident in the YouTube ‘Elsagate’ scandal. Elsagate refers to the appearance of homemade videos, often viewed by children, featuring the protagonist from Disney’s Frozen in a range of violent and sexualized scenarios. ‘Elsagate’ was facilitated by the autoplay interface feature, which automatically plays videos without the user selecting what to watch (Di Placido, 2017). The algorithm governing autoplay prioritizes content with high viewer numbers in order to serve users what the system categorizes as the most popular – a behaviour incentivized by the YouTube monetization model. Critics have suggested this feature is easily subverted by automating clicks, thus gaming the popularity ranking algorithm. An interface feature intended to reduce friction, by removing the need to select what to watch, thus ended up enabling the production of extreme content viewed by children.

The increasing role of algorithms in everyday interactions has been well explored, perhaps most extensively in Cathy O’Neil’s Weapons of Math Destruction (2016). However, new ethical dilemmas continue to evolve. OpenAI is a non-profit, ostensibly open and collaborative company co-founded by Elon Musk, with the ambition of steering AI development altruistically and beneficently along a strict set of ethical guidelines.
In 2019, in a startling show of prudence, it withheld the release of new text-generation software because of ‘concerns about large language models being used to generate deceptive, biased, or abusive language at scale’ (Sinning, 2018). The speed of societal change brought about by these and other new and future technologies requires that ethical approaches in design maintain the ability to respond rapidly to a changing landscape of potentially damaging consequences for individuals and society. This requires an ethical approach that is inherent in the entire product cycle, and a relationship with designed products and services that is ongoing – taking responsibility for any development long after it has been released.
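The autoplay dynamic described earlier – a queue ranked purely by view counts, where every play feeds back into the ranking – can be sketched as a toy model. Everything here (titles, counts, the ranking rule) is a hypothetical simplification for illustration, not YouTube’s actual algorithm.

```python
# Toy model of a popularity-ranked autoplay queue. Videos are ranked
# purely by view count, and every autoplay increments the winner's
# count, so a burst of automated clicks on one video can lock it to
# the top of the queue indefinitely. Titles and numbers are invented.

def autoplay_next(view_counts):
    """Pick the next video to autoplay: the one with the most views."""
    return max(view_counts, key=view_counts.get)

def simulate(view_counts, plays, bot_target=None, bot_clicks=0):
    counts = dict(view_counts)            # work on a copy
    if bot_target is not None:
        counts[bot_target] += bot_clicks  # a click farm inflates one video
    history = []
    for _ in range(plays):
        chosen = autoplay_next(counts)
        counts[chosen] += 1               # each autoplay feeds the ranking
        history.append(chosen)
    return history

catalog = {"wholesome cartoon": 1000, "borderline knock-off": 200}

# Organically, the genuinely popular video dominates the queue:
print(simulate(catalog, plays=5))
# With 2,000 automated clicks, the knock-off takes over entirely:
print(simulate(catalog, plays=5, bot_target="borderline knock-off", bot_clicks=2000))
```

The point of the sketch is the feedback loop: the ranking metric (views) is also the thing an attacker can manufacture, so an interface feature built to reduce friction amplifies whatever games the metric.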


Ethical designers

Hila Yonatan has suggested a set of ethical guidelines for designers of digital interfaces, which includes evaluating the balance between limiting user actions and preventing errors, or how registration procedures and passwords are handled (Yonatan, 2017). This kind of pragmatic advice is complemented by Colin M. Gray and others (Gray, 2018), who call for formal ethics training in user experience (UX) and human-computer interaction (HCI) education, and emphasize the structural level of work necessary to implement an ethical framework for designers of digital interfaces. At a different level, Richard Buchanan suggests that ethical design starts with the personal values of the designer, and that we project our values through our actions and through our designs (Buchanan, 2001). This was brought into focus at the London Design Museum in 2018, when designers showing work in an exhibition about the political nature of graphic design removed their exhibits after it was revealed that the museum had hosted an event for an aerospace company implicated in the sale of paramilitary equipment (Pownell, 2018). The complex political positioning of design, and the role of designers in shaping these discussions, is further explored in Chapter 3.

An ethics of digital interface design can be articulated along a range of needs and effects. An initial position when designing a digital interface may include the desire to supply a delightful and enjoyable experience, or to emphasize a seamless progression towards a well-defined goal. The aim may be to make the technology disappear so that people are not explicitly aware they are using it, an aesthetic decision further explored in Chapter 5. The distinction between human behaviour and machine behaviour is deliberately obscured to ensure people stay as long as possible in the system, or so that they do not choose rival services.
A commitment to human-centredness thus enables a concealment of deeper motives, such as the invisible harvesting of user data or the maximization of profit and maintenance of brand loyalty. In these situations, designers are encouraged to focus on immersion, engagement and involvement. At the other end of the spectrum are the ‘UX for good’ and ‘ICT4good’ movements, which explicitly encourage designers to create digital products intended to address inequality and have social impact. These are digital interfaces designed not to sell advertising or to collect data on their users; instead, their aims are ethical.

A specific area of interface design where calls for ethical practices have been widely shared is the field of the Internet of Things (IoT), generally defined as physical objects that are internet connected. Francine Berman and Vint Cerf propose that the categories of an ethical framework for IoT include: a policy for safety, security and privacy, a
legal framework for determining appropriate behaviours of autonomous entities, a focus on human rights, and the sustainable development of IoT devices as part of a larger societal and technological system (Berman and Cerf, 2017). Kyle Ebersold and Richard Glass echo these arguments, identifying how ‘the large increase in personal information that will become available, the potential loss of control over the information and types of actions that the IoT may initiate autonomously raises significant ethical issues related to autonomy of things and humans, privacy, security, freedom, liberty… and others’ (Ebersold and Glass, 2016: 146). These ethical suggestions include that ‘researchers and industry must fairly represent IoT through metaphors that not only highlight its conveniences, but also its dangers’ (ibid: 149). Designers are well placed to do this work. Simone Rebaudengo’s ‘Ethical Things’ brings this nicely into focus by imagining a near future in which machines are empowered to make ethical decisions – in this case, a fan. The fan connects to a crowdsourcing website and elicits ethical decisions whenever it faces a moral dilemma. By representing the complex web of moral certainty and interaction in the form of an Internet of Things-esque fan, Rebaudengo’s strategy is an explicitly ‘designerly’ one – it points to the possibilities for designers to deploy materials and forms in the pursuit of an ethical interface design paradigm.

Figure 4.3

‘Ethical Things’ by Simone Rebaudengo. Simone Rebaudengo’s ‘Ethical Things’ explores many intersecting ethical dilemmas in digital culture, such as the Internet of Things and crowdsourcing. It uses crowdsourced ethical decision-making in order to make decisions about access.


Much of the canon of critical design, further explored in Chapter 3, concerns itself with the ethical dilemmas presented by new technologies, encouraging audiences to grapple with difficult ethical conundrums that are often obscured by the way new technologies are presented. Superflux’s film Uninvited Guests (discussed in Chapter 6) explores ethical issues around care in the Internet of Things. Choy Ka Fai’s project ‘Prospectus for a Future Body’ (2011) explores the interface between body and machine at a time when technological modification of the body is increasingly prevalent. He directs the movements of performers using software connected to electronic nerve stimulation hardware. In doing so, the project raises questions of control and agency in regard to body modification. These issues are particularly relevant when the organizations developing such devices lack rigorous ethical guidelines, or hold interests that conflict with those of their human users.

Figure 4.4

Still from Choy Ka Fai’s ‘Prospectus for a Future Body’ (2011). The performers are connected to electronic nerve stimulation software, which directs their movements. The project raises critical ethical questions in regard to body modification and technological augmentation.

These practices explicitly position themselves as critical, concerned with engaging audiences in discussions of future technologies and interfaces. For other practices, ethical considerations need to be juxtaposed with the imperatives of distribution, production and profit. What we should design, then, may be the most ethically
important question of all. To this end, a code of ethics will only go so far, since it will describe only how we should comport ourselves as designers. At what point should designers, unhappy with the ethical behaviour of their employers, clients or collaborators, take action? Recently, this question has come into sharper focus with the protests by Google employees against the company’s involvement in Project Maven, an artificial intelligence system for the US military (Fang, 2019), as well as in the removal of works from the Design Museum described previously. The question of what we should do as designers would necessarily involve the setting of thresholds, since all designers, while they may have a shared set of values, do not have the same motivations or priorities. Nonetheless, increasing focus is being drawn to the role of design in shaping the interfaces of technologies that affect everyday life, and open discussion and articulation of ethical conduct, even as it evolves with technology, is going to be central to positive design cultures.


Chapter 5 Aesthetic interfaces


This chapter describes the broad range of aesthetics invoked in the processes of designing digital interfaces, and how designers assess, understand and deploy these to achieve specific results. Although aesthetics are shaped by human perception through the human sensorium, they are to a great degree socially constructed through mutual appreciation and cultural experience. This chapter aims to describe what aesthetics are for, and how they come together in specific ways in digital interfaces to create value and meaning. It also demonstrates how digital interfaces are generating new aesthetic sensibilities that go well beyond the domain of the visual.

Philosophically, the sense of aesthetics used here belongs to the pragmatism of John Dewey, in that it recognizes useful formal conventions that have come to carry meaning through widespread experience and appreciation (Dewey, 1934). The aesthetics of digital interfaces can be understood as formal visual, audio or haptic properties, together with the ways that interacting with these elements makes us feel and understand. Digital interactions often evolve over short periods of time and involve a complex mix of visual, auditory, temporal, haptic, affective and cultural cues that can provide a powerful motive force. This motive force is equally capable of inculcating positive user behaviour or reinforcing negative modes of user behaviour for purposes such as profit. As such, an understanding of how the aesthetics of digital interfaces are produced, deployed and evaluated is an important part of how designers develop value, and of how they may act to manage technological change, helping to ensure that powerful computational technologies are used in ways that are empowering and less prone to reinforcing inequity and bias.


Aesthetics and the senses

The visual aesthetics of digital interfaces are a mix of both cultural and technical production, owing much to the tools of production, economic imperatives and the dominant stylistic conventions developed in the design industries and design schools. A tension between technical usability and stylistic differentiation is clear in much consumer-facing interface design. A strong example of this was the positive reception that greeted the 2013 redesign of the UK government’s websites by the newly formed Government Digital Service (GDS). The radical redesign of many of the UK government’s Web properties was so well received that it won its design team the London Design Museum’s prestigious Design of the Year Award (Chalcraft, 2013), an award usually scooped up by less prosaic and more spectacular entries than utilitarian websites.

Figure 5.1

GOV.UK landing page. A screenshot of the gov.uk government website, which won the 2013 Design of the Year award for its clarity and usability.

The success of the GDS redesign was in its functionality; it took a great deal of friction out of mundane informational tasks associated with interacting with government by, for example, providing a highly legible, utilitarian common visual language across a vast range of services. As Deyan Sudjic, director of the Design Museum, said at the time, the new
website ‘makes life better for millions of people’ (Chalcraft, 2013). In this example, the functional aesthetics were both appropriate and valuable. These functional aesthetics were a product of choices made by the GDS design team, driven in large part by a highly iterative, user-centred process that involved significant user testing of variants. This user-centred process, however, rests upon visual practices from the domains of information design and data visualization, and upon the utilitarian graphic design exemplified by the modernists, more recently corporately embodied by the likes of Apple and Google. In addition, the process rests upon practices that come from systems design and usability experts, and subsequently from the myriad design frameworks produced by large technology companies managing complex interactive products and services. The visual aesthetics that deliver a strong result in the context of GDS’s redesign arise from the successful integration of all of these points of reference, and in the process further develop our communal aesthetic sense.

A simple example of this can be seen in the visual language of hyperlinks on the GDS’s redesign. Despite the design team’s instinct to present hyperlinks without underlines so as to keep them visually simple, iterative user testing showed that links were considerably less legible without the underlining, as this visual cue was so embedded in culture. Despite an attempt at modernization, this core functionality was found to be underpinned by a visual cue from the dawn of the World Wide Web. Visual aesthetics are shaped by our experience of technology over time, and form a part of our culturally constructed stylistic sensibilities. This can be utilized across digital products and services in ways that add value and in ways that are counterproductive.
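Variant testing of the kind described here is typically evaluated with simple statistics. The sketch below compares hypothetical task-success counts for underlined versus plain links using a two-sided, two-proportion z-test; all counts are invented for illustration and are not GDS data.

```python
import math

# A minimal sketch of evaluating an A/B comparison between two interface
# variants: did significantly more users complete the task with underlined
# links than with plain ones? Counts below are hypothetical.

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Return (z, p_value) for the difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical: 181/200 users completed the task with underlined links,
# 152/200 with plain links.
z, p = two_proportion_z(181, 200, 152, 200)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests a real difference
```

A test like this only tells the team that the difference is unlikely to be chance; deciding that legibility matters more than visual minimalism remains a design judgement.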
Developing a sense of visual aesthetics in a digital context can be a valuable asset when negotiating an increasingly digitally mediated world.

Auditory or sonic aesthetics within digital interface design are less present in what has historically been a visually dominated field. However, digital experiences are increasingly developed to include many non-screen-based devices, and sound has an ever-increasing role to play. Recent technical developments, such as Bluetooth, have driven the need for more auditory cues; wearable and wireless audio devices have seen an explosion in popularity, and technology companies have extended their product ecosystems to include sophisticated audio capabilities distributed across various connected devices. In 2016, Apple introduced a new product, the AirPods. These Bluetooth headphones have no visual interface but respond to touch, extending a limited set of interface interactions to a very small wireless in-ear headphone. The auditory aesthetics of such a product become
important, as auditory cues are the only way to provide user feedback on the device and to make the user experience feel like a natural extension of the Apple brand. The aesthetics of these auditory cues are seemingly simple: a cue to alert the user to the fact that a device is connected, a sonic alert when the battery is getting low, and so on. However, how these feel – active, positive, negative but friendly, a call to action – is very much an aesthetic practice. On top of these individual sounds, considering how they fit into an ever-growing brand ecosystem across a multitude of digital experiences can make the management of these aesthetics complex. Apple therefore provides designers and developers with frameworks and resources such as the Human Interface Guidelines – colloquially known as the HIG – which give guidance on how to manage audio feedback and the broader audio context of the interface (Apple Inc., n.d.).
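The logic behind such cues can be illustrated with a small sketch: a screenless device maps state changes to named sounds, and arms and disarms the low-battery alert so that it plays once per discharge cycle rather than on every reading. The cue names and the 20 per cent threshold are invented for illustration and are not drawn from Apple’s guidelines.

```python
# Hypothetical sketch of audio-only feedback for a screenless wearable:
# each user-facing state change maps to a distinct sonic cue, and the
# low-battery alert fires once per discharge cycle. All names and the
# threshold are illustrative inventions.

AUDIO_CUES = {
    "connected": "rising two-note chime",
    "disconnected": "falling two-note chime",
    "battery_low": "soft double beep",
}

class AudioFeedback:
    def __init__(self, low_battery_threshold=0.20):
        self.low_battery_threshold = low_battery_threshold
        self.low_battery_announced = False
        self.played = []  # cues emitted, in order

    def play(self, cue):
        self.played.append(AUDIO_CUES[cue])

    def on_connection_change(self, connected):
        self.play("connected" if connected else "disconnected")

    def on_battery_reading(self, level):
        if level <= self.low_battery_threshold and not self.low_battery_announced:
            self.play("battery_low")
            self.low_battery_announced = True
        elif level > self.low_battery_threshold:
            self.low_battery_announced = False  # re-arm after charging

fb = AudioFeedback()
fb.on_connection_change(True)
for level in (0.5, 0.19, 0.15, 0.9, 0.18):  # discharge, recharge, discharge
    fb.on_battery_reading(level)
print(fb.played)
```

Even in a sketch this small, the aesthetic questions remain: the dispatcher decides *when* a sound plays, but how each cue should feel is the design work the HIG addresses.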

Figure 5.2

Apple AirPod headphones. Bloomberg / Getty Images. First generation Apple AirPod in-ear headphones make use of several audio cues in order to signify different types of interactions. At the time, these cues fell outside what might be considered the normal vocabulary of digital interaction, which was mostly visual, and so they demanded a careful and considered design process, drawing on new experiments in interaction.


It is easy to think of the digital realm as being separate from physical reality, but most interactions with digital interfaces involve a point of contact between a physical device and a person, or a reaction to the movement of the body. Haptics pertain to how the sense of touch contributes to a person’s perception of objects – real and virtual – and to a person’s sense of proprioception. Proprioception is our coherent sense of the relative position of parts of the body in relation to each other, the forces acting on them and our sense of motion. For example, placing the left hand on top of the right hand requires our sense of proprioception. With the proliferation of wearable devices and virtual and augmented reality, hardware-enabled physical experience can be integrated with software to create a specific, integrated user experience.

A good example of haptic design enabled by an impressive integration of hardware and software is the ‘Force Touch’ trackpad on Apple’s line of laptops. In earlier generations, Apple’s laptop trackpads were hinged and spring loaded – meaning you could depress the trackpad slightly to elicit the ‘click’ involved in standard system navigation. This design was replaced by Force Touch, a new system in which the trackpad does not move. Instead, the system senses the force of a user’s touch, and when it detects a ‘click’ it triggers a vibration engine that simulates the feel of one. That the result is so convincing is indicative of how powerful the sense of proprioception is, even in the smallest of interactions such as a ‘click’. To trick proprioception so fully is a striking example of haptic design. Reproducing an authentic ‘click’ by synthetic means requires considering the aesthetics of what constitutes a ‘click’ in the user’s experience, an experience built up by the embodied memory of previous physical experiences.
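The sensing logic behind a force-based ‘click’ can be sketched as a toy model: sample the touch force, fire a haptic pulse when it crosses a press threshold, and use a lower release threshold (hysteresis) so that a finger hovering near the boundary does not re-trigger the pulse. The thresholds and normalized force values below are illustrative, not Apple’s.

```python
# Toy model of a Force Touch-style "click": a non-moving trackpad samples
# touch force and fires a haptic pulse when the force crosses a press
# threshold. A lower release threshold (hysteresis) stops the pulse from
# re-firing while the force hovers near the boundary. Values are invented.

PRESS_THRESHOLD = 0.60    # normalized force needed to register a click
RELEASE_THRESHOLD = 0.40  # force must drop below this to re-arm

def detect_clicks(force_samples):
    """Return the sample indices at which a haptic 'click' pulse fires."""
    pulses = []
    armed = True
    for i, force in enumerate(force_samples):
        if armed and force >= PRESS_THRESHOLD:
            pulses.append(i)  # trigger the vibration engine here
            armed = False
        elif not armed and force <= RELEASE_THRESHOLD:
            armed = True      # finger eased off; ready for the next click
    return pulses

# A press, a noisy hover near the threshold, then a second press:
samples = [0.1, 0.3, 0.7, 0.65, 0.55, 0.45, 0.3, 0.2, 0.8, 0.5]
print(detect_clicks(samples))  # → [2, 8]
```

The hysteresis gap is what makes the synthetic click feel decisive rather than jittery; tuning that gap, and the timing and intensity of the pulse itself, is where the aesthetic work of haptic design lies.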
Technology like Force Touch represents an important extension of the digital interface into physically reactive surfaces that engage the senses to create feedback models specific to software environments and contexts. Given the technical complexity required to achieve good haptics, the formal properties of haptic feedback are hard to define and generalize, and touch has consequently been relatively understudied in the context of digital interface design (Carbon and Jakesch, 2013). What is clear is that haptics will play an increasingly important role in the emerging ecology of smart watches, fitness trackers and other wearable devices, as well as in virtual and augmented reality.

Temporal aesthetics are perhaps more ephemeral to identify and describe than any other major realm of aesthetics within digital interface design. They can be described broadly as 'how temporal experience can carry meaning'. This can be seen in Google's treatment of page load times as a key driver of positive Web user experience (Google Inc., n.d.a.) and, more interestingly, in the recent slew of conversational interfaces built on top of natural language processing platforms such as Microsoft's 'Cortana' or Amazon's 'Alexa'. These interfaces ostensibly allow the user a hands-free experience, enabling them to conduct other tasks simultaneously. In conversational interfaces, the vocal cadence of the assistant, the speed of response and the timing of interjection are powerful carriers of meaning. These concerns may seem prosaic in everyday use, but they become vital in the context of an autonomous vehicle attempting to refocus the driver's attention in an emergency. The death of the driver of a semi-autonomous Tesla car in 2017 (Hull and Smith, 2018) drew much attention to the interactions between driver and software. The software was designed to return control of the car to the driver in an emergency, but something clearly went wrong in a situation in which reaction times are vital.

Cultural aesthetics and meaning

'Cultural aesthetics' is a term that describes the unpredictable way in which sensory aesthetics, while not inherently meaningful, acquire meaning through their cultural associations in use. This fact is often so easy to see that it goes almost unnoticed – for example, the use of green and red to indicate positive and negative outcomes, go and stop. These powerful cultural associations evolve all the time as more of contemporary life becomes mediated by digital experience. As these cultural aesthetics emerge, they become useful for designers and can become integral to the designed experience of digital products and services.

One area of digital interface design that has seen rapid development and spurred cultural debate is voice interfaces, where auditory aesthetic approaches pose larger social questions. As personal intelligent assistants (PIAs), such as Amazon's 'Alexa', have become more prevalent, researchers and commentators have pointed out that these assistants have been gendered, largely as female, through their voices and their models of conversational response (Neff, 2016). The decision to furnish PIAs with reassuring and subservient female voices has drawn robust criticism but, in part, owes much to aesthetic assumptions at play in the early stages of voice interface design (Bogost, 2018). The voice interface designers and developers responsible for choosing the tone, cadence and response model of the voice of these PIAs relied on an uncritical set of personal aesthetic preferences, usually the anachronistic aesthetics of a female secretary or a mother. Given this, it can be argued that the stereotypical reassuring and subservient female voices and answers of these PIAs perpetuate the idea that assistants and caregivers are female roles, at a time when patriarchal assumptions like these are widely viewed as destructive and limiting of human potential. It is of concern that these dominant auditory aesthetics in conversational interfaces have become so ubiquitous, as they encourage negative interactions with female technology personas. Noticing these emerging issues, Amazon updated the 'Alexa' platform to include a disengage mode, in which it provides a response such as 'I'm not sure what outcome you expected' when asked sexually explicit questions (Crum, 2018). Designers hold a critical role in ensuring that social and cultural biases such as sexism, previously embedded in the aesthetics of advertising and pop culture, do not become embedded in digital interactions. Organizations like 'Feminist Internet', amongst others, engage designers and students in workshops, and consult and engage the public in discourse about these value systems, in an attempt to create a more equitable digital cultural aesthetic through our devices (Feminist Internet, n.d.).

Another simple example of the cultural power of aesthetic decisions within digital interfaces is Apple's 'iMessage' platform's use of coloured text bubbles. Apple chose to colour SMS text message bubbles green, as opposed to its own native iMessages, which appear in blue bubbles – a simple indication to users of the cost of messages.
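A behaviour like the disengage mode described above can be illustrated with a simple response filter. This sketch is purely illustrative and is not Amazon's implementation: the trigger list and the fallback reply route are invented, with the disengage phrase following the example reported by Crum (2018).

```typescript
// Hypothetical sketch of a "disengage mode" filter for a voice assistant:
// harassing or sexually explicit utterances get a flat, non-engaging reply
// instead of a playful one. The trigger list is invented for this example;
// a real assistant would use a trained classifier, not keyword matching.

const DISENGAGE_TRIGGERS = ["sexy", "date me", "marry me"];

function respond(utterance: string): string {
  const lowered = utterance.toLowerCase();
  if (DISENGAGE_TRIGGERS.some((trigger) => lowered.includes(trigger))) {
    // Deliberately flat response: refuses to play along with the persona.
    return "I'm not sure what outcome you expected.";
  }
  // Placeholder for the assistant's normal answer pipeline.
  return `Here's what I found for: ${utterance}`;
}
```

The design decision being modelled is tonal rather than technical: the disengaging reply withholds the warmth of the persona, signalling that the interaction pattern is not one the system will reward.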
Over time, developers and commentators began to notice that iPhone users had a strong preference for the blue bubbles of other iPhone users and explored this issue, finding that younger users commonly expressed such views as, 'Texting someone with green bubbles is such a turn off. I'm petty' and, 'I'll never seriously date a guy who texts with green bubbles. I'm sorry. But never' (Ford, 2015). These statements demonstrate that the colour of text bubbles has, for some, become a marker of specific cultural associations and values. The associations users draw to blue bubbles are consonant with perceptions of Apple products as somehow premium or exclusive, and acting on these perceptions reinforces social hierarchies. These hierarchies are further reinforced by the technical exclusivity of the system: leaving the Apple ecology of products means leaving behind the archives of 'iMessages' (Stern, 2018).

The cultural weight of aesthetics can also be leveraged for meaning and critique, as can be seen in the critically oriented practice of artists and designers such as James Bridle. Bridle's work often co-opts the aesthetics of digital culture to make statements about the increasing use of remote technologies to prosecute wars at a distance, and explores the cognitive dissonance opened up by such technologies. His work has also included an exploration of what he has termed the 'New Aesthetic', which exposes 'images and things that seem to approach a new aesthetic of the future' (Bridle, 2012). These are largely formed from the vernacular language of digital and Web culture, taking in glitch art, compression artefacts, render ghosts and other visual cues common to a life lived increasingly through digital interfaces. The 'New Aesthetic' also includes references to many of the tools used in the generation of digital experience, such as operating systems and 3D software, and the non-organic geometries these tools produce. One of Bridle's projects – Dronestagram – exploits the tendencies of social network aesthetics, and is explored in Chapter 3.

The imposition of digital aesthetics into everyday life has also spawned a range of creative practices beyond design that parody or critique this aesthetic hegemony. Glitch art exploits flaws and failings in digital processes, while the music and visual genre of 'vaporwave' plays on the tendencies towards copying, duplication and fidelity enabled by digital culture.

Figure 5.3

Blinx7 by artist Rosa Menkman. Copyright Rosa Menkman, Creative Commons 2.0 generic licence. Menkman's work exploits the usually unintentional side effects of digital processes – glitches. Glitches are errors arising in the compression or processing of video or audio data, but the resulting aesthetics have inspired artists.


Design patterns and behaviours

In the late 1970s, architect and design theorist Christopher Alexander described what he termed 'design patterns' in his seminal book A Pattern Language (Alexander, 1978). Design patterns can be described as reusable forms of a solution to a design problem. Alexander observed that there was often a core way to resolve design issues, from the layout of streets, to effective lighting in restaurants, to arranging interior elements to encourage social interaction. For Alexander, documenting a significant number of design patterns allowed you to develop a framework for working through particular design problems in a modular and ordered way, without predetermining the solution. This powerful idea has since been taken up by other disciplines – not least by computer science, which frequently uses modular approaches to software design problems.

The computational tools of digital interface design reunite the design patterns of Christopher Alexander with the design patterns of software development, as the contours of our experience are extended into a digital domain made from software. Because the scope of the digital domain is often much more coherent and modellable, in a technical sense, than the less predictable real world, the design patterns that prove useful within common digital interfaces, such as Web and smartphone applications, spread quickly through mass adoption. This is enabled both by the fact that dominant Web and smartphone applications are made of highly modular software design patterns, and by the fact that successful interface design patterns can be implemented quickly if they prove popular. For an example of an influential framework that marries the design patterns of software with the design patterns of responsive Web interfaces, we can look to the Bootstrap framework (Bootstrap, n.d.), which delivers modular user interface design solutions for Web applications.
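The modular character of such frameworks can be illustrated with a small helper that assembles an interface component from a base pattern plus reusable modifiers. The class names follow Bootstrap's documented button conventions (`btn`, `btn-primary`, `btn-lg`), but the helper itself is an invented sketch, not part of the framework.

```typescript
// Sketch of the modular pattern behind frameworks like Bootstrap: a
// component is assembled from a base pattern plus reusable variant
// modifiers, rather than being styled from scratch each time.

type Variant = "primary" | "secondary" | "danger";
type Size = "sm" | "lg";

function buttonClasses(variant: Variant, size?: Size): string {
  const classes = ["btn", `btn-${variant}`]; // base pattern + colour variant
  if (size) classes.push(`btn-${size}`);     // optional size modifier
  return classes.join(" ");
}
```

Calling `buttonClasses("primary", "lg")` yields `"btn btn-primary btn-lg"`. The point is the pattern, not the helper: every designer using the framework draws from the same small vocabulary of modular solutions, which is precisely how shared aesthetics emerge.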
The modular commonality of much digital interface design is easy to understand, since many interface design problems have common technical solutions. It is also important to recognize that within these frameworks are aesthetic assumptions that are not technical in nature and belong to the design patterns of Alexander, in that they have previously proven to be preferable solutions for people. This patterning may, however, limit innovation if designers consistently reuse old patterns without exploring alternatives. The Bootstrap framework's name also says much about the politics and economics of the culture that created it. Bootstrapping implies economic self-reliance, frugality and ingenuity; it has also come to signify heroic individual effort. It is worth considering the influence of these ideas as they come to be embedded in both the language and tools of digital interface design, where they can reinforce design patterns that prioritize one set of values to the exclusion of others. The subtle and considerable power of digital tools to shape our values is explored elsewhere in this book, but it is worth mentioning here as part of how aesthetic values are conceived.

Figure 5.4

Bootstrap. Bootstrap is a common framework used in Web application development. The widespread use of this technical framework as a pattern means that designers' choices come to reflect the structure of the framework, which in turn shapes the resulting aesthetics.

An example of this in action is the valorization of the 'frictionless' or 'seamless' interface or transaction. Many retail services attempt to reduce friction in the interactions of a customer or user in order to streamline the shopping experience and encourage quicker, higher-volume shopping. This seamless aesthetic permeates other services that benefit from lowering the cognitive load of users – the amount of time they spend deciding on interactions. For example, seamless logins ease a user's journey across websites and services, but also enable companies to gather and compare data quickly and easily. Increasingly, we come to expect this behaviour from digital platforms, and designers are taught that points of friction in digital interactions are generally negative. However, an increasing number of designers and theorists espouse the value of 'seamful' design, where the interaction or experience is interrupted and the user is forced to make decisions. This was first suggested by ubiquitous computing mavens Mark Weiser and John Seely Brown in 1997, when they proposed that 'the unit of design should be social people, in their environment, plus your device' (Brown and Weiser, 1997), and that digital interfaces should not seek to be seamless but instead 'seamful', allowing people to be aware of how the technologies they use interface in different ways with the various aspects of their lives. The benefits are ethical as well as aesthetic: seamful designs allow the user to give or receive feedback, critically consider their decisions or take their own time (Chalmers and MacColl, 2003).

Away from the broad implications of digital platforms and services, digital interface design patterns emerge alongside new technologies in interesting and varied ways, and with significant speed. A good example of an interface design pattern that saw extensive adoption as touch screens became widespread is the 'pull-to-refresh' gesture. This familiar gesture is widely credited to former Apple engineer Loren Brichter, who added a simple way to refresh a timeline to his early Twitter client application as an extension of the scrolling motion (Spencer, 2012). The experience proved satisfying to a large number of users and felt like a contemporary evocation of the new platform and form of the smartphone, which resulted in it being quickly emulated by others. This example shows how the design patterns of Alexander emerge within digital interface design, as the aesthetics of use become tied up with technology and software innovation.
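The pull-to-refresh pattern can be reduced to a small piece of gesture logic: resist the pull so it feels elastic, then trigger a refresh only when the pull is released beyond a threshold. This is an illustrative sketch, not any particular app's implementation; the resistance factor and threshold are invented, and production versions add animation and velocity handling.

```typescript
// Sketch of pull-to-refresh gesture logic. The user drags down past the top
// of a scrolled-to-top list; the visible offset moves with resistance so the
// pull feels elastic, and a refresh fires only if the (resisted) pull has
// crossed the threshold when the finger is released.

const REFRESH_THRESHOLD = 80; // pixels of resisted pull needed to refresh
const RESISTANCE = 0.5;       // content moves at half finger speed: elastic feel

// Visible offset of the list content for a given downward drag distance.
function pullOffset(dragDistance: number): number {
  return Math.max(0, dragDistance) * RESISTANCE; // upward drags clamp to 0
}

// Decision made at the moment the finger lifts.
function shouldRefreshOnRelease(dragDistance: number): boolean {
  return pullOffset(dragDistance) >= REFRESH_THRESHOLD;
}
```

The aesthetic judgement lives in the two constants: too little resistance and the gesture feels slack; too high a threshold and the refresh feels withheld. Tuning these numbers is exactly the kind of embodied, felt decision the chapter describes.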

Figure 5.5

Twitter's pull-to-refresh patent. The interaction was born as a natural extension of the scrolling motion already common to smartphones. This example demonstrates how aesthetics are subject to, and create, new patterns.


Interface design patterns reference preferred ways of solving common problems that satisfy user expectations and needs, but they also give rise to a more nefarious phenomenon: dark patterns. User experience specialist Harry Brignull coined the term 'dark pattern' in 2011 to describe interface design patterns deliberately employed to exploit or deceive the user (Brignull, 2011). Many examples of dark patterns exist. Brignull calls one of the most common the 'roach motel', in which websites hide the ability to delete a digital account behind layers of obscure interface choices and language, or extraneous verification processes. Other examples can be seen in popular smartphone games, where in-app purchases use the same visual language as progressing between levels. Another well-cited example is a banner advert for an SMS text service called Chatmost, which included some errant dark pixels that looked like specks of dust, encouraging touchscreen users to 'clean' the screen over the advert and inadvertently click it.

Figure 5.6

Dark patterns. Dark patterns exploit users' intuitions to encourage them to click on adverts or make micropayments, or to prevent them from leaving services.

Beyond the subtle trickery of dark patterns, there are also many intentional deceptions that utilize standard patterns to deceive users. Phishing, in which legitimate websites or emails are duplicated in order to extract private information from users, is a good example. Digital aesthetics, particularly visual ones, are easily replicable with modest skills and, as with anything connected, security should always be a concern of the designer.


Design patterns become standard templates of aesthetics for use in multiple situations. Users can understand new interfaces based on the experience of previous patterns, and so designers need to conform, at least to a degree, to those expectations in order to ensure usability. However, design patterns should also be challenged. The seamless experience can be critiqued as not engaging the user fully in their choices, making it easy to make purchases or decisions they may regret. Equally, these patterns and aesthetics can be maliciously deployed by those using dark patterns to trick and exploit users.

Aesthetics for use

The increasing prevalence of digital interfaces has given rise to what is often called user-centred design. Simply put, this type of design process aims to capture the user experience of a digital product or service, break it down into its component parts, and refactor it to be as friction-free as possible. Embedded in this idea of 'frictionlessness' is the assumption that users do not want to think consciously about the process of achieving a task, and that anything adding cognitive load during interface interaction is detrimental. This idea has become more consequential as poor user experience of digital products and services is widely seen as damaging to brands in the increasingly digitally integrated product ecosystem of the contemporary global economy. Usability consultant Steve Krug's 2005 book Don't Make Me Think was, and remains, an influential example of this approach to user-centred design, in which conscious effort is the mortal enemy of usability (Krug, 2005).

Within the field of user-centred design, aesthetics are often understood and deployed in utilitarian ways: visual designers will have an understanding of the utility and power of individual aesthetic elements and will try to balance these with the less utilitarian concerns of look and feel that underpin brand experience and the overall cohesion of digital products and services. The trade-off involved can result both in over-rigorous visual approaches that come from an engineering perspective and in exuberant design excess manifesting in intensely frustrating interface design. Google's transition to the 'Material Design' framework represents a fundamental reorientation of its interface ecosystem away from a metrics-based, utilitarian approach to interface design decisions, towards one that prioritizes a formal aesthetic framework to manage user experience across a huge and evolving digital ecosystem (Google Inc, n.d.b.).
As well as establishing formal frameworks for digital ecologies, designers often turn to skeuomorphism. Skeuomorphic interface design describes design in which digital visual objects mimic relevant real-world objects – for example, digital dials on radio apps that look like retro radios, or game controls that represent classic arcade game buttons. This approach is popular as it is often viewed as a shortcut to user understanding. Such thinking had clear benefits in the early days of digital interfaces, given that many of the common affordances of digital interaction we have come to know were yet to be formed. Skeuomorphs have since become less common, as contemporary forms of digital interaction, such as pull-to-refresh, have been normalized. In fact, some are so common that reports suggest children learn the 'swiping' interaction of touchscreens before many other interactions, and attempt to use it on other artefacts like books (Turner, 2018). Apple purged skeuomorphic interface design in the seventh iteration of its iOS operating system because it 'understood that people had already become comfortable with touching glass, they didn't need physical buttons, they understood the benefits' (Ive, J. in Hein, 2013). It is likely that skeuomorphs of analogue devices will soon be obsolete, but the method of drawing on past behaviours and symbolic systems to enhance usability through aesthetics will continue. Designers working today should consider how future systems, such as autonomous vehicles, artificial intelligence or virtual reality applications, will have to be designed with the existing expectations of users in mind.

Aesthetics for empathy

Design has been described as an act of empathy: imagining the needs of someone else and embedding them into an object, product or service 'that offers its empathy to anyone for as long as the product exists. A chair might be crafted by someone to afford bodyweight relief to his or her loved one, but once materialized, that chair will service anybody who comes upon it' (Tonkinwise, 2016). This is a useful frame for the production of digital interfaces. Asking how an interface promotes empathy between people, for example, would be a good place to start when designing social Web platforms. The consequences of not asking these kinds of questions can be seen in the many forms of abuse and exploitation that digital interfaces have enabled. One design tool that user experience designers use to build empathy into their products is personas. Personas aim to develop an understanding of individual user needs, and so provide a framework for evaluating the success of a particular interface design approach.
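At their simplest, personas are structured records that a design can be checked against. The sketch below shows one hypothetical way to represent a persona and surface unmet accessibility needs; the fields and the example persona are invented, and real personas are grounded in user research rather than code.

```typescript
// Illustrative sketch of personas as a lightweight design checklist.
// The field names and example data are invented for this example.

interface Persona {
  name: string;
  goals: string[];               // what this user is trying to accomplish
  accessibilityNeeds: string[];  // e.g. "screen reader", "large text"
}

interface DesignSpec {
  supportedAccessibility: string[]; // what the current design accommodates
}

// Returns the persona's accessibility needs that the design does not yet meet.
function unmetNeeds(persona: Persona, design: DesignSpec): string[] {
  return persona.accessibilityNeeds.filter(
    (need) => !design.supportedAccessibility.includes(need),
  );
}
```

Running a set of diverse personas through a check like this before user testing can surface gaps early, though, as discussed below, the personas themselves can encode the biases of whoever wrote them.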


Figure 5.7

Personas. Personas are speculative or fictional personalities, researched and developed in order to test a design before user testing. By considering the needs and concerns of a range of fictional users, designers can better foresee the potential for negative and positive experiences with their products.

While the utility of this approach seems straightforward, UX researchers have discussed the potential of design personas to reinforce biases and perpetuate Western-centric ideas of 'user needs'. In codifying what 'normal users' are, we run the risk of excluding many people through a reductive idea of who we are designing for. The structure of participation represented by large online products and services makes this an acute risk for the digital interface designer (Marsden and Haag, 2016). In considering the emerging aesthetics of empathy for users of digital interfaces, the design theorist Cameron Tonkinwise describes how design as a form of empathy might be considered: 'Not only inspired by a fundamental empathy for someone who is suffering, but is structured by that empathy — the form of what is made materializes the specifics of the wish that "pain be gone" in someone. Only this account explains the animism of designs, their capacities to be actively sensitive to someone's capabilities and needs' (Tonkinwise, 2016).

Much like the other aspects we consider in this book, aesthetics are fed by, and feed, the tendencies existing in digital interfaces and culture. They are informed by previous aesthetics as well as by the behaviours designers seek to encourage, and these designs in turn go on to shape future aesthetic choices. As in other chapters, as the forms of digital interfaces continue to proliferate, designers will need to be prepared to reflect on how the choices they make will influence their users and future designers. Aesthetics are the most sublime yet vital part of everyday experience: good choices can make an interface intuitive and easy to use, but intuitiveness can equally be exploited for deception.


Chapter 6

Uncertainty, deviance and futures


This chapter explores the connection between how digital interfaces are imagined and built, and the mutual feedback loop that exists between future technologies – real and imagined – and present innovation. We explore this in the particular context of uncertainty, because specific methods and tools of design practice have evolved to embrace the uncertainty of the future with the aim of rendering it malleable in the present. Moreover, with the enormous social, political and ecological changes the world is currently undergoing, embracing uncertainty as an attitude provides opportunities for the designer to engage meaningfully with these difficult issues. By exploring the ways in which technologies might develop and multiply through some of the ideas here, designers can be more responsive, adaptable and responsible in regard to the things they create and how they are deployed in the world.

Thinking about uncertainty goes hand-in-hand with thinking about and, to a degree, working in the future. The contemporary design and technological landscape is naturally geared towards a sense of futurity but, in the long arc of history, a technologically driven, future-facing culture is relatively new. In Europe, futurity grew up during the Industrial Revolution, with the technologization of everyday life: despite the enormous human cost of industrialization, there was a prevailing optimism about the quality of life that new technologies would bring. Yet despite almost two hundred years of future images, from postcards like the one on the following page to blockbuster Hollywood science fiction, the future remains suffused with uncertainty. While the future visions of science-fiction cinema and design renderings undoubtedly become beacons for development, creating desirable directions for innovation, they should not be interpreted as predictions. This is important because adapting and being responsive to change in an uncertain world is more useful than making predictions, which, more often than not, turn out to be wrong.


Figure 6.1

'Postcards from the Future', Jean-Marc Cote (1899). Copyright Jean-Marc Cote, public domain. Images of the future grew up in the Industrial Revolution with the technologization of everyday life, as technology opened up new possibilities for future technologies in the imaginations of the public. This series of postcards from the turn of the twentieth century speculated about the year 2000.

Figure 6.2

The Futures Cone. The Futures Cone, first described by Joseph Voros, provides a diagram of the future: as we look forward from our current position, the range of possibilities expands further into the future. This diagram is often used by designers as a way of describing the value of speculating about the future in relation to uncertainty. However, it is not without criticism, due to its simplistic nature.


The Futures Cone, attributed to Joseph Voros (Voros, 2017: 11), is an image often used in design to describe the relationship between futures and uncertainty. In this simple diagram, we look toward the future, where the range of possibilities increases and, with it, the uncertainty of outcome. This relatively simplistic model is not without its detractors, as it assumes a fixed viewpoint, carries a particularly western interpretation of futurity and does not factor in the power of the past to influence the future. For our purposes, however, it provides a useful starting point: by usefully describing the indeterminacy of future conditions, it makes the case for embracing uncertainty as a responsible attitude to design.

Embracing uncertainty

There are good reasons to embrace uncertainty in design. Let's begin by discussing the conditions in which technological products are developed, and how these conditions affect their imagined deployment. Any designed object is necessarily a product of the research and contextualization that went into it. An easy example is the design of the mobile phone. The mobile phone in its earliest forms – from which the smartphone has evolved – was designed to fit in pockets, because of the unquestioned design context in which it was mostly conceived: by and for men with pockets. If early phone designers had embraced uncertainty and the multiplicity of possible users (phones for women, children, the elderly), would they have arrived at the same dominant form factor?

A specific case for embracing uncertainty is made by initiatives such as circular design (MacArthur, 2017), a rich field of design which in part aims to recognize the uncertainty of the environmental impact of post-use products and, through design, turn them into raw materials for new product life cycles. Viewed from this perspective, circular design can be seen as a project of mitigating uncertainty around waste and material scarcity. Circular design principles make it clear that to address uncertainty, designers need to consider supply chain dynamics, technological development, social attitudes to specific objects and a range of plausible futures. Sticking with the example of the smartphone, we can speculate about how its form factor might differ under circular design principles. Google's Project Ara and others like it aimed to develop a modular phone that would let users upgrade elements of the phone independently, rather than replace the whole product every twelve months at huge financial and environmental cost.


Research projects, such as Ara, provide designers with the opportunity to mitigate uncertainty by exploring alternatives. While debate continues about electronic waste and the cost of consumer habits, such projects provide scope to materially explore the uncertainty of future developments.

Figure 6.3

Google's Project Ara. Google's Project Ara was a speculative prototype for a modular phone. The concept was that outdated or damaged components could be quickly and easily replaced by users, thereby reducing waste.

Science fiction and design

Beginning at the turn of the twentieth century, science-fiction films established themselves as a powerful genre and became a mainstay of popular culture. As they did so, a dialogue emerged between fiction and technological innovation. This section explores the way in which 'SF [science fiction] plays an important role in the shaping of desire – for change, for progress, for novelty, for a sense of wonder and of discovery' (Bassett, Steinmueller and Voss, 2013), particularly in regard to how we build expectations and imaginaries of future technology drawn from science-fiction narratives.


Steven Spielberg's 2002 science-fiction blockbuster Minority Report provides an excellent example of the impact of science-fiction narratives on real innovation. The film created several artefacts of fictional technology that became beacons of innovation in subsequent years – amongst others, iris and facial recognition technology, predictive policing systems and targeted advertising. For the subject of interfaces, the gestural interface seen in early parts of the film is perhaps the most lucid realization of the connection between science fiction and real-world innovation in recent memory.

Figure 6.4

Film still from Minority Report (2002). Copyright Steven Spielberg. The gestural interface from an early sequence in the film Minority Report. The character wears gloves that allow him to move elements around the screen in an almost haptic way. This has an obvious aesthetic appeal to designers.

The gestural interface of Minority Report ‘defined the future of user interface design’ and became the predominant go-to headline and driver for interface technology (Gorczyca, 2017). The fictional system was designed by John Underkoffler, the film’s technical advisor. An alum of the MIT Media Lab, he would later found Oblong Industries to develop the interface system further as G-Speak, a ‘spatial computing platform’ (Oblong Industries, n.d.) using the same principal forms of interaction demonstrated in the film. Meanwhile, the interface witnessed by millions in the original film generated dozens, perhaps hundreds, of spinoff start-up companies looking to capitalize on the cultural weight of the film’s technological imaginaries by developing gestural interfaces of their own. The influence is traceable in the development of mainstream


devices like Microsoft’s now defunct Kinect system and the core principles of virtual reality interaction (Loughrey, 2017).

The use of science and technical advisors in Hollywood is well explored in David Kirby’s book Lab Coats in Hollywood (Kirby, 2010b). Increasingly, expertise in technology and science is brought into cinema to lend realism. However, doing so also shapes viewers’ perceptions of the fields and technologies depicted. In other words, the gestural interface in Minority Report was so appealing to the hundreds of individuals and organizations who would attempt to duplicate it precisely because it was so well designed by John Underkoffler. This isn’t to suggest that without Minority Report the gestural interface would never have been developed – it was already in development when Underkoffler was demoing its principles in the film – but that the cultural imagination around these types of interface inevitably draws comparison with the film.

Science-fiction cinema relies on a high level of fidelity in order to convince viewers of the plausibility of the narrative. The creators of science-fiction narratives, both in film and text, aim to convince their audiences to willingly suspend disbelief. This concept of suspending disbelief places the onus on the writer or filmmaker to create a narrative world so compelling that viewers or readers do not question its veracity. One particularly notable pioneer of this technique was Stanley Kubrick’s 1968 film 2001: A Space Odyssey. The renowned science-fiction film is often noted for Kubrick’s intense attention to detail, working with world-leading furniture designers, NASA and US Air Force advisors, and even Marvin Minsky, one of the forefathers and theorists of AI.
Similar to Minority Report, this attention to detail resulted in incredible design, technological and scientific accuracy that has contributed to 2001: A Space Odyssey remaining a cornerstone of the corpus of science-fiction imagination to this day.

The implications of these approaches for those working expressly with design fiction – using fiction to demonstrate or explore a product or scenario concept rather than to entertain or tell a story – rather than science fiction are explored further on. However, it’s worth discussing how and why rich and detailed images of fiction become believable. American academic Carl DiSalvo has highlighted the way in which designers use certain techniques in order to compel audiences to believe in the realism of future visions: [The] activity of making apparent is pursued with sophisticated attention to the aesthetic characteristics of possible future conditions. The products, models and photographs — the choice of materials, colors, shapes, and composition — are deftly fashioned. The projection


is plausible and persuasive because the representations are so easily consumed in the present (they are visually striking) and imaginable to be consumed in the future (they appear like we envisage such ‘real’ products would appear). DiSalvo, 2009: 55

In other words, the richness of images of the future, whether in cinema or product renderings, is what makes them immediately believable in a way that simple sketches would not be. Joel McKim refers to the ways in which images of the future are represented as having a ‘foreclosing’ property, stating that the ‘protocols, conventions and prompts’ of future images perform an ‘aesthetic conditioning’ (McKim, 2017: 290). McKim’s idea is that aesthetically rich images prevent us from imagining alternatives precisely because they are so rich, and that the saturation of popular culture, cinema and even the built landscape with these images preconditions us into an almost permanent state of suspension of disbelief.

Here, we find a rationale for the continued popularity of the dream of the Minority Report gestural interface. Over fifteen years later, the world has still not seen a satisfactory deployment of the technology beyond highly specialized environments. One could simply argue that it is a bad design, overthrowing the conventions of user experience and attempting to implement new ones (Norman, 2010), and even John Underkoffler himself admits that such interfaces are antithetical to the function and operation of contemporary computational systems (Arthur, 2015). However, the system of belief is maintained because it should work, rather than because it does. The vision is so rich and compelling in its design and representation that, as media theorist Erik Davis suggests, ‘in reflecting the “as if” character in the world, they are actually realer than they appear’ (Davis, 2015: 375).

Design fiction

It’s arguable that design has always been somewhat fictional and that ‘speculation is part of every designer’s practice’ (Ward, n.d.) in the sense that, as a design is developed and tested, it is based on assumptions and speculations (no matter how well evidenced) about the conditions in which it will be deployed. In recent years, however, the intentional use of fiction as a way of engaging designers, users and organizations has become a standard tool under the broad rubric of ‘design fiction’. Design fiction, as


defined by Bruce Sterling, is ‘the deliberate use of diegetic prototypes to suspend disbelief about change’ (Bosch, 2012). This definition is worth exploring further. Diegesis is a concept borrowed from film, referring to the internal world of a story. Within the world of the fiction, diegetic prototypes are ‘objects that function properly and which people actually use’ (Kirby, 2010a: 43). In the ‘real’ world, they are props and sets. It’s the sophisticated design and use of these elements, suspending our disbelief that they are in fact props and sets, that transforms them into diegetic objects and places. In other words, design fiction uses convincing objects and scenarios from fictional worlds in order to draw us into a narrative discussion of how things might otherwise be.

As a technique, design fiction is deployed for all manner of purposes. It was first developed in the context of critical design – creating designs that question and critique social and cultural tendencies – but it is also easy to see it applied in the context of advertising. For example, Microsoft’s popular series Productivity Future Vision (Microsoft: Productivity Future Vision, 2015) uses fictional interfaces and devices to tell stories about ways in which people might work in the future, demonstrating Microsoft’s development direction.

Figure 6.5

Still from Microsoft’s Productivity Future Vision (2015). Copyright Microsoft in Business. Microsoft and other technology companies use design fiction to communicate their ambitions for future development. None of the products in the Productivity Future Vision is available; they are used to help innovate and ideate.

It’s through analysing these various purposes of design fiction – from critical discussion to product advertising – that we can start to unpack the techniques used. The technology world is currently suffused with design fictions celebrating the transformative power of new products. Similar to Microsoft’s work, these films, images and prototypes are usually deployed in line with the aesthetic expectations of consumers, drawing on tropes of science fiction, in order to slide easily into the ‘aesthetic conditioning’ of the current time. Devices and surfaces are clean, smooth and operate without difficulty. These images are extremely effective, and


convey a sense of futurity to users, as they inevitably line up with the way that science-fiction cinema suggests the future will look and feel. As a result, these images become effective tools for advertising a convincing future, since they do not challenge existing assumptions.

Design fiction also provides useful opportunities for thoughtful research and development beyond advertising. Near Future Laboratory’s work has consistently concerned itself with engaging clients, stakeholders and other designers in reconsidering the future of digital interfaces. Their seminal ethnographic work, Curious Rituals, is an in-depth examination of the ways in which ‘technological objects are domesticated by people, integrated into their own daily routines. Fixing strategies, nervous tics, device juggling or courtesy postures’ (Chiu, Kwon, Miyake and Nova, 2012). Near Future Laboratory deployed ethnographic techniques, observing users of interfaces and devices ‘in the wild’. These behaviours were then described, analysed and used to create a design fiction film about what future ritualistic forms of interaction might come with emerging technologies.

Figure 6.6

Still from Curious Rituals (2014). Copyright Near Future Laboratory. The film Curious Rituals was a design fiction responding to in-depth research about the behaviours people develop with technology. It speculated on what new behaviours might emerge with future technologies.

In response to this research, they produced a short design fiction film, also called Curious Rituals (2014), which brought the principles discovered through observation to bear on future technologies such as autonomous vehicles and virtual reality. In doing this, Near Future Laboratory sought to critically engage designers in consideration of


how technologies are affected by their social and political contexts and how they are adapted or misused, sometimes in direct contradiction of regulation or legal frameworks. One of the members of Near Future Laboratory, Nick Foster, proposed principles for designers to adequately engage with the political context of their users in an essay titled ‘The Future Mundane’:

1. The Future Mundane is filled with background talent.
2. The Future Mundane is an accretive space.
3. The Future Mundane is a partly broken space.
Foster, 2014

Foster highlights that the lives of users are complex and messy, full of broken equipment, failing services, unconsidered needs and strange ambitions. The technological experience of the average user is rarely like that of Microsoft’s Productivity Future Vision or the pristine conditions of product testing. In fact, by these rules and in line with critical practice, it might be easier to say that there is no average user.

What is the purpose of this kind of ‘messy’ design fiction? If we return to the notion that all design is speculative in the sense that it assumes certain conditions of its users and deployment, then these types of fictions invite a more rigorous engagement with the uncertainty of future conditions. While Microsoft’s visions assume a world of functioning social structures, nuclear families and stable infrastructure, critical design fictions (sometimes under the more specific description of ‘speculative design’) use their devices, interfaces and technologies as windows into other ways of living, exploring the cracks, flaws and breakages in the everyday experience of technology.

Comprehending, considering, provoking and predicting futures are cornerstones of contemporary design practice. It’s the responsibility of the designer to consider the context in which their work is deployed. This allows the designer and user to exist in a dialogue, thinking through the future conditions in which an interface or technology will be lived with.

Foster highlights that the lives of users are complex and messy, full of broken equipment, failing services, unconsidered needs and strange ambitions. The technological experience of the average user is rarely like those of Microsoft’s Productivity Future Vision or the testing conditions. In fact, by these rules and in line with critical practice, it might be easier to say that there is no average user. What is the purpose of this kind of ‘messy’ design fiction? If we return to the notion that all design is speculative in the sense that it assumes certain conditions of its users and deployment, then these types of fictions invite a more rigorous engagement with the uncertainty of future conditions. While Microsoft’s visions assume a world of functioning social structures, nuclear families and stable infrastructure, critical design fictions (sometimes under the more specific description of ‘speculative design’) use their devices, interfaces and technologies as windows into other ways of living, exploring the cracks, flaws and breakages in the everyday experience of technology. Comprehending, considering, provoking and predicting futures are a cornerstone of contemporary design practice. It’s the responsibility of the designer to consider the context in which their work is deployed. This allows the designer and user to exist in a dialogue, thinking through the future considerations in which an interface or technology will be lived with.

Design imaginaries

One concept that relates strongly to design fiction is that of imaginaries. This concept, borrowed from sociology, refers to the shared mental model that people possess for complex and often ephemeral


things. A good example here is artificial intelligence (AI). This catch-all term means many things to many people, ranging from discrete applications of machine learning that power fairly prosaic systems of digital recommendation, to the manifestations of sentient intelligence seen in Hollywood science-fiction films.

These imaginaries are an important design material in the crafting of design fictions. The possibility space proposed within any design fiction rests upon the shared mental models of its audience. What this means in practice is that you can use a range of imaginaries to develop a coherent speculation about a new technology that on its face seems implausible, but is scaffolded by the intersecting imaginaries within the design fiction as a whole. For example, the trope of an AI that seeks to somehow ascend or dominate humanity is well known in science fiction. This trope can be used to explore alternative notions of AI ascension and domination, as in Lawrence Lek’s 2017 Geomancer, in which an AI simply wishes to ascend by becoming a great artist.

These imaginaries also have important consequences for the design of complex new technologies whose conventions have yet to be established amongst a significant base of users. This is again where design fiction and imaginaries intersect. For example, with every new input device – mouse, touchscreen, gesture tracking – the heuristics around these devices can be demonstrated in simulated use in their intended future context, which in turn goes on to define future imaginaries and user applications. Demonstrations of technology have a long history of their own, and it is worth considering the place of imaginaries in both the most famous technology demonstration of all, and in something much more recent from the canon of design fiction.
The most famous technology demonstration is perhaps that given at the 1968 Association for Computing Machinery / Institute of Electrical and Electronics Engineers (ACM/IEEE) Computer Society Fall Joint Computer Conference in San Francisco, at which Douglas Engelbart demonstrated the graphical user interface and the computer mouse. This demonstration has become known as the ‘mother of all demos’ (Levy, 1994), as it both foreshadowed the personal computer and, importantly, defined what technology demonstrations could do. The well-choreographed demonstration of what was essentially a mouse and an early graphical user interface (GUI) was designed to elicit in the audience the idea that computers could be used for more than just scientific applications and could one day be used by anyone. In 1968, most people had few reasons to think that an individual could find a use for, or operate, technically complicated computational processes. Engelbart’s demonstration created a new imaginary that took the next twenty years to be fully realized. Without the shared imaginary of what a personal computer could be, that development may never have happened.


Technology demonstrations today, such as Apple’s, are designed and choreographed similarly to corporate design fictions. The aim is to convince the audience of the vision and innovation of the company, as well as to establish ambitions for users. However, as we have explored, design fiction can also be used critically, to unpick popular imaginaries.

Design studio Superflux’s Uninvited Guests (2015) is a design fiction that aims to critique popularly held imaginaries around the Internet of Things. The Internet of Things (IoT) is an ecosystem of internet-connected objects and devices that may be automated or controlled through the internet, such as fridges, thermostats or even cars. Uninvited Guests introduces an older man who has been gifted several objects by his children, who hope to monitor his health: a walking stick that counts and ‘shares’ his steps, a fork that monitors his diet and a bed that monitors his sleep patterns. This reflects a commonly promoted use case for these types of devices: care for distant relatives, children or those in need of assistance. However, Uninvited Guests seeks to critique the underlying issues of surveillance and responsibility present in imaginaries of the Internet of Things by showing the man ignoring the devices and finally subverting them through deception. He weights his bed down with books to fool the device into thinking he is sleeping, and pushes the fork around a plate of salad to simulate a healthy diet.

Figure 6.7

Superflux’s Uninvited Guests (2015). Copyright Superflux. Uninvited Guests, by Superflux, challenges imaginaries around the Internet of Things. In the scenario, an older man who lives alone is given connected objects by his children, in order for them to monitor his health and lifestyle. Though ostensibly meant for the purposes of care, the project critiques the notions of surveillance and responsibility present in the popular imaginary of the Internet of Things.


Of particular note here is the design strategy that Superflux employ to communicate with their audience. In technology demonstrations and corporate design fiction, the aesthetics are slick and perfected, conforming to expectations built by science fiction and pop culture, aimed at foreclosing critique or questions and convincing us of the vision. In Superflux’s film, the devices are intentionally poorly rendered. They are bright yellow, displaying no technical functioning such as lights, screens or other technical components. Superflux seek to highlight these as diegetic objects so that we can move our imagination past the objects and their interactions and into the world of the character. It is clear from the film that there is no intention for the objects to be read as propositions; they are stand-ins for a conversation about devices like them, and about the imaginaries we have of the Internet of Things.

Deviant interfaces

As discussed, one of the opportunities that future-focused design approaches allow for is examination of the specific contexts in which design is deployed. If we consider again the future visions of Microsoft and other mainstream product companies, they invariably make certain sets of assumptions about the society and culture in which their design will exist: the home will be clean, functioning, spacious and well connected; the workplace will be intellectually challenging, creative and liberating. We explored how these types of assumptions impact design in Chapter 2. These ideal conditions reflect the aspirations of the product and the designers, but rarely reflect the reality of the way that things work. Science-fiction author William Gibson’s famous maxim, ‘The street finds its own uses for things – uses the manufacturers never imagined’ (Gibson, 2012), describes the way in which technology assimilates into the world once it has left the laboratory, studio or showroom.

The potential deviance of technology – broadly speaking, how far it deviates from its intended purpose – is vast. As devices and interfaces become more advanced, networked and adaptable, their potential uses and misuses multiply. Some of these uses may be benign. For example, accompanying the cryptocurrency boom of 2017 was a sharp increase in demand for, and the cost of, graphics processing units (GPUs), developed and usually used for PC gaming. Due to their computational structure, GPUs are particularly adept at mining cryptocurrencies compared to a normal CPU (Field, 2018). At the other end of the spectrum is intentional misuse and abuse, as previously discussed in relation to Facebook and Cambridge Analytica.


The use of social networks for what might be considered deviant purposes is explored in Chapters 2 and 3, where we look at how social networks are used to organize protests or disaster relief when other infrastructures are unreliable. We also explored the ethical dimensions of how devices can be misused in Chapter 4. Analysts recognize the value of encrypted chat apps and platforms in allowing for freedom of information or movement in repressive nations, but equally recognize how these interfaces can enable deviant criminal behaviour (Romanosky, et al., 2015). Where these interfaces are not intentionally designed for certain use cases, or are misused, the designer needs to consider their responsibility in providing opportunity while limiting destructive or exploitative misuse.

As previously mentioned, Internet of Things devices, which generally have very poor security, are now widely used as nodes in distributed denial-of-service (DDoS) attacks. DDoS attacks paralyze servers or devices by overloading them with requests for access, much as a website goes down when too many people try to access it at once. According to Corero Security, 2017 saw a 91 per cent rise in the number of DDoS attacks due to the proliferation of unsecured Internet of Things devices that could be used as nodes to make requests on servers (Rayome, 2017). The potential for misuse en masse of an exciting technology like the Internet of Things demands discussion of how it might be used maliciously or deviantly.

Equally, the uncertainty around future technologies still in development can lead to malevolent deviance, where the public or clients are intentionally misled, swept up in the excitement of a convincing future vision and the uncertainty of its realizability. To see this in action, it is worth considering the case of the now defunct medical device company Theranos.
Theranos received hundreds of millions of dollars of investment for its blood collection vessel, the ‘Nanotainer’, which promised to revolutionize blood testing. Theranos used convincing pitches, keynotes and conferences to demo the device. However, the device turned out not to work as promised; the company collapsed and its founders were charged with fraud (Bilton, 2016). The case highlights the intersection of a popular imaginary about innovation in medical science, convincing future visions and the way the uncertainty of complex technologies can entice us into a narrative.

The pressure to innovate and develop new technologies and services can often lead to misleading yet convincing visions. The video game No Man’s Sky, which used an innovative system of procedural generation to create a near-infinite universe to explore, was announced by Sony in 2014 to a rapturous reception. What followed was two years of intense hype around the product, putting the small team of developers under intense pressure to live up to the public’s expectations. When the game was released without many of the promised features, the


team received death threats and abuse from angry fans (MacDonald, 2018). The development process was no different to that of other games, but the added pressure of huge public scrutiny changed the imaginary and drove many journalists and fans to accuse the company of deception.

The future is an exciting place for interface designers to work in. The ways in which technology has expanded in recent years alone are astounding, with the realization of interfaces previously confined to science fiction, like gestural controls, virtual reality and the Internet of Things. However, designers need to recognize the power that cultural imaginaries and narratives have in directing these developments. Designers should take time to utilize techniques like design fiction in order to explore alternatives, provoke debate about popular assumptions and innovate beyond the sometimes exploitative imaginaries that have been entrenched by pop culture and science fiction.


Interviews


Anab Jain

Anab Jain is a co-founder of Superflux, a London-based design studio that creates worlds, stories and tools that provoke and inspire us to engage with the precarity of our rapidly changing world. The studio works with a range of clients across design, policy, education and the cultural sector to engage audiences in difficult conversations about the future.

Tobias Revell: What is an interface design?

Anab Jain: I think there is a functional definition within interaction and user experience design of what an interface is. It’s a valid definition about how well I can use my phone, or a website, or a kiosk or a ticket vending machine. But I also wonder if we could open up that term, to think about an interface also as a provocation. What if an interface has been designed to provoke me to think differently about the world that lies in front of me? I think it would be far more interesting if interfaces provoked us to think differently about our surroundings. An interface can be a point at which you start seeing things in a different way – in new ways; in engaging and telling ways. But also, in ways that might make you uncomfortable. In our project, Song of the Machine, we were thinking about how the human body could be redefined to better interact with the technology, rather than the technology being designed to address the human body or human needs. The interface in the Song of the Machine helps create a dialogue between the human being and the underlying technology. So,


with the Song of the Machine we were working with a technology that enabled you to see parts of the electromagnetic spectrum that aren’t usually visible to the naked eye.

Figure 7.1

Superflux’s Song of the Machine (2011). Copyright Superflux. Song of the Machine by Superflux speculated on a future headset that allowed the user to see normally invisible parts of the electromagnetic spectrum, enabled by injecting a virus into the eye.

This was achieved by injecting a virus into the test subject’s eye, similar to the process involved in gene therapy. Basically, you inject the virus into the eye, modifying the body so that when the person puts on the headset, they see the world differently. The body was modified to work with the technology rather than the other way around. Those kinds of interfaces, those ways of challenging the human body or the human mind, are what I find interesting. Song of the Machine was about creating a provocation about what we take for granted about the human body, and what we take for granted about how technology should be at the service of the human. Normally, in user experience design everything is seamless, and the interface is so well designed that everything is hidden and you get no sense of how the system works. I’d like to see the opposite of that.

TR: How do you think interfaces affect and shape human behaviour?

AJ: Are the interfaces of Facebook and Instagram changing how I behave?


Yes, over a period of time we’ve all become victims of these seductive systems that require constant adulation from groups of people in order to legitimize who we are. So, in that sense, these interfaces are changing human behaviour. But that is user experience design; it directs users to certain types of behaviour, and designers create constraints to steer that behaviour. You need those constraints. I mean, you want to be able to navigate through an airport and find your flight, so the designers create constraints that must work to do that. But by creating these constraints you remove agency. Agency is my individual capacity to have some autonomy over my decisions, over my thinking, or in fact even to expand my own thinking and my ability to see things and be.

TR: What do you see as the importance of imagining and working with the future?

AJ: I think it’s just thinking about time differently, isn’t it? It is mostly about acknowledging that your life is going to continue forward into time and the future is not something detached from you that will happen ‘one day’. The next moment is also future. Whatever you’re designing will have a consequence and an effect in time. For me, the value is in designing not for a fixed, rigid, solidified artefact in a moment of time but for something that stretches beyond that fixed moment and changes the future. For example, today we are living with the consequences of decisions made in the twentieth century about the design of the car. I’m not suggesting that every object that designers are designing today will have the scale of impact of the car. That said, the design world is caught up in churning out products that change society without consideration of the consequences. I think it’s very important that when designers leave university and leave education, they understand the weight of their responsibility as designers and the consequences and impact of their designs.
Thinking about the future is not an abstraction; we need to consider the consequences of our practices.

TR: What design strategies do you deploy to help people imagine other futures?

AJ: I think one of the big things in our work, which we don’t talk about, is fidelity. For example, with the Song of the Machine, the fidelity of the interface was integral to the outcome. The fidelity of the design matters. It matters to the intention of the work, to whom we are talking and what


change we want to effect. But in our work, we have to be careful not to make anything too polished, or it becomes an object to be consumed and can’t be a useful tool for imagination. If it is somewhere between a finished product and a sketch, it is compelling enough for people to suspend disbelief and imagine with, but avoids being appropriated as an idea for the design of a consumer product. I think we have a lot to blame on science fiction. I didn’t really grow up with the science fiction that is quite popular in the West. I think that the power of imagination is in showing different, possible futures. In a way, it is actually an interface between the brute facts of everyday life and what could be. Any imagination about the future is not saying that that is the future. It’s just asking people for a reflection on a possibility that would not happen otherwise.

TR: What advice would you give to designers currently entering the practice?

AJ: If we look at what’s happening politically, economically, but most importantly with climate change, I feel like we are on a slide heading down, and the upward climb is going to be really quite hard. I don’t want to put a big level of responsibility on young designers, but at the same time, they’re our hope. It’s important that they are critical but flexible. Don’t get fixated on an ideological position. Our models of reality are ideological models, but they can make us stubborn, and we need to develop ways to keep our integrity as practitioners while continually moving forward and being aware of how we need to adapt to our changing environmental and economic conditions.

Dan Lockton

Dan Lockton is Assistant Professor and Chair of Design Studies at Carnegie Mellon University School of Design in Pittsburgh, PA, and founder and director of the Imaginaries Lab. Dan is interested in questions of how we interpret, imagine and interact with the world – institutions, the environment, cities, infrastructures, technologies and complex systems around us – how they, in turn, model us, and what the consequences could be for design, which seeks to enable human agency as part of a transition to more sustainable futures. He is currently writing a practitioner-focused book on these topics, Design with Intent. Dan is also a Faculty Affiliate of Carnegie Mellon’s Scott Institute for Energy Innovation, and a visiting tutor at the Royal College of Art, London.

John Fass From your perspective and the work that you do, what do you think living with digital interfaces involves? What are the most significant effects? Dan Lockton The biggest thing that stands out for me is the feeling that there’s this huge continuous screen or world of stuff that is happening; that there’s a responsibility to pay attention to it, because some of it is directed at you. But equally, you can’t possibly deal with all of it. It’s not really possible to step back from it and imagine what it would be like otherwise. You’re continually connected to it, but you can only ever see glimpses of it. Once you’ve put in the commitment to try to deal with it in some way, you feel more drawn into it. I remember as a teenager phoning into a radio station because you could get to be on the radio. It feels like a system that is continuously running and which finds its way into every part of our lives even though we can’t directly see all of it.

There are bits of it that are experienced as a wave or a continuous flood of information you really need to pay attention to and deal with. That’s a major characteristic of it. I know you could see it as context collapse, but it’s also just the fact that it’s all mixed up. I had a student who did an experiment where he decided to put masking tape over his phone for a week. He got one of those wide rolls of masking tape and put it over the screen. He could see that there were messages, but he couldn’t read them. He could see dimly through it, and see what it did to his life. He called the project FOMO in the end. He felt like he was missing out. People would be talking about things he had no idea of. They’d be referring to stuff, even just conversations they’d had. He felt that it was just not possible to disengage from or ignore this continuous digital life. JF I feel like attempts by digital interfaces to handle that, by infinitely slicing the pie – so that I have my pictures on my Instagram and I have relatives on Facebook, and I have something different on WhatsApp and again on Twitter and Snapchat – that doesn’t seem to make it easier. DL It doesn’t enable the kind of prioritization you’d hope for. I remember when I used to check my emails once a week. I wonder whether it’s possible to live like that in a professional context or whether people develop interesting routines or self-erected ways of managing it. I don’t know. The biggest characteristic for me is it’s there all the time, whether or not you pay attention to it, and it makes you pay attention to it. None of this stuff is static; it’s continuously running and developing and changing so that it requires attention. It’s not like a book, where you can just say ‘well, I’m not going to read that right now’ or ‘I’ll put it in the stack in the corner and look at it at some point’. The next time you look at it, it’ll be different. So, what did you miss? I think it’s just the pervasive aspect that’s important as well.
Maybe it’s just the digital approaches that have enabled this to happen. It’s as if there were millions of radio stations and millions of channels, and your friends are in them. The digital approach has enabled it in a way that wasn’t possible before. JF Maybe in a different way, I’ve been reflecting on this while writing this book, and one of the dominant paradigms seems to be the list, the scroll. The enforced repetition, here’s an item, here’s another one, here’s another one.

DL You know the idea of a feed, which is a horrible metaphor, isn’t it? Does it mean you’re actually being fed by it? You’re being sustained by this thing or is it more like force feeding of something or is it like animals in a feed lot or a factory farm? You eat it whether or not you like it because that’s what’s forced down your throat. People use phrases such as, ‘This was in my feed’ or, ‘Get out of my feed’ or, ‘I don’t want to see this in my feed’. That displays an inevitability, I’m obviously having to consume this feed. So, if something’s in it that I don’t like, the only thing I can do is complain about it – I can’t switch off the feed. JF Do you think there’s something about the design of those interfaces that very consciously created this kind of dependency? DL I think it would be rare for the designers to talk about creating dependency, but if you look at the decisions made, they would frame it in terms of engagement, or flip-throughs, or increase the amount of engagement. Former Facebook or Google developers or designers have said the priority is with the little red dot or however many unread notifications we need to continually make it look like there’s unread things. So, we’ll give you an unread notification that it’s been two years since you last did something, so they will create opportunities to continuously drive and reinvent. I think it’s definitely design decisions that are made, and some of it is like lists, structure – a perpetual set of endless to do lists almost. Or, even just the idea of endless scroll. When that first appeared, the idea was that this thing went on forever, you’re never going to get to the end of it. Because it’s baiting at both ends of the list as you do it, that type of thing – that is a design decision isn’t it? There are other ways you could do that. I feel like there are probably better structures for curating information, obviously more than the dominant feed-type model. 
But they may not be as exciting or engaging to people. JF It’s been an interesting one to think about from a design perspective. There seems to be some evidence that the behavioural effects of some of these interfaces might be related to our abilities to attend to things – attention span – and we see this in, for example, the death of long-form journalism. Do you think there are other ways that these interfaces shape our behaviour?

DL I think in some cases the structure of digital interfaces prioritizes certain ways of thinking. The obvious one is the length of tweets or status updates – it means that you have to look at forms of thought that can be distilled into small units. I think that definitely affects the structure of the way people think, or the way people feel they have to think. Car crashes or accidents seem to be popular with people on the internet; you know they’ll click on a story that’s about something really bad, therefore the system optimizes to show things that are really bad. The internet seems to want car crashes, so car crashes are what we’re given. And I think that’s a big part of this, having any kind of algorithm that uses popularity, or assumes popularity based on something being clicked. That’s obviously going to favour sensational things, or things that are shocking, or things that are intended to rile people up or to be controversial, to upset them or to offer ten weird tricks. ‘There’s one weird trick that doctors don’t want you to know’, or ‘the top ten reasons – number two will shock you’. That format of clickbait, that’s an optimization based on designed systems that prioritize that sort of content, so it’s almost like the content becomes optimized to what the system seems to reward. That’s the result of the design decisions. JF We’ve spoken a bit about metaphors in other conversations, but in some way it strikes me that a lot of the metaphors at play in digital interfaces have been smuggled past in a way. This idea of lists and storage and updating; even the most basic ones – menu, navigation, buttons – they don’t present themselves as metaphorical representations at all. And they seem to have become so acculturated they pass without comment, without notice. DL Well they’re a different thing, the idea of a window. No one who’s using it is thinking of it as a metaphor for a window in the physical world, so it’s no longer really a window. It is, but it’s not.
The affordances of it are not questioned, maybe, or not revealed. For example, the metaphor of following someone is not really questioned. If you were starting from scratch and had no concept of what the practices of social media were or how you did it, would you feel the need to ask for someone’s permission to follow them? It’s a bit weird in the real world; if you literally just followed someone it’s the same as stalking. Imagine if someone said I’m going to stand outside your house and listen through the walls to everything you say because I like your opinions. We’ve created and adopted some metaphors without really
teasing out what it means. Follow hasn’t always meant that; I guess it means more like a disciple maybe, sort of people following people because they’re enthralled to someone’s opinions. I don’t feel we’ve had enough alternative approaches. Maybe it’s just because I’m intrigued by unusual interface design, but I feel it’s sad how quickly we converged on basically the same structure for almost all digital media. We’ve converged very quickly on one particular way of doing interface design that is optimized for addiction, or endless anxiety-producing things. I must pay attention to everything – what happens if I don’t? What if people are talking about me? It’s like that. JF What do you think about interfaces dematerializing even more and becoming invisible? I would never have a smart assistant in my house, but lots of people do and some of them even have all of them across different brands. And I met someone the other day who’s got one in every room, and that just seems to open up a whole level of interactions where it’s not even very apparent to the senses that you’re using an interface. DL I’d like to be able to get something like an Amazon Echo but with all the processing in it, without connecting to anything else, without sending any data. I’m not against voice recognition systems, but I want them to be trained in the factory and then left at a level that’s good enough. Or I want them to be able to train themselves without any data going anywhere else. So I’m not against the idea of it, I just want to be able to know what’s happening with it. With the invisibility of it, it’s surprising that people don’t question it more – you can’t see what’s happening. You can’t see the ways in which your data is being used.
Do you think that would have an effect if that was more visible? DL I’d like to say yes, people would say, ‘Oh wow, here’s the companies that are getting the data from me, this is now why I receive a load of adverts for this thing.’ Because you happen to mention this particular product and this is what’s happening, and this is how much they’re paying for
your data. I don’t know, I think making that visible would affect some people very deeply and it would have some effect, but I don’t think it would stop everyone doing it. In the same way, what proportion of people have ad blockers for example? I don’t know, 20 per cent maybe. I feel like it would probably be about that level, it would be people who are interested enough to explore what it is, they don’t want to break it open and understand all of it. I think visualizing it would be useful anyway. I think there’s a value in showing the stuff that’s going on in those systems whether or not it actually causes a major change in people’s interactions with it. It seems irresponsible not to try and visualize it. JF Picking up on that word – ‘responsible, responsibility’ – how do you think the responsibilities of interface designers should evolve? DL There’s always the question of what’s realistic for designers to do within organizations where they’re not led by the designers in most cases, very rarely. They’re not led by ethicists either. This is a big question because there are so many of these ethics checklists now, which have been mostly well-intentioned, and I think they’re probably pretty good. But on the other hand, at what point in the design process is that supposed to happen? All these deadlines, this team is doing that, at what point do you bring in your ethics checklist and say, ‘Well we shouldn’t have done that’? So, I think designers have a huge responsibility to do this stuff better, but they’ve been trying to work out when the appropriate moment is. Because it almost seems like it needs to work at the level that’s above the designers in most organizations. A company needs to decide – well our priorities are not going to be this, they’re going to be this instead. Or, ‘Our business model is not going to be driven by how many clicks’, or advertising based on number of clicks or invasion in that way. We’re going to find an alternative way of doing things. 
So, I feel that the designers have got a huge responsibility. There’s a set of questions that I’ve been trying to develop as an alternative to a checklist because it’s very difficult to know the answers. You could imagine this as a set of best practices, or an ethically developed design process. There’s been some really interesting work on that, where people have said, ‘Well don’t ever have a feature that does this, do this instead.’ Or, ‘Here’s a better approach.’ The way I’ve been trying to approach this is by asking a set of questions that designers can ask – that’s not to say it’s a yes or no answer. They’re slightly higher-level questions, so things like, ‘Will this feature that we’re introducing, is it going to really enforce an existing
power relation? And is that a good thing?’ It’s those sorts of questions that I think are probably too high level, they’re not just abstract things to ask. Will our service make it easier for designers to overcome structuring in society? They’re too high level for actual interaction design, I think. They were based on looking at different ethical frameworks and trying to extract things in the form of questions. Some of them are based on actual questions from philosophical ethics. Do we want everyone to use the product in the same way? Do we decide, as designers, who we would want to use it in one way and who we wouldn’t? What would the criteria be? Or, are we as designers also going to be using the product in the same way we’re expecting other people to? Would we want our friends and family to? Why or why not? Well, this is for others, not for us, of course. We know that it’s those sorts of questions, but they’re not really at the level of designing an actual interaction and more at the strategic level of, ‘Should we even make this product? Should we even do this thing?’ What would daily life look like if this product were part of it? Will people actually incorporate it into their lives? That was based on a question on how the introduction of microwaves changed family dinners.

Mushon Zer-Aviv

Mushon Zer-Aviv is a designer, an educator and a media activist based in New York and Tel Aviv. In his work, he explores the boundaries of interface and the biases of techno-culture as they are redrawn through politics, design and networks. Among Mushon’s collaborations, he is the co-founder of Shual.com – a foxy design studio; YouAreNotHere.org – a tour of Gaza through the streets of Tel Aviv; Kriegspiel – a computer game version of the Situationist Game of War; the Turing Normalizing Machine – exploring algorithmic prejudice; Alef – the open-source multiscript font; Collaborative-Futures.org – a collaboratively authored book; and multiple government transparency and civic participation initiatives with the Public Knowledge Workshop. Mushon is also in charge of map design at Waze.com. Mushon is an honorary resident at Eyebeam – an art and technology centre in New York. He teaches digital media as a faculty member of Shenkar School of Engineering and Design.

John Fass How do you consider the effect of the interface of what you’re doing? Mushon Zer-Aviv I think there’s a certain idea of discovery, and I would say the curiosity and discovery that we’re trying to kind of embed within the experience. We analyse a lot of layers of open data and we try to accumulate that on a building-by-building basis, even apartment-by-apartment. Every apartment in New York has a set of insights that we expose in our product. So, if an apartment is for sale or rent, you can learn how it would feel to live in this place. As someone who’s very interested in the sector of technology and
data, about the perception of cities and the perception of the future, this kind of challenge is very interesting for me. I came to it with a lot of scepticism, which I maintain, and this is something that I’m trying to embed into the design work. So, it really depends on what you’re looking for. If you’re a family with children or you’re single, or you’re a couple, or you’re empty-nesters, it really changes how you see different places and the different characteristics of these places. At the same time, one of the biggest challenges for digital interfaces, specifically ones that are very much data driven and future oriented, is to present the future as a set of possibilities, rather than a pre-deterministic prediction. JF It seems as if the interface of the system you’re working on reveals a lot of information about properties. Do you think there are things that the interface also conceals? MZ All the work we do with data is defined by what we can quantify, and the data exposes a set of signals that we can grasp. Missing data sets are not present; they’re not a part of the story. And that is a system level problem within these types of products. There are so many things that we cannot quantify [for example], if it’s an old apartment and it hasn’t been renovated recently, if the layout kind of sucks, if it’s very noisy. But the challenge here would be to say: ‘We don’t know the truth. It’s not like we have a whole complete vision of reality in which we’re just presenting you with an interface to that. We’ve just run some numbers that you might find useful.’ For me, this is a huge challenge in the context of working in a commercial concern. That’s somewhere I would like to get better, but the culture of digital interfaces, and specifically data-driven interfaces, is definitely not there. It’s definitely one of complete control and ultimate authority and passivity from the client’s side.
And I don’t just see that as an ethical or philosophical challenge, I see that as a business challenge for this company. JF There are still interfaces that very often depend on some long-established metaphors for how to use them and how to proceed through them. For example, the metaphor of a map, the metaphor of a button, the metaphor of progression through a system – all these kinds of things. Have you put any thought into how you deploy those metaphors? And, what do you think the effect is of the ones that you use?

MZ One of the things that is very present in our product is the idea of layers on top of the map, the invisible layers. So, there are the basic layers of the map that represent more concrete, visible elements like roads, buildings, water bodies, landmarks and so on. Then there are the more dynamic layers that we keep on loading on top of the map. To that we add a lot of text in our interfaces. It’s always a mix of a map and text, and the texts try to be as colloquial as possible, so we attempt to speak about our places. Data is part of culture and a product of language, and rather than beautiful evidence as this kind of concrete, external proof of reality – you-can’t-argue-with-the-evidence – I would take it the other way around. I would say this is a beautiful argument. And it might be beautiful or not, but it’s an argument. It’s a visual argument and by attaching the visual layers of the map to the textual and colloquial language, the attempt is to say: ‘This is what we want to ask about.’ And there are also dynamics that might change. The buildings might change; we represent that, too. But there’s something playful about speaking about place and speaking both with words and with graphic layers on top of the map. So that is my attempt to respond to this idea of evidence rather than argument. JF It seems from what you’re saying that you would bring about on this interface, perhaps, a new type of syntax that connects colloquial textual annotation to perhaps a grid-like representation of the New York streets. If the language is colloquial, that would have some sort of familiarity, I imagine, with your users. At the same time, you have two levels of abstraction: a visual and a linguistic, and the interface is required to integrate those two levels of abstraction. Do you think that works? And how did you get there? What was necessary to think about? MZ Do I think it works? It’s very hard to measure, in a company that is quite obsessed with measurement.
I think what we tried to achieve, at least in the initial proof of concept, was to keep the whole product mobile-based, and to swipe it text by text. And the text would be not more than two sentences and with a very clear map layer. In what we’re developing now and what we’ve started, there is something that is much more integrated. The kind of knowledge and the insights are much more integrated into the page. The map is still very integral to the experience, but the interface is more dynamic and not just one type of interaction.

JF How do you think your interface specifically shapes the behaviours of the people who use it? MZ I think what our interface does is try to do one of the things that people speak about when they speak about this interface. The experience is that they feel more confident about the decisions that they make, and they feel smarter about a process that they often don’t feel as powerful in. I think we’re getting to a point of plateau in machine learning and artificial intelligence where there are lots of ‘The machines are coming!’ articles about, ‘Oh, you wouldn’t believe what this neural network just did and how amazing it is.’ These have become such popular ‘clickbaity’ articles, but at the same time, they all embed this idea of a very unified progress towards less human agency and more machine agency. I think that things are different when we put aside these colourful articles and look at the actual products that we use on a daily basis to make decisions that affect our lives, rather than, say, allow a machine to decide how many rabbits it sees in an image. We see that there’s a plateau on deciding whether these artificial intelligence mechanisms are a filter or inside the engines, and they get to the point that there’s only so much they can tell you, especially when we talk about prediction algorithms. So, the potential is much bigger, because I can tell you that the school you’re looking at, which is zoned for the apartment that you are looking at, is actually struggling in comparison to other schools, but it actually has a new leadership, and we’ve seen other examples of other schools that are struggling as well. We can’t actually digest all of that for you, because it’s not a cookie-cutter kind of question, but we can formulate this question for you in the sense of keeping the interface colloquial. We can’t tell you if it’s good or bad.
We’re not going to skew a kind of result for you, but we are going to tell it to you trusting what is important to you. And maybe it’s a question you’ve never asked before. Maybe it’s a question that you actually didn’t want to ask, you know, because it means that you have the responsibility for your children’s education, not something that we’re going to solve for you. JF What do you think about the digital interface designers of the future? What are they going to be doing? What skills do you think will be necessary?

MZ I think designers should be much more involved with the set of affordances that the technology, and specifically data-driven technology, provides. If design is very much interested in problem-solving, I see design as an opportunity to better formulate the problem, not only provide a kind of a unified solution. It seems like too much of technology is trying to become this ultimate fill-in-the-blank solution, rather than an opportunity to get a bigger perspective on where we are and what’s ahead of us. I have this love/hate relationship with data because I feel like there’s so much we can get out of a healthier relationship with data-driven interfaces. We often kind of stop at the low-hanging fruits that are not even very healthy for us. They’re kind of rotten, actually, most of them. I see a lot of potential, and I think most of it is about the different use of technology rather than new technological advancement. I do need to add something. I almost sound like a salesperson for what we’re trying to do with my design team in my company, but this is such a huge struggle because of the things that I’ve defined as the challenges: deterministic technology versus technology as a tool to explore possibilities. So, the challenges are very interesting and indeed very challenging, and it gets me very excited about what I’m doing because I feel a much deeper kind of intellectual integrity towards the kind of change that I want to see. I don’t have the blessing or the curse of just saying what I want. I have the blessing and the curse of actually trying to find out how – how can something like that come about? So yeah, that’s where I’m at.

Sarah Gold

Sarah is a designer interested in privacy, security and systems change. She creates interventions that show how technology can respect more of our rights. She founded IF (projectsbyif.com) – a technology studio specializing in ethical and practical uses of data, which has completed work for Google, the Co-op, Bulb and others.

Ben Stopher So, can I ask you how you consider interfaces in your organization’s practice? Sarah Gold Interfaces are really at the heart of what we do, and that’s because they are a prime place to communicate what’s going on with data in a service. You can use interfaces to tell a user of that service what’s going on throughout the stack. I think it’s also an important part of our practice, because we challenge the kind of interface design that’s done today. Partly that’s because of this focus that we’ve seen on seamless user experiences, whereas we think that, increasingly, to have services that people are able to trust we need to reintroduce things like seams and friction into certain parts of the interface, to make sure that people have the opportunities they need to understand or to hold a service to account in some way. We like to spot patterns across interfaces, too. One of the first public-facing pieces of work that we did at IF was to create the patterns catalogue, which is a collection of data patterns where you can see repeatable solutions to common problems emerging across the design of services and products. I think there’s also a big pull from designers
and developers to understand what kinds of interface patterns they could use to explain data sharing in ways that are more conducive to user control and agency. BS In that answer you sort of alluded to the fact that you consider interfaces potentially an unequal point of contact between product services and people. Is that fair? SG I think when you look at certainly where or how power has shifted in society, you now have companies that self-identify as tech companies which hold a great deal of power where perhaps that power sat elsewhere before. And their main connection with people has been through digital interfaces. And I think that kind of complexity and power that sits with those big organizations is something that has been shifted to individuals to understand, for example, through things like terms and conditions. The way that the system works has been so obfuscated from the user that there is a kind of unfair power dynamic at play, because I don’t think at the moment that we are designing interfaces that truly care for people, or that really help them to understand complexity within something that they’re trying to do to give them agency. BS In terms of how designers working on interfaces understand that, I think there’s a bit of a gap there. I think that design as a discipline hasn’t sufficiently kept up with the agency of the systems it’s producing. Would you agree with that? And if so, how do you or IF get into that? SG I think that designers today [often] do not have enough technical intuition to understand which parts of the system may need to be explained or understood by different actors within a much bigger system, thinking about other actors like civil society organizations or journalists, not just individual users. I think there’s a knowledge gap, which is a technical intuition one. I think it’s more about critical thinking about digital.
And I think the other piece of that is that the organizations that can attract the top design or digital talent and then retain that talent over time, because frankly, they can pay the biggest salaries, also have an internal culture that means product teams find it harder to question their own thinking or look at society more broadly.

So, I think that that’s also problematic at the moment – that we don’t have enough good work to point at, where there have been truly multidisciplinary teams working on issues at a society level, to say, ‘This is really great design that has both commercial benefit and public interest at heart.’ So I think that that’s hard if you’re a designer coming through, whether that be university or an apprenticeship – where do you take great work to copy? BS The ecosystem of digital interface design is a consumer-centric one, yet a lot of what’s being produced in some of these systems is [impactful for] civic and social relationships. So the ethos: if you think in old money, you had public service and government and you have the commercial sector, and they’re differently motivated. And in digital, in italics, you get the complements of the two, but it’s led by the money. And I think that sort of tells you why some of the things you’re saying are true currently. SG Absolutely. And I think if we are to move beyond that, we try to ask ourselves or understand in any project that we do where power might sit and to what or whom that power serves. And so, what we’re looking for in design at IF is this question of: does what we are designing, or does this interface that we’re designing, if that’s what we’re doing, demonstrate care for people? And I think that’s a really important litmus test for us at IF about whether the work we’re doing is field-building, excellent work. There are a bunch of other principles that we’ve set ourselves to try and move the designers here to question work in that slightly different way. And I think the other key piece of that is that we’re always thinking beyond individuals. We’re trying to look at how what we do can create services that meet the realities of fair relationships with other people today, both in terms of people they might be responsible for.
For instance, whether that be a child or an elderly grandparent, for instance, but also the relationships that we have today with institutions and companies, too. I think that that has always been a helpful way for us to think about designing not just for individual user needs but for other kinds of needs too. Some of that comes down to the need for explainability or the need for accountability. And I don’t think that any of those needs that society hasn’t ever needed those before. I don’t think these are needs that are particularly new, but I think they are transformed in a digital age, because technology can help us to provide those things in different ways.

Sarah Gold


BS Given this, how do you design for genuine user need? What is a bit of IF practice that really helps you do that?

SG To design for where we see genuine user needs that are currently unmet, we always look beyond individual needs. We think about the system more broadly, so we’re trying to find ways of designing for the multiple relationships that people have today – relationships where someone may rely on someone else to trust something, or where you might delegate permission to someone else to do something on your behalf. These are the kinds of relationships we have with other people, and we think services should be designed in a way that enables those relationships still to be true and possible, so we break away from this kind of individualistic model. And I think it’s then that you can start to think about user needs being much more about community and togetherness. At IF, we try to push that through thinking beyond the individual and bringing in questions about how the service is held to account. When might a user of that system want some method of recourse? So we’re trying to tie some of those topics about discrimination, fairness and accountability into the service, so that those things happen at point of use. It’s not something that an individual has to do at another time. It’s trying to build those characteristics of how you might build trust into the needs that we’re identifying.

BS Given that, what issues in interface design do people often get wrong?

SG The first one, which ties into what you’ve just mentioned, is that I think interface design often focuses on the user always being an individual. What’s problematic there is that interface design doesn’t help people to step back to see wider society, to see the communities around them.
And I think that we are currently going down a route of hyper-personalization, and whilst that will have many benefits to particular users in particular situations with particular services, I think it has the potential to make it harder to do that stepping-back piece, to look more broadly. I think the other one to bring up is that there’s a focus on creating these kinds of seamless user experiences, where the quantitative research done on them is about how many users go through a particular flow at what particular speed, but what we don’t ask is: at what cost? I think that, increasingly, we need to show the seams in services, and Matt Jones [Google] has talked previously about designing beautiful seams, where the interface design can explain how a service works, which enables someone to then interrogate it, should they want to. And I think the other part of that is also knowing when slowing down, or friction, is a good thing.

BS What kind of culture does good interface design come from?

SG This is a really hard question, and I think we are seeing the answer to it emerging in different pockets. My first thought is certainly a multidisciplinary and diverse team. The importance of having diversity in teams is something we’ve heard a lot about, but I think it’s still such an unsolved problem that it would be a mistake not to continue talking about it and giving it the attention and airtime that it needs. That also comes with a culture that encourages discussion and different points of view, and I think that point-of-view piece is where you come into that multidisciplinary characteristic, because it’s drawing on different specialisms. Academia, I think, is one key area. We worked with the London School of Economics recently on a project, funded by the Open Society Foundations, looking at accountability and what explainability meant in automated decision-making services. That work was then picked up by Google AI, and we were commissioned to do a further sprint, because they could see the advantages of bringing academic practice and product design together to see what new ideas would come from that collaboration. Multidisciplinary teams are really important, and I think good interface design will only come from a place where we are also thinking about the most vulnerable users in society.

BS Based on your experience, what is your advice for future digital interface designers?

SG I think to understand and be critical of the power that you’re exercising and the things that you’re building.
And alongside that, consider which relationships you’re designing for – which you’re giving power to – and which relationships you’re not. And that’s not just between people, but also between people and institutions. The third piece would be that future digital interface design is a really exciting place to be working, and I’m full of hope for what’s possible. It’s also a place where there’s so much invention yet to be done, because interface design is not just about a screen. It won’t just be about buttons; it will be other forms of input, too, like haptics or voice. These kinds of things need to be designed for, too, and might be shown on interfaces that aren’t screens – that are, in fact, buildings, for instance. So I think that there’s a lot more creative thought in interface design to do, and that’s not going away. I think it’s a really influential place to be in for the future, as it is right now.


Glossary

Algorithm An automated set of instructions for processing data. These can be incredibly simple but tend to increase in complexity as the data sets being dealt with get larger.

Anthropomorphization The intentional or assumed attribution of human traits or behaviours to non-human things. Designers often use anthropomorphization to create familiarity and encourage attachment to designed artefacts.

Apparatus A term used to describe devices (dispositif) or mechanisms that enable and reinforce existing power structures within society. The focus is on the system of relations between discourses. For Michel Foucault, the apparatus referred to institutional mechanisms like universities or prisons, while Friedrich Kittler extended the term to technological devices. See: Michel Foucault, The Confessions of the Flesh; Gilles Deleuze, What is a Dispositif?; Friedrich Kittler, Gramophone, Film, Typewriter


Artificial Intelligence (AI) A broad concept embracing many fields of computation, including machine learning. It refers to the idea of computers or interfaces that behave like, and in some cases are indistinguishable from, humans.

Black box A term used to describe complex systems or devices whose internal mechanisms are hidden or obscured. Digital devices like personal computers, smartphones, etc. are black boxes. See: Bruno Latour, Pandora’s Hope

Blacklisting The censoring of websites or services. This can be done at multiple levels: by internet service providers, search engines, organizations’ local area networks or government policy.

Bluetooth A very common protocol and system for exchanging data wirelessly over a short distance. It is commonly used in audio interfaces for headphones and speakers.


Circular design An approach to design that considers the entire lifecycle of a product, from production to disposal and recycling. A derivation of ‘cradle to cradle’ in industrial design. See: Ellen MacArthur, The Circular Design Guide

Cognitive load The amount of time and effort dedicated to thinking about certain tasks or interactions.

Cookies Small packets of data downloaded to users’ devices when they access websites or services. They store information relating to user preferences and make load times faster if a user returns. Also an American word for a biscuit.

Critical practice Design or creative practice that aims to provoke and interrogate or is aware of its social, cultural and political context. Examples include critical and speculative design. See: Matt Malpass, Critical Design in Context

Cryptocurrency A type of currency that is generated by computational ‘mining’. Its value is calculated based on the difficulty of mining it and demand. In this sense, computation is a method of resource extraction. Examples include Bitcoin and Ether.

Cybernetics A philosophy and theory dominant in the early age of computation. It deals in the feedback loops between input, output and the resulting automation of processes.

Dark patterns Interactions that are deliberately designed to conceal or mislead users of digital interfaces, usually by exploiting users’ intuitive behaviour. These are prevalent across some of the most heavily used interfaces and systems of the Web.

Design fiction A design method that involves creating fictional scenarios or products to tell a story, ideate or invite debate. It differs from speculative design in that it is not necessarily critical. For example, companies can use design fiction to promote future products that are still in development.

Design patterns A concept pioneered by Christopher Alexander in 1977. It involves the use of recyclable, modular frameworks in software or design, which can be applied to different projects or products rather than bespoke designs for each separate project. See: Christopher Alexander, A Pattern Language

Diegetic prototype A fictional artefact designed to elicit a fictional scenario or world. Film props are diegetic in that they ‘work’ in the world of the film but not in the real world. They are often used in design fiction. See: David Kirby, The Future is Now

Distributed denial-of-service (DDOS) A malicious hack in which a device or server is bombarded with too many requests, forcing it to crash. Hackers will infect other devices, called ‘zombie nodes’, with malware in order to control them and direct them to attack their target.


End-user license agreement A common contract between a user and a company or service outlining the rights and terms of use for the product. They usually apply to software or apps at point of purchase and are often very extensive and rarely, if ever, read by users.

Fake news A recent phenomenon where misinformation is presented as legitimate fact by design, often on websites claiming to be legitimate news outlets. The structures of social media, geared towards sensationalism and engagement, often mean this spreads quickly and memetically.

Filter bubble A phenomenon where users, through the services they interact with and the way those services are constructed, are increasingly exposed only to information that confirms their worldview rather than a range of views and opinions. See: Eli Pariser, The Filter Bubble

General Data Protection Regulation (GDPR) A key piece of EU legislation designed to protect people’s data. It puts the onus of clarifying the use of data, and taking responsibility for its storage and deletion, on those who gather it.

Gestural interface An interface that uses interactions with the hands, face or body as opposed to a keyboard, mouse or other peripheral for control.

Graphical user interface (GUI) Pioneered at Xerox PARC in the 1970s and popularized by Apple in the early 1980s, the GUI is now standard for most human-computer interactions. GUIs use a system of icons to describe the processes and structures of a device rather than a command line.

Haptics Any form of interaction that involves touch.

Human-computer interaction (HCI) A broad field of research and development that predates contemporary user experience (UX) and user interface (UI) design. It refers to the study and design of interactions between humans and computers.

Imaginary A broad concept borrowed from social science that describes the collective symbolic understandings of a group or society. For example, we have a collective imaginary of artificial intelligence built from popular discourse, science fiction and technological innovation.

Internet of Things A broad, catch-all term used to refer to the idea of internet-connected devices and objects. This particularly refers to devices not necessarily used for communication, such as household objects – fridges, toasters and thermostats – which may be automated or controlled through the internet.

Internet protocol (IP) address A numerical identifier assigned to each device connected to the internet.


Internet service provider (ISP) Usually a telecommunications company that services and provides internet access to users.

Last mile problem A term borrowed into telecommunications from supply-chain logistics. It refers to the financial inefficiency at the outer reaches of systems, where there is little profitable activity but the same cost – for instance, in distant or rural areas.

Machine learning An approach to artificial intelligence in which computers are ‘trained’ on large data sets to infer or learn patterns, from which they can then formulate algorithms or instructions. This allows the computer to ‘predict’ future data.

Mesh network An alternative model of network infrastructure. Rather than centralized servers and nodes, the weight of the network is distributed across all users. This makes it hard to censor or take down, with no centralized points, but technically difficult to operate.

Net neutrality The concept that all information on the internet should be equally accessible. The alternative allows companies to regulate the speed of access to different sites. So, for instance, a video streaming service might bid for more bandwidth from an internet service provider, giving it preferential, easier and more reliable access to its users.

Open source Usually software or information that is free for public reading, consumption and/or modification. It is opposed to closed source, where the software or information is proprietary and protected.

Personal intelligent assistants (PIA) Apparently intelligent operating systems or interfaces. These often use conversational interfaces. Examples include Apple’s Siri, Microsoft’s Cortana and Amazon’s Alexa.

Phishing A technique used to trick users into handing over sensitive or personal information by duplicating or faking legitimate communication, often in the form of apparently legitimate emails or websites.

Protocol The base-level methods and agreements by which data is transferred and exchanged. Most commonly, these refer to DNS (Domain Name System) and TCP/IP (Transmission Control Protocol/Internet Protocol). See: Alexander Galloway, Protocol

Roach motel Design that makes it easy for the user to get into a certain situation and difficult to get out again, for example, making it easy to subscribe to a publication and difficult to opt out.


Seamlessness An approach to design that encourages smoother, frictionless interactions between users and services. The imperative is on reducing the number of clicks and the amount of user input between the user and their goal. The opposite is seamfulness.

Skeuomorphism When the design of a new interface or system is iconically or symbolically represented by older interactions. For instance, music apps may be designed to look like the front of analogue music systems like hi-fis.

Solutionism / Technological solutionism The belief that any problem can be solved by using technology. See: Evgeny Morozov, To Save Everything, Click Here

Speculative design A critical approach to design that utilizes design fiction to provoke debate about the development of technology and society. It proposes fictional devices or scenarios for audiences to critically consider and discuss. See: Anthony Dunne and Fiona Raby, Speculative Everything

The stack A concept popularized by theorist Benjamin Bratton. It conceptualizes the Earth and technological systems as a planetary-scale computational process, connecting resource extraction, pollution, commerce, society and everything else. See: Benjamin Bratton, The Stack

Systems theory A set of theories, across many domains, concerned with the interrelations of things and the way they interact as systems.

Technological determinism A theory, dominant in the early twentieth century, which suggests that technology shapes society and culture.

Terms of service / Terms of use An agreement between a user and a service for how the service can be used. These are most often used on websites, and consent is assumed if a user accesses the site or service.

Troll farm A group of people or an organization with the explicit aim of disrupting or corrupting information through social networks, sometimes en masse, sometimes directed at specific individuals. Historically, these were ad hoc groups of activists; recent years have seen governments financing troll farm operations.

Ubiquitous computing A speculative description of a mode of computation and digital devices embedded in everyday life, first posited by Mark Weiser and others in the 1990s. It is often seen as a theoretical precursor to the Internet of Things.

User-centred design A very common design method that involves adapting and responding to user needs and expectations.


User experience / UX design A design field and approach that aims to enhance the usability of interfaces and digital systems.

Virtual reality An interactive experience taking place in a computer-generated environment. It often utilizes gestural interfaces and audio and visual immersion.

War-chalking An early form of network activism in which security protocols and access methods for Wi-Fi networks were written in chalk on streets and walls near hotspots by activists, with the ambition of helping others to access them.

WYSIWYG (What you see is what you get) An environment or interface in which the results of a process are shown immediately or are directly viewable. It is one of the foundational concepts of graphical user interfaces.


References

2001: A Space Odyssey (1968), [Film] Dir. Stanley Kubrick. UK: Stanley Kubrick Productions.

Albrechtslund, A. (2008), ‘Online social networking as participatory surveillance’, First Monday, 13 (3). Available online: https://firstmonday.org/article/view/2142/1949 (accessed 16 August 2020).

Alexander, C. (1977), A Pattern Language: Towns, Buildings, Construction (Center for Environmental Structure Series), New York: Oxford University Press.

Apple Inc. (n.d.), ‘Apple human interface guidelines’, Apple. Available online: https://developer.apple.com/ios/human-interface-guidelines/user-interaction/audio/ (accessed 16 August 2020).

Arthur, C. (2015), ‘Whatever happened to Minority Report’s technology predictions?’, The Guardian, 18 September. Available online: https://www.theguardian.com/technology/2015/sep/18/minority-reports-technology-gestural-control-leap-motion (accessed 16 August 2020).

Artificial Intelligence Committee (2018), ‘AI in the UK: ready, willing and able?’. Available online: https://publications.parliament.uk/pa/ld201719/ldselect/ldai/100/10014.htm (accessed 16 August 2020).

Ash, J. (2015), The Interface Envelope: Gaming, Technology, Power, London: Bloomsbury.

Ashby, W. R. (1956), An Introduction to Cybernetics, London: Chapman and Hall.

Bach, V., Berger, M., Helbig, T. and Finkbeiner, M. (2015), ‘Measuring a product’s resource efficiency: a case study of smartphones’, in VI International Conference on Life Cycle Assessment, Lima, Peru: 133–136.

Banks, J. (2010), ‘Regulating hate speech online’, International Review of Law, Computers & Technology, 24 (3): 233–239.

Barbrook, R. and Cameron, A. (1996), ‘The Californian ideology’, Science as Culture, 6 (1): 44–72.

Bassett, C., Steinmueller, E. and Voss, G. (2013), ‘Better made up: the mutual influence of science fiction and innovation’, Nesta Working Paper, 13 (07), London: Nesta.

Bateson, G. (1972), Steps to an Ecology of Mind: Collected Essays in Anthropology, Psychiatry, Evolution, and Epistemology, Chicago: University of Chicago Press.

BBC (2010), ‘Apple boss defends conditions at iPhone factory’, BBC News, 2 June. Available online: https://www.bbc.co.uk/news/10212604 (accessed 16 August 2020).

Berman, F. and Cerf, V.G. (2017), ‘Social and ethical behavior in the internet of things’, Communications of the ACM, 60 (2): 6–7.

Design and digital interfaces

08/03/2021 11:13

Berners-Lee, T. (2017), ‘Three challenges for the Web, according to its inventor’, World Wide Web Foundation. Available online: https://webfoundation.org/2017/03/web-turns-28-letter/ (accessed 16 August 2020).

Bilton, N. (2016), ‘Exclusive: How Elizabeth Holmes’s house of cards came tumbling down’, Vanity Fair, 6 September. Available online: https://www.vanityfair.com/news/2016/09/elizabeth-holmes-theranos-exclusive (accessed 16 August 2020).

Bogost, I. (2018), ‘More bridges will collapse’, The Atlantic, 14 August. Available online: https://www.theatlantic.com/technology/archive/2018/08/the-age-of-precarious-infrastructure/567493/ (accessed 16 August 2020).

Bogost, I. (2018), ‘Sorry, Alexa is not a feminist’, The Atlantic, 24 January. Available online: https://www.theatlantic.com/technology/archive/2018/01/sorry-alexa-is-not-a-feminist/551291/ (accessed 16 August 2020).

Bootstrap (n.d.), Bootstrap.org. Available online: https://getbootstrap.com/ (accessed 16 August 2020).

Bosch, T. (2012), ‘Sci-fi writer Bruce Sterling explains the intriguing new concept of design fiction’, Slate, 2 March. Available online: https://slate.com/technology/2012/03/bruce-sterling-on-design-fictions.html (accessed 16 August 2020).

Bowles, C. (2018), Future Ethics, London: NowNext Press.

Bratton, B. (2012), ‘The Cloud, the State, and the Stack: Metahaven in conversation with Benjamin Bratton’, Mthvn.tumblr.com. Available online: http://mthvn.tumblr.com/post/38098461078/thecloudthestateandthestack (accessed 16 August 2020).

Bridle, J. (2011), ‘The new aesthetic’, James Bridle. Available online: https://jamesbridle.com/works/the-new-aesthetic (accessed 16 August 2020).

Bridle, J. (2012), ‘Dronestagram: the drone’s eye view’. Available online: http://booktwo.org/notebook/dronestagram-drones-eye-view/ (accessed 16 August 2020).

Bridle, J. (2014), ‘The algorithm method: how internet dating became everyone’s route to a perfect love match’, The Guardian, 9 February. Available online: https://www.theguardian.com/lifeandstyle/2014/feb/09/match-eharmony-algorithm-internet-dating (accessed 16 August 2020).

Brignull, H. (2011), ‘Dark patterns: deception vs honesty in UI design’, A List Apart, 1 November. Available online: https://alistapart.com/article/dark-patterns-deception-vs.-honesty-in-ui-design (accessed 16 August 2020).

Buchanan, R. (2001), ‘Human dignity and human rights: thoughts on the principles of human-centered design’, Design Issues, 17 (3): 35–39.

Burrell, J. (2016), ‘How the machine “thinks”: understanding opacity in machine learning algorithms’, Big Data and Society, 3 (1): 1–12.

Carbon, C. and Jakesch, M. (2013), ‘A model for haptic aesthetic processing and its implications for design’, Proceedings of the IEEE, 101 (9): 2123–2133.

Chalcraft, E. (2013), ‘UK government website wins Designs of the Year 2013’, Dezeen, 16 April. Available online: https://www.dezeen.com/2013/04/16/gov-uk-government-website-wins-designs-of-the-year-2013/ (accessed 17 August 2020).

Chalmers, M. and MacColl, I. (2003), ‘Seamful and seamless design in ubiquitous computing’, in Proceedings of Workshop at the Crossroads: The Interaction of HCI and Systems Issues in UbiComp.

Chambers, W&R. (2000), Chambers’s Twentieth Century Dictionary (Vol. 1), Thomas Davidson: Dictionary. Jakarta: Bukupedia.


Chandrashekar, A., Amat, F., Basilico, J. and Jebara, T. (2017), ‘Artwork personalization at Netflix’, Medium Netflix Technology Blog, 7 December. Available online: https://medium. com/netflix-techblog/artwork-personalizationc589f074ad76 (accessed 16 August 2020).

Chiu, W., Kwon, N., Miyake, K. and Nova, N. (2012), Curious Rituals 2.0. Available online: https:// curiousrituals.wordpress.com/ (accessed 16 August 2020).

Connor, S. (2017), Dream Machines, London: Open Humanities Press.

Crum, R. (2018), ‘Amazon’s Alexa can now “disengage” if asked sexually harassing questions’, SiliconBeat, 17 January. Available online: https://www.thestar.com.my/tech/technews/2018/01/19/amazons-alexa-can-nowdisengage-if-asked-sexually-harassing-questions (accessed 16 August 2020).

Curious Rituals: A Digital Tomorrow, (2014), [Film] Dir. Near Future Laboratory. Vimeo. Available online: https://vimeo.com/92328805 (accessed 16 August 2020).

Davis, E. (2015), Techgnosis: Myth, Magic and Mysticism in the Age of Information, Berkeley, California: North Atlantic Books.

Deleuze, G. (1988), Foucault, Minneapolis: University of Minnesota Press.

Deleuze, G. (1992), ‘What is a Dispositif?’, in T.J. Armstrong (ed.), Michel Foucault Philosopher, Hemel Hempstead: Harvester Wheatsheaf: 159–168.

Dewey, J. (1934), Art as Experience, New York: Perigee Books.

Di Placido, D. (2017), ‘YouTube’s “Elsagate” illuminates the unintended horrors of the digital age’, Forbes, 28 November. Available online: https://www.forbes.com/sites/ danidiplacido/2017/11/28/youtubes-elsagateilluminates-the-unintended-horrors-of-the-digitalage/#6b04cb276ba7 (accessed 16 August 2020).

DiSalvo, C. (2009), ‘Design and the construction of publics’, Design Issues, 25 (1): 48–63.

Dubberly, H. and Pangaro, P. (2015), ‘How cybernetics connects computing, counterculture, and design’, in Hippie Modernism: The Struggle for Utopia, 126–141.

Dunne, A. (2006), Hertzian Tales: Electronic Products, Aesthetic Experience, and Critical Design, Cambridge, Massachusetts: MIT Press.

Dunne, A. and Raby, F. (2013), Speculative Everything: Design, Fiction and Social Dreaming, Cambridge, Massachusetts: MIT Press.

Dunne, A. and Raby, F. (n.d.), Critical Design FAQ. Available online: http://www.dunneandraby.co.uk/ content/bydandr/13/0 (accessed 16 August 2020).

Ebersold, K. and Glass, R. (2016), ‘The Internet of Things: A cause for ethical concern’, Issues in Information Systems, 17 (4): 145–151.

Elfline, R. K. (2016), ‘Superstudio and the “Refusal to Work”’, Design and Culture, 8 (1): 55–77.

Elstrom, P. (2018), ‘Apple supplier workers describe noxious hazards at China factory’, Bloomberg News, 16 January. Available online: https://www.bloomberg.com/news/ articles/2018-01-16/workers-at-apple-suppliercatcher-describe-harsh-conditions (accessed 16 August 2020).

Fairphone. (n.d.), Available online: https://www. fairphone.com/en/ (accessed 16 August 2020).


Fallman, D. (2003), ‘Design-oriented human-computer interaction’, Human Factors in Computing Systems, the Proceedings of CHI (Association for Computing Machinery, 2003): 225–232.

Fang, L. (2019), ‘Google hedges on promise to end controversial involvement in military drone project’, The Intercept, 1 March. Available online: https://theintercept.com/2019/03/01/google-project-maven-contract/ (accessed 16 August 2020).

Farman, J. (2012), Mobile Interface Theory: Embodied Space and Locative Media, New York: Routledge.

Felton, E., Zelenko, O. and Vaughan, S. (eds.) (2013), Design and Ethics: Reflections on Practice, UK: Routledge.

Feminist Internet (n.d.). Available online: http://www.feministinternet.com (accessed 16 August 2020).

Field, M. (2018), ‘End of cryptocurrency gold rush hits demand for graphics card miners’, The Telegraph, 27 August. Available online: https://www.telegraph.co.uk/technology/2018/08/27/end-cryptocurrency-gold-rush-hits-demand-graphics-card-miners/ (accessed 16 August 2020).

Fisher, H. (2016), ‘Has online dating changed the way we love each other?’, Big Think. Available online: https://bigthink.com/videos/helen-fisher-on-the-ancient-brain-and-online-dating (accessed 16 August 2020).

Ford, P. (2015), ‘It’s Kind of Cheesy Being Green’, Medium, 11 February. Available online: https://medium.com/message/its-kind-of-cheesy-being-green-2c72cc9e5eda (accessed 16 August 2020).

Foroohar, R. (2018), ‘The growing public animosity towards large Silicon Valley platform technology companies and their Chinese equivalents’, Financial Times, 16 December. Available online: https://www.ft.com/content/76578fba-fca1-11e8-ac00-57a2a826423e (accessed 16 August 2020).

Foster, N. (2014), ‘The Future Mundane’, Core 77, 7 October. Available online: https://www.core77.com/posts/25678/the-future-mundane-25678 (accessed 16 August 2020).

Franklin, S. (2012), ‘Cloud control, or the network as medium’, Cultural Politics, 8 (3): 443–464.

Frayling, C. (1993), ‘Research in Art and Design’, RCA Research Papers, 1 (1): London Royal College of Art: 1–5.

Fuchs, C. (2017), Social Media: A Critical Introduction, Thousand Oaks, California: Sage.

Fuchsberger, V., Murer, M., Wurhofer, D., Meneweger, T., Neureiter, K., Meschtscherjakov, A., and Tscheligi, M. (2014), ‘The Multiple Layers of Materiality’, in Proceedings of the 2014 Companion Publication on Designing Interactive Systems, New York, USA: 73–76.

Fuller, M. and Goffey, A. (2012), Evil Media, Cambridge, Massachusetts: MIT Press.

Galloway, A. (2004), Protocol: How Control Exists after Decentralization, Cambridge, Massachusetts: MIT Press.

Galloway, A. (2012), The Interface Effect, Cambridge, UK: Polity Press.

Geertz, C. (2008), ‘Thick description: toward an interpretive theory of culture’, in The Cultural Geography Reader, Routledge: 41–51.

Gell, A. (1988), ‘Technology and Magic’, Anthropology Today, 4 (2): 6–9.


Gibson, W. (2012), 'Rocket Radio: William Gibson's 1989 essay on "The Net"', Motherboard, 5 April. Available online: https://motherboard.vice.com/en_us/article/qkkmxx/essential-reading-rocketradio (accessed 16 August 2020).

Gold, S. (2014), The Alternet. Available online: https://www.sarah.gold/work.html (accessed 16 August 2020).

GoodGym (n.d.). Available online: https://www.goodgym.org (accessed 16 August 2020).

Google Inc. (n.d.a), Developers. Available online: https://developers.google.com/speed/ (accessed 16 August 2020).

Google Inc. (n.d.b), Material Design. Available online: https://material.io/ (accessed 16 August 2020).

Gorczyca, J. (2017), 'Minority Report – 15 years later: how the film defined the future of interface design', Medium, 20 June. Available online: https://medium.com/helm-experience-design/minorityreport-15-years-later-328b15a7845a (accessed 16 August 2020).

Gray, C.M., Kou, Y., Battles, B., Hoggatt, J. and Toombs, A.L. (2018), 'The dark (patterns) side of UX design', in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, ACM: 534.

Greenberg, S., Boring, S., Vermeulen, J. and Dostal, J. (2014), 'Dark patterns in proxemic interactions: a critical perspective', in Proceedings of the 2014 Conference on Designing Interactive Systems, ACM: 523–532.

Grey Ellis, E. (2016), 'Gab, the alt-right's very own Twitter, is the ultimate filter bubble', WIRED, 14 September. Available online: https://www.wired.com/2016/09/gab-alt-rights-twitter-ultimatefilter-bubble/ (accessed 16 August 2020).


Gunning, D. (n.d.), 'Explainable artificial intelligence (XAI)', DARPA.mil. Available online: https://www.darpa.mil/program/explainable-artificial-intelligence (accessed 16 August 2020).

Harris, T. (2016), 'How technology hijacks people's minds — from a magician and Google's design ethicist', Medium, 19 May. Available online: https://medium.com/thrive-global/how-technology-hijacks-peoples-minds-from-a-magician-and-google-s-design-ethicist-56d62ef5edf3 (accessed 16 August 2020).

Hartley, J. (2013), 'Interaction design, mass communication and the challenge of distributed expertise', in Felton, E., Zelenko, O. and Vaughan, S. (eds), Design and Ethics: Reflections on Practice, UK: Routledge.

Hein, B. (2013), 'Jony Ive explains why he decided to gut skeuomorphism from iOS 7', Cult of Mac. Available online: https://www.cultofmac.com/246312/jony-ive-explains-why-he-decided-to-gut-skeuomorphism-out-of-ios/ (accessed 16 August 2020).

Hern, A. (2018), 'Google's solution to accidental algorithmic racism: ban gorillas', The Guardian, 12 January. Available online: https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people (accessed 16 August 2020).

Hill, D. (2017), 'For more identity, not less', Medium. Available online: https://medium.com/dark-matter-and-trojan-horses/for-more-identity-not-less-e5e85422e4e9 (accessed 16 August 2020).

Hodgkin, A.L. and Huxley, A.F. (1952), 'The dual effect of membrane potential on sodium conductance in the giant axon of Loligo', The Journal of Physiology, 116 (4): 497–506.


Hull, D. and Smith, T. (2018), 'Tesla driver died using autopilot, with hands off steering wheel', Bloomberg, 31 March. Available online: https://www.bloomberg.com/news/articles/2018-03-31/tesla-says-driver-s-hands-weren-t-on-wheel-at-time-of-accident (accessed 16 August 2020).

IF (n.d.), Data Licences. Available online: https://datalicences.projectsbyif.com/ (accessed 16 August 2020).

Ka Fai, C. (2011), Prospectus for a Future Body. Available online: http://www.ka5.info/prospectus.html (accessed 16 August 2020).

Kirby, D. (2010a), 'The future is now: diegetic prototypes and the role of popular films in generating real-world technological development', Social Studies of Science, 40 (1): 41–70.

Kirby, D. (2010b), Lab Coats in Hollywood: Science, Scientists and Cinema, Cambridge, Massachusetts: MIT Press.

Kloc, J. (2013), 'Greek community creates an off-the-grid internet', The Daily Dot. Available online: https://www.dailydot.com/layer8/greek-off-the-grid-internet-mesh/ (accessed 16 August 2020).

Kolko, J. (2010), 'Abductive thinking and sensemaking: the drivers of design synthesis', Design Issues, 26 (1): 15–28.

Krug, S. (2005), Don't Make Me Think: A Common Sense Approach to Web Usability, San Francisco: New Riders.

Kuchler, H. (2018), 'Twitter chief heads into free speech storm', Financial Times, 3 September. Available online: https://www.ft.com/content/2efdfbe4-ae00-11e8-8d14-6f049d06439c (accessed 16 August 2020).

Lafferty, M. (2016), 'Designing for television, Part 1: an introduction to the basic ingredients of a TV UI', Medium, 24 August. Available online: https://medium.com/this-also/designing-for-television-part-1-54508432830f (accessed 16 August 2020).

Latour, B. (2000), Pandora's Hope: Essays on the Reality of Science Studies, Cambridge, Massachusetts: Harvard University Press.

Latour, B. (2005), Reassembling the Social: An Introduction to Actor-Network-Theory, Oxford: Oxford University Press.

Lee, D. (2018), 'The tactics of a Russian troll farm', BBC, 16 February. Available online: http://www.bbc.co.uk/news/technology-43093390 (accessed 16 August 2020).

Leskin, P. (2018), 'Over a million people asked Amazon's Alexa to marry them in 2017 and it turned them all down', Business Insider, 10 October. Available online: https://www.businessinsider.com/amazons-alexa-got-over-1-million-marriage-proposals-in-2017-2018-10?r=US&IR=T (accessed 16 August 2020).

Levin, S. (2018), 'Is Facebook a publisher? In public it says no, but in court it says yes', The Guardian, 2 July. Available online: https://www.theguardian.com/technology/2018/jul/02/facebook-mark-zuckerberg-platform-publisher-lawsuit (accessed 16 August 2020).

Levy, S. (1994), Insanely Great: The Life and Times of Macintosh, the Computer that Changed Everything, New York: Viking Adult.

Licklider, J.C.R. (1960), 'Man-computer symbiosis', IRE Transactions on Human Factors in Electronics, (1): 4–11.

Liu, A. (2016), Drafts for Against the Cultural Singularity. Available online: http://liu.english.ucsb.edu/drafts-for-against-the-cultural-singularity/ (accessed 16 August 2020).

Lomax, B. (2015), 'The secret world of Tinder', Channel 4. Available online: https://www.imdb.com/title/tt4928498/ (accessed 16 August 2020).

Loughrey, C. (2017), 'Minority Report: 6 predictions that came true, 15 years on', The Independent, 25 June. Available online: https://www.independent.co.uk/arts-entertainment/films/features/minority-report-15th-anniversary-predictive-policing-gesture-based-computing-facial-and-optical-a7807666.html (accessed 16 August 2020).

MacArthur, E. (2017), The Circular Design Guide. Available online: https://www.circulardesignguide.com/ (accessed 16 August 2020).

MacDonald, K. (2018), 'No Man's Sky developer Sean Murray: "It was as bad as things can get"', The Guardian, 20 July. Available online: https://www.theguardian.com/games/2018/jul/20/no-mans-sky-next-hello-games-sean-murray-harassment-interview (accessed 16 August 2020).

Malpass, M. (2018), Critical Design in Context: History, Theory and Practices, London: Bloomsbury Academic.

Manovich, L. (2001), The Language of New Media, Cambridge, Massachusetts: MIT Press.

Manovich, L. (2013), Software Takes Command, London: Bloomsbury.

Manson, M. (2014), 'In the future our attention will be sold', Mark Manson, 4 December. Available online: https://markmanson.net/attention (accessed 16 August 2020).

Marsden, N. and Haag, M. (2016), 'Stereotypes and politics: reflections on personas', in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, New York: ACM: 4017–4031.

Maturana, H.R. (1980), 'Man and society', in Autopoiesis, Communication, and Society: The Theory of Autopoietic Systems in the Social Sciences, Campus Verlag: 11–32.

Mazzucato, M. (2018), The Entrepreneurial State: Debunking Public vs Private Sector Myths, London: Penguin/Random House.

McKim, J. (2017), 'Speculative animation: digital projections of urban past and future', Animation: An Interdisciplinary Journal, 12 (3): 287–305.

Merchant, B. (2017), 'Life and death in Apple's forbidden city', The Guardian, 18 June. Available online: https://www.theguardian.com/technology/2017/jun/18/foxconn-life-death-forbidden-city-longhua-suicide-apple-iphone-brian-merchant-one-device-extract (accessed 16 August 2020).

Michael, M. (2016), Introduction to Actor-Network Theory: Trials and Translations, New York: Sage Publications.

Microsoft: Productivity Future Vision (2015), [Film] Microsoft in Business, YouTube. Available online: https://www.youtube.com/watch?v=w-tFdreZB94 (accessed 16 August 2020).

Minority Report (2002), [Film] Dir. Steven Spielberg, USA: 20th Century Fox.

Moll, J. (2018), The Dating Brokers: An Autopsy of Online Love. Available online: https://datadating.tacticaltech.org/viz (accessed 16 August 2020).

Monteiro, M. (2017), 'A designer's code of ethics', Dear Design Student. Available online: https://deardesignstudent.com/a-designers-code-of-ethics-f4a88aca9e95 (accessed 16 August 2020).

Morozov, E. (2013), To Save Everything, Click Here: The Folly of Technological Solutionism, New York: Public Affairs.

Nascimento, E.C.C., da Silva, E. and Siqueira-Batista, R. (2018), 'The "use" of sex robots: a bioethical issue', Asian Bioethics Review, 10 (3): 231–240.

Nash, K. (2018), ‘Virtual reality witness: exploring the ethics of mediated presence’, Studies in Documentary Film, 12 (2): 119–131.


Neff, G. (2016), 'Alexa, does AI have gender?' Oxford Alumni, 15 October. Available online: https://www.research.ox.ac.uk/Article/2018-10-15-alexa-does-ai-have-gender (accessed 16 August 2020).

Nguyen, N. (2017), 'If you have a smart TV, take a closer look at your privacy settings', CNBC, 9 March. Available online: https://www.cnbc.com/2017/03/09/if-you-have-a-smart-tv-take-a-closer-look-at-your-privacy-settings.html (accessed 16 August 2020).

Norman, D. (2010), 'Gestural interfaces: a step backwards in usability', Jnd.org, 28 May. Available online: https://jnd.org/gestural_interfaces_a_step_backwards_in_usability_6/ (accessed 16 August 2020).

Oblong Industries (n.d.), 'g-speak is Oblong's core technology platform', Oblong Industries. Available online: https://platform.oblong.com/ (accessed 16 August 2020).

Ojok, O. (2017), 'South Sudan's unholy trinity: ethnicity, hate speech and social media', Medium, 6 June. Available online: https://medium.com/@donnasojok/south-sudans-unholy-trinity-ethnicity-hate-speech-and-social-media-c5073eb30d7c (accessed 16 August 2020).

Onarheim, B. and Wiltschnig, S. (2010), 'Opening and constraining: constraints and their role in creative processes', in Proceedings of the 1st DESIRE Network Conference on Creativity and Innovation in Design: 83–89.

O'Neil, C. (2016), Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, London: Penguin.

Osborne, H. and Parkinson, H. J. (2018), 'Cambridge Analytica scandal: the biggest revelations so far', The Guardian, 22 March. Available online: https://www.theguardian.com/uk-news/2018/mar/22/cambridge-analytica-scandal-the-biggest-revelations-so-far (accessed 16 August 2020).

Ostry, J. D. et al. (2016), 'Neoliberalism: Oversold?' Finance and Development, 53 (2): 38–41.

Pariser, E. (2011), The Filter Bubble: What the Internet is Hiding from You, London: Penguin.

Pasick, A. (2017), 'If you thought it was creepy for Uber to track you at all times, it was actually much worse', Quartz, 3 March. Available online: https://qz.com/924459/ubers-greyball-surveillance-program-tracked-police-phones/ (accessed 17 August 2020).

Peirce, C. S. (1988), 'Pragmatism as the logic of abduction', in The Essential Peirce: Selected Philosophical Writings, 1893–1913, Bloomington: Indiana University Press.

Phillip, L., Cottrill, C., Farrington, J., Williams, F. and Ashmore, F. (2017), 'The digital divide: patterns, policy and scenarios for connecting the "final few" in rural communities across Great Britain', Journal of Rural Studies, 54: 386–398.

Pownell, A. (2018), 'Design Museum removes a third of work from Hope to Nope exhibition, on artists' request', Dezeen, 8 August. Available online: https://www.dezeen.com/2018/08/08/design-museum-hope-to-nope-artist-remove-work-design-news/ (accessed 17 August 2020).

Qiu, X., Oliveira, D.F., Shirazi, A.S., Flammini, A. and Menczer, F. (2017), 'Limited individual attention and online virality of low-quality information', Nature Human Behaviour, 1 (7).

Raymond, E. S. (2000), The Cathedral and The Bazaar. Available online: http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/index.html (accessed 17 August 2020).

Rayome, A. D. (2017), 'DDoS attacks increased 91% in 2017 thanks to IoT', TechRepublic, 20 November. Available online: https://www.techrepublic.com/article/ddos-attacks-increased-91-in-2017-thanks-to-iot/ (accessed 17 August 2020).


Rebaudengo, S. (2015), Ethical Things. Available online: http://www.simonerebaudengo.com/project/ethicalthings (accessed 17 August 2020).

Republic of Estonia (n.d.), e-Residency. Available online: https://e-resident.gov.ee/become-an-e-resident/ (accessed 17 August 2020).

Rheingold, H. (2000), 'Tools for thought'. Available online: http://www.rheingold.com/texts/tft/09.html (accessed 17 August 2020).

Rittel, H. and Webber, M.M. (1973), 'Dilemmas in a general theory of planning', Policy Sciences, 4: 155–169.

Roch, A. (1996), 'Fire-control and human-computer interaction: towards a history of the computer mouse (1940–1965)', in Lab. Jahrbuch der Kunsthochschule für Medien in Köln, trans. D. Mindell. Available online: http://web.stanford.edu/dept/SUL/library/prod/siliconbase/wip/control.html (accessed 17 August 2020).

Romanosky, S., Libicki, M. C., Winkleman, Z. and Tkacheva, O. (2015), Internet Freedom Software and Illicit Activity: Supporting Human Rights Without Enabling Criminals, RAND Corporation.

Sanders, E.B.N. and Stappers, P.J. (2014), 'Probes, toolkits and prototypes: three approaches to making in codesigning', CoDesign, 10 (1): 5–14.

Schweidel, D.A. and Moe, W.W. (2016), 'Binge watching and advertising', Journal of Marketing, 80 (5): 1–19.

Scott, J. C. (1998), Seeing Like A State: How Certain Schemes to Improve the Human Condition Have Failed, New Haven, Connecticut: Yale University Press.

Selbst, A. D. and Barocas, S. (2018), 'The intuitive appeal of explainable machines', Fordham Law Review, forthcoming.


Shafrir, T. and Türekten, F. (2017), 'Modern love: how do we think about love in the age of online dating?' Jajajanenene. Available online: https://www.jajajaneeneenee.com/shows/fiber-festival-1/ (accessed 17 August 2020).

Shulevitz, J. (2014), 'Siri, you're messing up a generation of children', New Republic, 3 April. Available online: https://newrepublic.com/article/117242/siris-psychological-effects-children (accessed 17 August 2020).

Siegel, A. A. (2018), 'Online hate speech', in Social Media and Democracy: The State of the Field. Available online: https://www.cambridge.org/core/books/social-media-and-democracy/E79E2BBF03C18C3A56A5CC393698F117 (accessed 2019).

Simon, H.A. (1988), 'The science of design: creating the artificial', Design Issues, 4 (1/2): 67–82.

Sin, B. (2016), 'Latest Foxconn worker deaths build case for Apple to move operations from China', Forbes, 22 August. Available online: https://www.forbes.com/sites/bensin/2016/08/22/the-real-cost-of-the-iphone-7-more-foxconn-worker-deaths (accessed 17 August 2020).

Sinning, S. (2018), 'OpenAI withholds release of new language model because it's too good', DZone, 18 February. Available online: https://dzone.com/articles/openai-withholds-release-of-new-language-model-bec (accessed 17 August 2020).

Sony (n.d.), Aibo. Available online: https://us.aibo.com/feature/feature1.html (accessed 17 August 2020).

Spencer, G. (2012), 'Loren Brichter talks about pull-to-refresh patent and design process', MacStories, 28 March. Available online: https://www.macstories.net/news/loren-brichter-talks-about-pull-to-refresh-patent-and-design-process/ (accessed 17 August 2020).


Stecklow, S. (2018), 'Why Facebook is losing the war on hate speech in Myanmar', Reuters, 15 August. Available online: https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/ (accessed 17 August 2020).

Steiner, E. and Xu, K. (2018), 'Binge-watching motivates change: uses and gratifications of streaming video viewers challenge traditional TV research', Convergence.

Stern, J. (2018), 'Ugh, green bubbles! Apple's iMessage makes switching to Android hard', The Wall Street Journal, 18 October. Available online: https://www.wsj.com/articles/ugh-green-bubbles-apples-imessage-makes-switching-to-android-hard-1539867600 (accessed 17 August 2020).

Stilgoe, J. (2017), ‘Machine learning, social learning and the governance of self-driving cars’, Social Studies of Science, 48 (1): 25–66.

Stone, L., (n.d.), ‘Continuous partial attention’, Linda Stone. Available online: https://lindastone. net/faq/ (accessed 17 August 2020).

Superstudio (1969), Il Monumento Continuo.

Taplin, J. (2017), Move Fast and Break Things: How Facebook, Google, and Amazon have Cornered Culture and What it Means for All of Us, London: Pan Macmillan.

Taylor, A. (2016), 'Russia accuses Google Maps of "topographical cretinism"', The Washington Post, 7 July. Available online: https://www.washingtonpost.com/news/worldviews/wp/2016/07/29/russia-accuses-google-maps-of-topographical-cretinism/?utm_term=.0ef74ab60917 (accessed 17 August 2020).

Tonkinwise, C. (2016), 'Designing in an era of xenophobia', The Radical Designist, (4).

Tsimitakis, M. (2013), 'The Greek government tried and failed to close their BBC', Vice News, 15 June. Available online: https://www.vice.com/en_uk/article/av4e9j/shutting-down-ert-could-bring-the-collapse-of-the-greek-government (accessed 17 August 2020).

Tufekci, Z. (2017), Twitter and Tear Gas: The Power and Fragility of Networked Protest, New Haven, Connecticut: Yale University Press.

Turkle, S. (2005), The Second Self: Computers and the Human Spirit, Cambridge, Massachusetts: The MIT Press.

Turner, C. (2018), 'Children are swiping books in attempt to turn pages after being raised on tablets and phones', The Telegraph, 2 April. Available online: https://www.telegraph.co.uk/news/2018/04/02/children-swiping-books-attempt-turn-pages-raised-tablets-phones/ (accessed 17 August 2020).

Uninvited Guests (2015), [Film] Superflux, Vimeo. Available online: https://vimeo.com/128873380 (accessed 17 August 2020).

Voros, J. (2017), 'Big history and anticipation', in Poli, R. (ed.), Handbook of Anticipation, Cham: Springer.

Wagner, C. (2018), 'Sexbots: the ethical ramifications of social robotics' dark side', AI Matters, 3 (4): 52–58.

Ward, M. (n.d.), 'Matt Ward: speculation is part of every designer's practice', Speculative. Available online: http://speculative.hr/en/matt-ward/ (accessed 17 August 2020).

Weiser, M. and Brown, J.S. (1997), 'The coming age of calm technology', in Beyond Calculation, New York: Springer: 75–85.

Wilson, M. (2013), 'How GE branded my unborn baby', Fast Company, 18 September. Available online: https://www.fastcompany.com/3017504/how-ge-branded-my-unborn-baby (accessed 17 August 2020).


Wilson, M. (2016), 'The UX secret that will ruin apps for you', Fast Company, 7 June. Available online: https://www.fastcompany.com/3061519/the-ux-secret-that-will-ruin-apps-for-you (accessed 17 August 2020).

Wolman, D. (2013), 'Facebook, Twitter help the Arab Spring blossom', WIRED, 16 April. Available online: https://www.wired.com/2013/04/arabspring/ (accessed 17 August 2020).

Wood, Z., Parry, G., Caruthers, J. and Rose, K. (2017), 'Assessing the impact of digital innovations in the London transportation network', University of the West of England.

World Design Summit (2017). Available online: https://worlddesignsummit.com (accessed 17 August 2020).

Wu, T. (2017), The Attention Merchants: How Our Time and Attention are Gathered and Sold, London: Atlantic Books.

Yonatan, H. (2017), 'Ethics in user experience design', Usability Geek, 4 April. Available online: https://usabilitygeek.com/ethics-in-user-experience-design/ (accessed 17 August 2020).

Young, S. (2016), 'Ghosting, benching and DTR: what these 13 popular dating terms really mean', The Independent, 20 December. Available online: https://www.independent.co.uk/life-style/love-sex/dating-relationship-terms-terms-what-they-mean-game-ghosting-benching-dtr-fbo-thirst-trap-a7486511.html (accessed 17 August 2020).

Yury, C. (2014), 'Turning desire into an app: 5 questions for Sean Rad, CEO of Tinder', Huffington Post, 7 June. Available online: https://www.huffingtonpost.com/carrie-yury/sean-rad-ceo-of-tinder-on_b_5087420.html (accessed 17 August 2020).

Zuboff, S. (2015), ‘Big other: surveillance capitalism and the prospects of an information civilization’, Journal of Information Technology, (30): 75–89.


Acknowledgements

John Fass would like to thank the students and staff of MA UX at LCC for informing much of what is in this book, and of course Tobias, Eva and Ben for such an enjoyable writing experience. Special thanks to Mushon Zer-Aviv and Dan Lockton for giving their time to be interviewed. Finally, thanks to Stella, Skye, Leila and Lisa, without whom nothing in life is possible.

Tobias Revell would like to thank his friends and colleagues John, Eva and Ben for their support and insight during the writing of this book, and the Design School at LCC for giving us the space to do it. We laughed, talked and thought. Thanks also to Anab Jain for her time for interview, as well as the numerous contributors who responded to our requests for input. Thanks to Sydney Hogdahl for helping us tie up loose ends.

Ben Stopher would like to thank his fellow contributors Tobias, John and Eva for their continued effort in pulling this book together; it has been an enjoyable journey and facilitated time together we would not otherwise have had. Special thanks from Ben also go to Sarah Gold for an insightful interview and Conor Rigby for editorial design and general patience. He would also like to thank Rose and Otto, who contributed through the sweetest distraction.

Eva Verhoeven would like to thank the Design School at the London College of Communication for enabling her to have time for thinking, reflecting and writing. Most of all she wants to thank Tobias, John and Ben for this collective experience of writing together. Thanks also to everyone who helped to bring this book together by chasing image permissions, designing layouts and contributing through insightful interviews. A special thanks to Rosa, Finn and Pete, for being there.


Index

Page locators in italic refer to figures.

access
  internet 49, 62, 67, 68–9, 70, 70
  One Laptop per Child (OLPC) 46–8
  and openness 67–9
actor network theory 34
aesthetic interfaces 92–107
aggregation interfaces 36
Aibo 44, 45
Alexander, Christopher 101
algorithms 22, 48, 54, 64, 81, 87, 132
Alibaba 21
Amazon 85
  Alexa 32, 45, 98, 99
  Echo 32
anthropomorphism 44
Apple 95, 103, 106, 119
  AirPods 95–6, 96
  Force Touch 97
  iMessage 99
  iPhone 83
artificial intelligence (AI) 44, 118, 139
  'AI in the UK: ready, willing and able?' 81
  Open AI 87
  Project Maven 90
Ash, J. 23, 32
Association of Computing Machinery (ACM) 85, 119
Athens Wireless Metropolitan Network (AWMN) 68–69, 68
auditory aesthetics 95–6, 96, 98–9
automated design processes 22, 22
autopsies, digital 28
Berners-Lee, Tim 12, 69
biases 48, 64, 99, 107
Big Society 43
binge-watching 30
black boxes 17, 71, 82–3
Bluetooth 95
Bootstrap framework 101–102, 102
Bratton, Benjamin 24, 62
Bridle, James 74, 75, 99–100
Brignull, Harry 104
Burrell, Jenna 71
Cambridge Analytica 53, 64, 72
children 45, 87, 106
  One Laptop per Child (OLPC) 46–8
Choy Ka Fei 90, 90
circular design 111
cloud computing 29–30
codes of ethics 85, 91
collaborative interfaces 48–51
complexity
  and fragmentation 26–39
  theory 34–5
computational interface design 21–2
constraints in design 36–7, 126–7
constructionism 46
critical
  interfaces 72–4
  practice 65–7, 66, 90, 116, 118
crowdsourcing 42, 50, 89
cryptocurrencies 121
cultural
  aesthetics and meaning 98–100
  interfaces 18–20
Cultural Norte Americano (CNA) 52


culture
  ethical design 85–6
  good interface design and 145
Curious Rituals 117, 117
cybernetics 34–37
dark patterns 78, 104, 104, 105
data
  capture 33, 71, 78, 80, 135–6
  patterns 141–42
  privacy and ownership 52, 71, 81
Data Licences project 73
dating apps 18, 54, 86
definition of interface 17
Deleuze, Gilles 41, 60
demonstrations of technology 119–121
design
  approaches to complexity and fragmentation 35–7
  fiction 114, 115–18
  imaginaries 118–20
  patterns and behaviours 100–105
Design Council 85
deviant interfaces 121–2
Dewey, John 59, 93
Diaspora 52–3, 53
DiSalvo, Carl 59, 65–66, 114
distractions 47
distributed denial-of-service (DDOS) attacks 121–22
diverse teams 85, 145
Dronestagram 74, 75, 100
Dunne, Anthony 65, 67, 72
e-Residency cards 56–57
election campaigns 49, 64, 72, 80
'Elsagate' scandal 86–7
empathy, aesthetics for 106–107
End User License Agreements (EULA) 71
Engelbart, Douglas C. 19, 119
entangled interfaces 61–62
ethical
  design cultures 84–6
  designers 87–90, 134–5
  interfaces 45, 76–90
  legibility 82–4
  principles, futuring of 86–87
'Ethical Things' 89
exploitation, design as 78–80
Facebook 36, 53, 55, 71
  alternative models to 52–3, 53
  Cambridge Analytica and 53, 64, 71
  dark patterns 78
  effect on political change 64
  internet access project 49
Fairphone 83, 84
'fake news' 12, 64, 80
Fallman, Daniel 36
Feminist Internet 99
filter bubbles 12, 64
Foster, Nick 117–18
fragmentation process 29
fragmented
  attention 32–3
  devices 31–33
  distribution 29–31
Franklin, Seb 29, 30
Frayling, Christopher 38
Fuchs, Christian 80
Fuller, Matthew 31, 32
functional aesthetics 94–5, 94
future
  design fictions 115–18
  design imaginaries 118–20
  science fiction 112–15
  technologies and ethical approaches 86–7
  and uncertainty 109–111
'The Future Mundane' 117–18
Futures Cone 110, 111
Gab 54
Galloway, Alexander 23, 30, 32, 41, 62
Gell, Alfred 60
General Data Protection Regulation (GDPR) 81
General Electric 27, 28
gestural interfaces 113, 113, 114, 115
Gibson, William 121
glitch art 100, 100
Goffey, Andrew 31–32
Gold, Sarah 72, 141–46
GoodGym 42–3, 44


Google 22, 78, 83, 84, 97, 105, 145
  Earth 73, 75
  Maps 50, 63
  'material design' framework 34, 105
  Photos 48
  Project Ara 111, 112
  Project Maven 91
Government Digital Service (GDS) 94–5, 94
GPS systems 64
graphical user interfaces (GUI) 20, 23, 119
  Sugar 46–7, 47
haptic design 96–7
Harris, Tristan 47
Hartley, John 83
healthcare 44–46, 81, 82
historical interfaces 19–20
hyperlinks 95
IF 73, 73–74
  Sarah Gold interview 141–46
imaginaries, design 118–21
India 49
Indienet 79
infrastructures 24, 61, 62
  alternative forms of network 67–8, 68, 72–3
innovation and myth of public vs private sectors 43–4
inscrutability and opacity 69–72
intellectual property safeguards 69, 71, 72
internet access 49, 62, 67, 68–9, 70, 70
Internet of Things (IoT) 88–89, 89, 120, 120–123
internet service providers (ISPs) 62
interviews 124–46
'intrafaces' 23
Jain, Anab 125–28
Kolko, Jon 36
Kubrick, Stanley 114
Lafferty, Molly 30–1
'last mile problem' 68
Latour, Bruno 34, 71, 83
legal agreements 71
legibility, ethical 82–4
legislation 81–3
Licklider, Joseph Carl Robnett 20
living with digital interfaces, effects of 129–32
Lockton, Dan 129–35
London Design Museum 88–91
  Design of the Year Award 94
loneliness, combatting 43, 44–5
machine learning 48, 70, 81, 82
Manovich, Lev 18, 19, 23, 31
Manson, Mark 33
Match Group 54
Mazzucato, Mariana 43–4
McKim, Joel 115
media content, distribution of 30–1
mediation 23, 41
medical ethics 82
Menkman, Rosa 100
mesh networks 67–8, 68, 74
meta-design 38
metaphors 130–1, 132, 137–8
methods of interface design 21–2
Microsoft 97, 113
  Productivity Future Vision 116, 116, 118
  'Tay' 48
Minority Report 112–14, 113, 115
misinformation online 33, 64, 80
Monteiro, Mike 77
Morozov, Evgeny 67
multidisciplinary teams 145
Nash, Kate 86
Natalini, Adolfo 65
Near Future Laboratory 117, 117–118
net neutrality 49, 62, 69
Netflix 21, 22, 30, 31
'New Aesthetic' 100
news sites 35–6
oN-Line System (NLS) 19–20
One Laptop per Child (OLPC) 46–8, 47
online
  dating 18, 53–5, 86
  hate speech 80
  pornography 86
opacity and inscrutability 69–71
Open AI 87
open-source software 46, 83, 84


openness and access 67–9
PARO 45
participative/co-design 38
Peirce, Charles 36
personal computers 19, 20, 69, 119
personal intelligent assistants (PIAs) 32, 45, 97–8, 98–100, 133
personas 106–107, 107
political action of interfaces 63–4
political interfaces 59–60
  entangled 61–2
political polarization 12
popularity algorithms 87, 134
'power of the crowd' 42
'primal branding' 27, 28
Productivity Future Vision 116, 116, 118
proprioception 97
'Prospectus for a Future Body' 90, 90
public vs private sectors, myth of 43–4
pull-to-refresh 103, 103, 106
Raby, Fiona 65, 67, 72
Rad, Sean 55
Rebaudengo, Simone 88–89, 89
research methods 37–8
responsive interface design 32
'roach motels' 78, 104
robots 44–5, 67, 86
Samsung 32, 71
science fiction and design 114–17
Scott, James C. 63
'seamful' and 'seamless' interfaces 85, 102, 105, 126, 141, 144–5
senses, aesthetics and 94–8
Shafrir, Tamar 54–5
skeuomorphism 105–106
skills and tasks for interface design 20–22
smartphones 23, 32, 83, 84, 101, 103, 111
smartwatches 32, 61
Snowden, Edward 78, 79
social
  identities, constructing 53–6
  impact, design for 42–4, 42, 48, 49, 88
  interfaces 40–56
sociality, interfaces for 51–3
socializing interfaces 44–6
solutionism 67
Song of the Machine 125–6, 126, 127
sonic aesthetics 95–6, 96, 98–9
Sony 44, 45, 122
Speaking Exchange 52
speculative design 73, 73, 74, 118
Spielberg, Steven 112, 113
The Stack 24, 62
Stanford Research Institute (SRI) 19–20
Stilgoe, Jack 63
Stone, Linda 30
streaming services 22, 22, 30–1, 31, 62
Sugar 46–7, 47
Superflux
  Anab Jain interview 125–28
  Song of the Machine 125–6, 126
  Uninvited Guests 119–20, 120
Superstudio 65, 66, 72
surveillance capitalism 78, 79, 79
synthesis 36
systems theory 34–35
tasks and skills for interface design 20–22
techno-solutionism 67
technological
  approaches to complexity and fragmentation 32–5
  interfaces 17–18
temporal aesthetics 97–8
Terms of Service (TOS) 71
Tesla 98
theoretical perspectives 22–4
Theranos 122
Tinder 54–55, 86
Tonkinwise, Cameron 106, 107
'troll farms' 64
Tufekci, Zeynep 64
Türekten, Füsun 55
Turkle, Sherry 45, 46
Twitter 49, 53, 64, 103, 103
Two Thousand and One: A Space Odyssey 114
Uber 42, 59–61, 82, 85
ultrasound 27, 28


uncertainty 109–110, 122
  embracing 111–112
Underkoffler, John 113, 114, 115
unforeseen consequences 80–1
Uninvited Guests 120, 120
user-centred design 105–106
user needs 69, 106–107, 143–4
Ushahidi 50, 50
Verificado19s 51
video conferencing 51
viral media 33
virtual reality 86, 106, 117
visual aesthetics 94, 95
voice interfaces 32, 45, 97, 98, 99, 133
Voros, Joseph 110, 111
war-chalking 70
web browser software 33, 83
western-centrism 48, 107
  moving beyond 48–9
Wilson, Mark 27, 28
World of Warcraft 23
World Wide Web 12, 35, 69
Wu, Tim 33
Xerox PARC 20
YouTube 87
Zer-Aviv, Mushon 136–40
Zuboff, Shoshana 78
Zuckerberg, Mark 49

