Security and Privacy [III]
ISBN 1315243563, 9781315243566


English, 540 pages, 2016


Table of contents:

Cover
Half Title
Title
Copyright
Contents
Acknowledgements
Series Preface
Introduction

PART I IDENTITY, SECURITY AND PRIVACY IN CONTEXT
1 L. Jean Camp (2002), 'Designing for Trust', in R. Falcone, S. Barber, L. Korba and M. Singh (eds), Trust, Reputation, and Security: Theories and Practice, Berlin–Heidelberg–New York: Springer-Verlag, pp. 15-29.
2 Roger Clarke (1994), 'The Digital Persona and Its Application to Data Surveillance', The Information Society, 10, pp. 77-92.
3 Julie E. Cohen (2008), 'Privacy, Visibility, Transparency, and Exposure', University of Chicago Law Review, 75, pp. 181-201.
4 Oscar H. Gandy, Jr (2000), 'Exploring Identity and Identification in Cyberspace', Notre Dame Journal of Law, Ethics and Public Policy, 14, pp. 1085-111.
5 Helen Nissenbaum (2011), 'A Contextual Approach to Privacy Online', Dædalus, the Journal of the American Academy of Arts & Sciences, 140, pp. 32-48.

PART II SURVEILLANCE, SECURITY AND ANONYMITY
6 Benoît Dupont (2008), 'Hacking the Panopticon: Distributed Online Surveillance and Resistance', Surveillance and Governance, Sociology of Crime, Law and Deviance, 10, pp. 257-78.
7 Kevin D. Haggerty and Richard V. Ericson (2000), 'The Surveillant Assemblage', The British Journal of Sociology, 51, pp. 605-22.
8 Steve Mann (2004), '"Sousveillance": Inverse Surveillance in Multimedia Imaging', Proceedings of the ACM Multimedia, pp. 620-27.
9 Torin Monahan (2011), 'Surveillance as Cultural Practice', The Sociological Quarterly: Official Journal of the Midwest Sociological Society, 52, pp. 495-508.
10 Walter Peissl (2003), 'Surveillance and Security: A Dodgy Relationship', Journal of Contingencies and Crisis Management, 11, pp. 19-24.
11 Torin Monahan (2006), 'Counter-Surveillance as Political Intervention?', Social Semiotics, 16, pp. 515-34.
12 Oliver Leistert (2012), 'Resistance against Cyber-Surveillance within Social Movements and How Surveillance Adapts', Surveillance & Society, 9, pp. 441-56.
13 Richard A. Posner (2008), 'Privacy, Surveillance, and Law', University of Chicago Law Review, 75, pp. 245-60.
14 Joseph A. Cannataci (2010), 'Squaring the Circle of Smart Surveillance and Privacy', Fourth International Conference on Digital Society, pp. 323-28.

PART III PRIVACY, DATA PROTECTION AND SECURITY
15 Lee A. Bygrave (1998), 'Data Protection Pursuant to the Right to Privacy in Human Rights Treaties', International Journal of Law and Information Technology, 6, pp. 247-84.
16 Ian Loader (1999), 'Consumer Culture and the Commodification of Policing and Security', Sociology, 33, pp. 373-92.
17 Roger A. Clarke (1988), 'Information Technology and Dataveillance', Communications of the ACM, 31, pp. 498-512.
18 Vincenzo Pavone and Sara Degli Esposti (2012), 'Public Assessment of New Surveillance-Oriented Security Technologies: Beyond the Trade-Off between Privacy and Security', Public Understanding of Science, 21, pp. 556-72.
19 John T. Billings (2012), 'European Protectionism in Cloud Computing: Addressing Concerns over the PATRIOT Act', CommLaw Conspectus: Journal of Communications Law and Policy, 21, pp. 211-31.
20 Lisa Madelon Campbell (2011), 'Internet Intermediaries, Cloud Computing and Geospatial Data: How Competition and Privacy Converge in the Mobile Environment', Competition Law International, 7, pp. 60-66.
21 Jennifer Whitson and Kevin D. Haggerty (2007), 'Stolen Identities', Criminal Justice Matters, 68, pp. 39-40.

PART IV SMART TECHNOLOGIES, SOCIAL CONTROL AND HUMAN RIGHTS
22 Lee A. Bygrave (2010), 'The Body as Data? Biobank Regulation via the "Back Door" of Data Protection Law', Law, Innovation and Technology, 2, pp. 1-25.
23 William Webster (2009), 'CCTV Policy in the UK: Reconsidering the Evidence Base', Surveillance & Society, 6, pp. 10-22.
24 Barrie Sheldon (2011), 'Camera Surveillance within the UK: Enhancing Public Safety or a Social Threat?', International Review of Law, Computers & Technology, 25, pp. 193-203.
25 Shara Monteleone (2012), 'Privacy and Data Protection at the Time of Facial Recognition: Towards a New Right to Digital Identity?', European Journal of Law and Technology, 3(3). Online.
26 Jeffrey Rosen (2012), 'The Deciders: The Future of Privacy and Free Speech in the Age of Facebook and Google', Fordham Law Review, 80, pp. 1525-38.
27 Donald A. Norman (1999), 'Affordance, Conventions, and Design', Interactions, 6, pp. 38-42.
28 Deborah C. Peel (2013), 'eHealth: Roadmap to Finding a Successful Cure for Privacy Issues', Data Protection Law and Policy, 10, pp. 14-16.
29 Daniel L. Pieringer (2012), 'There's No App for That: Protecting Users from Mobile Service Providers and Developers of Location-Based Applications', University of Illinois Journal of Law, Technology & Policy, 2012, pp. 559-77.
30 Christopher Wolf (2013), 'The Privacy Bill of Rights: What Are the Expectations for 2013?', Data Protection Law and Policy, 10, pp. 4-5.

Name Index


Security and Privacy

The Library of Essays on Law and Privacy
Series Editor: Philip Leith


Titles in the Series:

The Individual and Privacy, Volume I, Joseph A. Cannataci
Privacy in the Information Society, Volume II, Philip Leith
Security and Privacy, Volume III, Joseph Savirimuthu

Security and Privacy


Volume III

Edited by

Joseph Savirimuthu
University of Liverpool, UK

First published 2015 by Ashgate Publishing
Published 2016 by Routledge
2 Park Square, Milton Park, Abingdon, Oxon OX14 4RN
711 Third Avenue, New York, NY 10017, USA


Routledge is an imprint of the Taylor & Francis Group, an informa business

Copyright © 2015 Joseph Savirimuthu. For copyright of individual articles please refer to the Acknowledgements.

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.

Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Wherever possible, these reprints are made from a copy of the original printing, but these can themselves be of very variable quality. Whilst the publisher has made every effort to ensure the quality of the reprint, some variability may inevitably remain.

British Library Cataloguing in Publication Data

A catalogue record for this book is available from the British Library.

Library of Congress Control Number: 2014946855
ISBN 9781409444879 (hbk)


Acknowledgements

Ashgate would like to thank our researchers and the contributing authors who provided copies, along with the following for their permission to reprint copyright material.

Association for Computing Machinery, Inc. for the essays: Steve Mann (2004), '"Sousveillance": Inverse Surveillance in Multimedia Imaging', Proceedings of the ACM Multimedia, pp. 620-27. Copyright © 2004 ACM. Reprinted by permission; Roger A. Clarke (1988), 'Information Technology and Dataveillance', Communications of the ACM, 31, pp. 498-512. Copyright © 1988 ACM. Reprinted by permission; Donald A. Norman (1999), 'Affordance, Conventions, and Design', Interactions, 6, pp. 38-42.

The Catholic University of America, Columbus School of Law for the essay: John T. Billings (2012), 'European Protectionism in Cloud Computing: Addressing Concerns over the PATRIOT Act', CommLaw Conspectus: Journal of Communications Law and Policy, 21, pp. 211-31. Copyright © 2012 The Catholic University of America.

Data Protection Law & Policy for the essays: Deborah C. Peel (2013), 'eHealth: Roadmap to Finding a Successful Cure for Privacy Issues', Data Protection Law and Policy, 10, pp. 14-16; Christopher Wolf (2013), 'The Privacy Bill of Rights: What Are the Expectations for 2013?', Data Protection Law and Policy, 10, pp. 4-5.

Emerald Group Publishing Ltd for the essay: Benoît Dupont (2008), 'Hacking the Panopticon: Distributed Online Surveillance and Resistance', Surveillance and Governance, Sociology of Crime, Law and Deviance, 10, pp. 257-78. Copyright © 2008 by Emerald Group Publishing Limited. All rights of reproduction in any form reserved.

Fordham Law Review for the essay: Jeffrey Rosen (2012), 'The Deciders: The Future of Privacy and Free Speech in the Age of Facebook and Google', Fordham Law Review, 80, pp. 1525-38.

Gibson, Dunn & Crutcher LLP for the essay: Lisa Madelon Campbell (2011), 'Internet Intermediaries, Cloud Computing and Geospatial Data: How Competition and Privacy Converge in the Mobile Environment', Competition Law International, 7, pp. 60-66 (www.ibanet.org/Publications/competition_law_international.aspx).

Hart Publishing Ltd for the essay: Lee A. Bygrave (2010), 'The Body as Data? Biobank Regulation via the "Back Door" of Data Protection Law', Law, Innovation and Technology, 2, pp. 1-25.

IEEE for the essay: Joseph A. Cannataci (2010), 'Squaring the Circle of Smart Surveillance and Privacy', Fourth International Conference on Digital Society, pp. 323-28. Copyright © 2010 IEEE.


Journal of Law, Technology & Policy, University of Illinois College of Law, for the essay: Daniel L. Pieringer (2012), 'There's No App for That: Protecting Users from Mobile Service Providers and Developers of Location-Based Applications', University of Illinois Journal of Law, Technology & Policy, 2012, pp. 559-77.

Paul Maharg, editor of European Journal of Law and Technology, for the essay: Shara Monteleone (2012), 'Privacy and Data Protection at the Time of Facial Recognition: Towards a New Right to Digital Identity?', European Journal of Law and Technology, 3(3).

Helen Nissenbaum for the essay: Helen Nissenbaum (2011), 'A Contextual Approach to Privacy Online', Dædalus, the Journal of the American Academy of Arts & Sciences, 140, pp. 32-48. Copyright © 2011 by Helen Nissenbaum.

Notre Dame Journal of Law, Ethics and Public Policy and Oscar H. Gandy, Jr for the essay: Oscar H. Gandy, Jr (2000), 'Exploring Identity and Identification in Cyberspace', Notre Dame Journal of Law, Ethics and Public Policy, 14, pp. 1085-111.

Oxford University Press for the essay: Lee A. Bygrave (1998), 'Data Protection Pursuant to the Right to Privacy in Human Rights Treaties', International Journal of Law and Information Technology, 6, pp. 247-84. By permission of Oxford University Press.

SAGE for the essays: Ian Loader (1999), 'Consumer Culture and the Commodification of Policing and Security', Sociology, 33, pp. 373-92. Reprinted by permission of SAGE; Vincenzo Pavone and Sara Degli Esposti (2012), 'Public Assessment of New Surveillance-Oriented Security Technologies: Beyond the Trade-Off between Privacy and Security', Public Understanding of Science, 21, pp. 556-72. Copyright © 2010 Vincenzo Pavone and Sara Degli Esposti. Reprinted by permission of SAGE.

Springer for the essay: L. Jean Camp (2002), 'Designing for Trust', in R. Falcone, S. Barber, L. Korba and M. Singh (eds), Trust, Reputation, and Security: Theories and Practice, Berlin–Heidelberg–New York: Springer-Verlag, pp. 15-29. Copyright © 2003 Springer-Verlag Berlin Heidelberg. With kind permission of Springer Science+Business Media.

Surveillance Studies Network for the essays: Oliver Leistert (2012), 'Resistance against Cyber-Surveillance within Social Movements and How Surveillance Adapts', Surveillance & Society, 9, pp. 441-56. Copyright © 2012 Oliver Leistert; William Webster (2009), 'CCTV Policy in the UK: Reconsidering the Evidence Base', Surveillance & Society, 6, pp. 10-22. Copyright © 2009 William Webster.

Taylor & Francis Group for the essays: Roger Clarke (1994), 'The Digital Persona and Its Application to Data Surveillance', The Information Society, 10, pp. 77-92. Copyright © 1994 Taylor & Francis. Reproduced by permission of Taylor & Francis Group LLC (http://www.tandfonline.com); Torin Monahan (2006), 'Counter-Surveillance as Political Intervention?', Social Semiotics, 16, pp. 515-34. Copyright © 2006 Taylor & Francis Ltd. Reproduced by permission of Taylor & Francis Group LLC (http://www.tandfonline.com); Jennifer Whitson and Kevin D. Haggerty (2007), 'Stolen Identities', Criminal Justice Matters, 68, pp. 39-40. Copyright © 2007 Centre for Crime and Justice Studies and Taylor & Francis. Reproduced by permission of Taylor & Francis Group LLC (http://www.tandfonline.com), on behalf of the Centre for Crime and Justice Studies; Barrie Sheldon (2011), 'Camera Surveillance within the UK: Enhancing Public Safety or a Social Threat?', International Review of Law, Computers & Technology, 25, pp. 193-203. Copyright © 2011 Taylor & Francis. Reproduced by permission of Taylor & Francis Group LLC (http://www.tandfonline.com).

The University of Chicago Press for the essays: Julie E. Cohen (2008), 'Privacy, Visibility, Transparency, and Exposure', University of Chicago Law Review, 75, pp. 181-201; Richard A. Posner (2008), 'Privacy, Surveillance, and Law', University of Chicago Law Review, 75, pp. 245-60.

John Wiley & Sons Ltd for the essays: Kevin D. Haggerty and Richard V. Ericson (2000), 'The Surveillant Assemblage', The British Journal of Sociology, 51, pp. 605-22. Copyright © 2000 London School of Economics and Political Science. Published by Routledge Journals, Taylor & Francis Ltd on behalf of the LSE; Torin Monahan (2011), 'Surveillance as Cultural Practice', The Sociological Quarterly: Official Journal of the Midwest Sociological Society, 52, pp. 495-508. Copyright © 2011 Midwest Sociological Society; Walter Peissl (2003), 'Surveillance and Security: A Dodgy Relationship', Journal of Contingencies and Crisis Management, 11, pp. 19-24. Copyright © 2003 Blackwell Publishing Ltd.

Every effort has been made to trace all the copyright holders, but if any have been inadvertently overlooked the publishers will be pleased to make the necessary arrangement at the first opportunity.

Publisher's Note

The material in this volume has been reproduced using the facsimile method. This means we can retain the original pagination to facilitate easy and correct citation of the original essays. It also explains the variety of typefaces, page layouts and numbering.


Series Preface

It was a pleasure to be asked to produce this series of essays, following in the footsteps of Eric Barendt's Privacy collection (Ashgate, 2001). Barendt had focused on the philosophical aspects of privacy at a time when academic interest in privacy was beginning to develop more seriously, and his chosen essays had been useful to both me and my students as we studied what 'right to privacy' the individual might have in the networked world. That collection enabled me to see quickly the main themes delineating privacy and also push students towards quickly grasping these themes. Over the past decade or so, the research field has exploded and a much more cross-disciplinary approach is needed to understand better current trends and responses by the academic community.

This series of volumes thus moves into - perhaps - a less philosophical approach about the individual and a more 'ethical' one, as society attempts to determine what role privacy should have and how regulation might be enabled, whether through law, technology or social norm. The new context is that there is now no technical limitation on how privacy might be undermined: both the state and commerce have tools and techniques to know more about any individual than they know about themselves, whether through the daily collection of everyday data or through targeting of individuals or populations. It is clear that if there is now no technical constraint to intrusion, the current debate must be over the ethics of privacy: what should society consider to be 'right behaviour' (in the moral sense) in a world where no-one appears to agree what that behaviour should be or where the moral lines should be drawn. There is, of course, no real help given by Conventions such as the European Convention on Human Rights (ECHR) - Art. 8 and Art. 10 tell us only that we have a right to privacy and also a right to know about others, two abstract rights which clearly conflict. The much vaunted 'right to be left alone' hardly helps us understand privacy in the modern world either. The debate is now over how to construct more detailed rights and the ethical rationale for these constructions. Before this can be done, we must also understand the complexity of the concept of 'privacy'.

My colleagues in this project, Joseph Cannataci and Joseph Savirimuthu, have aided me enormously by broadening the series' vision of what privacy is. Our goal has not been to present a collection which follows our own views on the ethical choices around the regulation of privacy (each of us, it seems to me, has a different perspective anyway). What they have done is to help to disentangle the strands which are lumped together under the rubric of privacy, and provide the reader with a means to approach these strands. We have done so by taking a decidedly multi-disciplinary approach.

Cannataci's volume, The Individual and Privacy, looks at privacy from a principally anthropological stance. What has been meant by privacy in the past? What has privacy meant in the various parts of the globe, each with their own culture? What is the nature of an individual's right as against the community? The reader could hardly leave Cannataci's volume without agreeing with his assertion that privacy is a complex multi-faceted matter. Understanding of that fact means that our proposed ethical solutions must match the complexity of the problem.


Savirimuthu, in Security and Privacy, deals with that strand of privacy related to the state - that of surveillance. A state has obligations to protect the individual from others but also obligations to respect the rights of the individual, all within a framework where the state is the most powerful actor. It was, after all, state intrusion which had brought Art. 8 ECHR into being. Yet any state now has more powerful techniques for overseeing the citizen than the Nazi or Soviet states ever had. Can we implement, through law, an ethical framework in which we can trust the state to behave responsibly? Savirimuthu's chosen essays focus on whether and how regulation might be possible.

My own collection, Privacy in the Information Society, looks at the conflict where the attempt to build an 'information economy' meets the attempt to protect privacy. The very notion of 'information economy' leads us to understand that there is value in information of all kinds - celebrity private lives, customer databases, user-provided information to social media, email contents, health data, etc. Presuming that an information economy is a 'good thing' (certainly most countries wish to develop this), does this mean that privacy is no longer possible? If it is, how might we set up the positive rights and responsibilities to match our expectations?

We had much debate when we first got together on this project as to how we might structure the collections. Hopefully the reader will find that our chosen approach is useful. We also had debate about which essays might appear and where, a problem since although the titles of each collection differ, we are really interested in the same complex issue. We also certainly each felt that we could have chosen two or three times as many essays, but hopefully - again - the reader will not be disappointed with those upon which we did eventually rest.

PHILIP LEITH
Series Editor
Queen's University Belfast, UK


Introduction

Google Glass, WikiLeaks, PRISM, facial recognition technologies, health and bio databanks and smart meters - these are some of the technologies that define convergence in a highly connected and networked environment. They also represent the context for ongoing security and privacy concerns. At one level, the essays in this volume attest to the particular attributes of the Internet infrastructure and technologies, where the meme 'information wants to be free' is prized as a crucial value. However, at another level, as we transition into an increasingly converged environment of ubiquitous computing, augmented reality and the Internet of Things, there is a real need to understand how society and its institutions cope with growing demands that erosions to privacy and threats to personal data be addressed.

The EU Barometer Study indicated that many individuals in society did not have confidence in industry or governments to respect their privacy. This is a sentiment not unique to citizens living in the EU. It is not that this is the effect of living in a risk society or that we are particularly sceptical about claims made by governments and industry that our individual privacy rights will be respected. It may be that we have very little confidence in the institutions that are meant to regulate the way personal data is collected, processed and used. One does not need to be a privacy scholar to notice that resolving the privacy paradox is far from straightforward: governments need access to personal data to fulfil some of their public roles, there is a gradual blurring of the on-line/off-line space, and convergence is redefining the way we express our choices, identities and values. There are so many questions that remain unanswered, and these can be approached at various levels.

During the last decade in particular, the levels of critical engagement with the challenges new technologies pose for privacy have been on the rise. Many have continued to explore the big themes in a manner that typifies the complex interplay between privacy, identity, security and surveillance. This level of engagement is both welcome and timely, particularly in a climate of growing public mistrust of state surveillance activities and business predisposition to monetize information relating to the on-line activities of users. This volume is very much informed by the range of discussions being conducted at scholarly and policy levels. The essays illustrate the value of viewing privacy concerns not only in terms of the means by which information is communicated but also in terms of the political processes that are inevitably engaged and the institutional, regulatory and cultural contexts within which meanings regarding identity and security are constituted.

Privacy scholarship has dealt with topics posed by emerging technologies and addressed a number of questions and issues raised as a consequence. In the next four parts a snapshot will be provided of topics and issues that can provide a springboard for further research and studies. A caution should be noted at the outset: privacy, identity and security are multidimensional. The categories chosen are not exhaustive or determinative, since privacy concerns often overlap and multiple perspectives can be presented to offer different insights. The diversity in the narratives chosen for this volume serves as a reminder that various policy frames could be used to address core privacy concerns such as identity and security.


Identity, Security and Privacy in Context

Rather than rehearse the foundations of concepts such as identity and security, the approach adopted here is to allow the authors of the selected essays to introduce their perspectives and emerging issues. The dilemma for L. Jean Camp in 'Designing for Trust' (Chapter 1) is not so much to do with ascertaining the nature of privacy but the steps that must be taken to bridge the trust deficit. Camp draws on her considerable experience in computer science and social sciences to set the context for the way we ought to think about concepts such as identity and security. She moves away from orthodox treatments of the privacy dilemma and directs attention towards addressing the trust deficit that undermines users' confidence in networks and information systems. What does trust have to do with the way individuals manage the security and integrity of personal information and on-line activities? Camp adopts a techno-anthropological approach and shows that trust norms lie at the intersection of privacy, security and reliability. In the context of the privacy debate, Camp urges policy-makers and designers to operationalize trust by taking account of user perceptions of privacy and opportunities for designing confidence-enhancing tools at user and network level. There are two policy implications to be noted: first, the governance strategy in formulating, implementing and embedding trust-enhancing designs in networks and communication platforms; second, the emphasis placed on the value of design solutions that accommodate human-centric needs, values and perceptions. Both raise valid governance challenges given consumers' concerns about identity theft and security lapses regarding the storage of personal data by data controllers.

Let us pull back briefly. It is useful to recall some of the ways digital technologies enable information about individuals to be collected, processed and analysed. Designing for trust also ensures, among other things, that individuals can continue to define their identities and values. Privacy enables individuals to exercise their freedoms and develop the 'self'. This should, rightly, extend to the right to create our digital identities. IT disrupts this norm. Roger Clarke, a renowned specialist in the strategic and policy aspects of IT, surveillance and privacy, introduces the concept of a digital persona in 'The Digital Persona and Its Application to Data Surveillance' (Chapter 2). This foreshadows the European Commission's proposal for a right to be forgotten. The digital persona concept helps us frame the concerns and issues posed by the dynamics of the networked environment, particularly in acting as a catalyst for the formation of personas. Clarke stresses the need for policy-makers to undertake a critical assessment of the way IT processes can make inroads into fundamental freedoms, often without transparency or accountability. It is this lack of democratic oversight that concerns Clarke, which he describes as capable of undermining the essence of human flourishing. The surveillance of individuals through their digital footprints is one manifestation of the normalization of surveillance - a common feature, perhaps, of the concerns surrounding the pervasive nature of data surveillance via on-line profiling, behavioural targeting and information sharing. The next three essays pick up the recurring themes of transparency, visibility and accountability.
Privacy lawyers have long explored these themes in privacy debates; scholars increasingly turn to other disciplines to generate critical policy perspectives. In 'Privacy, Visibility, Transparency, and Exposure' (Chapter 3) Julie E. Cohen draws inspiration from philosophers such as Langdon Winner, and urges us not to overlook the political character of networked technologies and the values and interests these perpetuate. In real space, privacy norms aim to create a space which enables individuals to control who, what and how information about them is accessed or made visible. According to Cohen, we need to reconceptualize risks and harms to privacy, since erosions can be incremental, passive and often go unnoticed. Furthermore, calling for privacy policies that promote informational transparency, she concludes, does not bring to an end the risks to an individual's 'right to be left alone'. Cohen argues that any move towards framing privacy rules and governance mechanisms must bring into the equation both spatial and informational dimensions. Cohen is right to point to the spatial dimension in privacy, since convergence and mobile technologies now blur contexts. We can infer here that the experienced space is an integral aspect of privacy and may not necessarily fit into ordinary conceptions of 'public' and 'private' spaces. What Cohen carefully brings to the foreground is the possibility that, in the experienced space, individuals should continue to determine the circumstances when their identities become visible to others.

Data mining is not simply about the collection of personal data with the aim of formalizing consumers' identities and refining their choices, preferences and values. A different set of questions is generated if we consider the significance of the exponential growth in the data mining industry for the junctures between the social dynamics of power and the new wealth of the digital economy - personal data. Oscar Gandy, 'Exploring Identity and Identification in Cyberspace' (Chapter 4), views the emergence of new technologies as normalizing discrimination and as designed to avoid regulatory scrutiny and accountability. This essay is a careful study of the ethics of surveillance and highlights approaches that enable privacy harms to be anticipated. Gandy suggests that we scrutinize the values and preferences hidden in data aggregation and profiling practices. He urges regulators to problematize data mining and surveillance activities so that a proper privacy impact assessment can be made between the risks and the benefits of surveillance and monitoring. Gandy's essay also reignites ongoing concerns that policy-makers often fall short of grappling with the central problem resulting from the use of panoptic surveillance tools by industry, namely, the creation of a hierarchical structure of relations, and the automation and normalization of surveillance. His reference to the ethics of surveillance is apt: given the scale of data mining activity now taking place, it is imperative that policy-makers and industry take the lead in contending with the privacy concerns associated with digital curation (Marx, 1998, p. 174). One suggestion Gandy offers, and which is now emerging as a legitimate policy response, is that data controllers be required to make clear how they balance their legitimate economic and security interests with the expectation of individuals that their civil liberties are not compromised.

The final essay in this part serves to remind us that privacy is about managing trust and expectations in social relations. Helen Nissenbaum, 'A Contextual Approach to Privacy Online' (Chapter 5), brings her philosophical and computer science expertise to bear to illustrate the benefits of focusing on the context in which privacy concerns emerge. Do we need specific privacy rules for the on-line environment?
One shortcoming in drafting a set of privacy rules particularly for the on-line environment is that we may end up losing sight of the raison d'être of privacy principles and norms - which is to provide individuals with adequate safeguards for their personal information and privacy. Nissenbaum stresses that emphasizing the distinctiveness of on-line and off-line privacy is the wrong way of addressing the privacy challenges in contemporary society. She suggests that we should concentrate on the subjects of privacy protections - individuals. She has a point. New communication tools and social media disrupt the way we have traditionally managed our identities and security. Nissenbaum envisages that all stakeholders have an important role in brokering spaces for meaningful social and economic activity while being alert to the value and significance of informed consent. Since many individuals have social media accounts, we could, like Nissenbaum, consider how these technologies disrupt social norms such as preserving user privacy in on-line social interactions and obtaining informed consent before accessing and processing personal information.

Surveillance, Security and Anonymity

On 17 July 2013 the Chairman of the Intelligence and Security Committee of Parliament, the Rt Hon Sir Malcolm Rifkind MP, issued a statement regarding GCHQ's alleged interception of communications under the US PRISM programme (see Intelligence and Security Committee of Parliament, 2013a, 2013b). The statement was intended to reassure the general public that GCHQ had not circumvented or attempted to circumvent UK law by using the United States' National Security Agency's PRISM programme to access the content of its citizens' private communications. Should we provide governments with some latitude in the way personal information or communications data are accessed to ensure our safety? Under what circumstances should an individual's right to privacy or anonymity be subordinated to the safety interests of the broader community? On the face of it, there are arguments for and against placing trust in security and intelligence agencies. As the recent cause célèbre involving the US National Security Agency's intelligence-gathering activities highlights, some scepticism is no doubt healthy in a democracy.

A starting point to help frame the essays in this part would be to note the default position as stated in Article 8 of the European Convention on Human Rights (ECHR). It provides that 'Everyone has the right to respect for his private and family life, his home and his correspondence', and that:

What is the intersection between surveillance, secrecy and anonymity? Unregulated access to communications data can provide information about the individual's private life, which includes identity, location, interactions and interests. Secrecy and anonymity have undoubted social value - anonymity allows us to vote freely or express our thoughts without fear of reprisal. Secrecy allows intimate relations and communications to be conducted away from the public gaze. With regard to surveillance, Article 8 ECHR is directly relevant to the subject of privacy since all forms of covert monitoring and data gathering activities potentially create an imbalance in the relationship between the individual and the body undertaking the monitoring (Haggerty and Samatas, 20 I 0). In the Case of Liberty and Others v. The United Kingdom Application no. 58243/00 (2008), the European Court of Human Rights indicated that a state's ability to enact legislation, which enabled it to undertake secret monitoring of citizens'

Downloaded by [University of California, San Diego] at 07:06 21 April 2017

Security and Privacy

xix

communications had to be circumscribed by clear safeguards to fundamental freedoms. 1 The democratization of innovation and technology has now made available another layer of privacy safeguards. Individuals can avail themselves of technological tools to preserve secrecy and anonymity. Instant messaging, cryptography and anonymous browsers are some of the tools that enable individuals to engage in social communications without fear of monitoring. The human rights provision also acknowledges that surveillance and other forms of monitoring can augment democratic values by creating an environment where citizens can freely engage in their daily activities without fear to their safety and well-being (Lyon, 200 I). The essays chosen in this part are only a sample of the burgeoning literature on surveillance. Before examining the ideas underpinning countersurveillance, we can briefly tum to a working definition of surveillance, which focuses on the collection and processing of personal data for making decisions. The fundamental objection to surveillance is the creation of asymmetrical power relations. The power relations can exist as between the state and its citizens or between organizations and individuals. The mischief, anti-surveillance and civil libertarian scholars highlight is that indiscriminate collection of personal information can lead to political misuse (for example, censorship and propaganda) or corporate manipulation of end users' choices and preferences (for example, on-line profiling and behavioural targeting). New technologies have been used by commerce and industry to profile individuals for behavioural advertising or sale of goods and services. There is a body of literature that explores surveillance from a different vantage point. A number of scholars regard technologies such as CCTV cameras, facial recognition technologies and the Internet as ushering in a new aesthetic to surveillance. The focus on the role and potential of resistance to destabilize the power relations surveillance aims to establish should not be overstated. Two essays advocate a cautious stance. Benoit Dupont's essay, 'Hacking the Panopticon: Distributed Online Surveillance and Resistance' (Chapter 6), questions the uncritical transposition of the panopticon metaphor to the Internet and digital technologies. He suggests that two trends in the evolution of new technologies and distributed architecture of the Internet have significant consequences for institutionalizing power relations. First, surveillance technologies are now readily available. Their ready availability Dupont (Chapter 6) suggests has resulted in the '"democratization of surveillance"' (p. 106). Second, these technologies can be used to counter surveillance activities. Kevin D. Haggerty and Richard V. Ericson advocate a different approach in 'The Surveillant Assemblage' (Chapter 7). They offer the concept of the 'surveillant assemblage' as a rhetorical frame to visualize the role of IT in deconstructing individuals and aggregating the information into depersonalized data and profiles. Haggerty and Ericson adapt the heuristic employed by De leuze and Guattari ( 1987) in A Thousand Plateaus. It may be recalled that Deleuze and Guattari viewed the human face, ontologically, as an abstract machine. They suggested that the constitution of the face into identities and categories is in essence an assemblage of power. 
Haggerty and Ericson (Chapter 7) regard a similar assemblage as taking place when surveillance technologies collate and aggregate data to profile, create new knowledge and define particular identities. They conclude that 'we are witnessing a rhizomatic leveling of the hierarchy of surveillance, such that groups which were previously exempt from routine surveillance are now increasingly being monitored' (p. 128). See also Articles 17-19 of the International Covenant on Civil and Political Rights. Regulatory oversight mechanisms in the United Kingdom include the Regulatory Investigatory Power Act 2000.


The ubiquity of surveillance technologies and their ready accessibility mean that all of us, and not simply those in power or authority, have the tools to monitor people and events. Voyeurism, exhibitionism and spying seem unexceptional in the networked environment. We appear to be living in an era of liquid surveillance where the previous boundaries between the watchers and the watched, on the one hand, and surveillance and resistance, on the other, are ill-defined. Many hardly register any concerns when tagging material on social networking sites, searching through user profiles and updating timelines. Steve Mann, '"Sousveillance": Inverse Surveillance in Multimedia Imaging' (Chapter 8), focuses on the aesthetic of sousveillance to provide a counterpoint to orthodox perceptions of surveillance. Sousveillance, it should be remembered, is not countersurveillance. He suggests that individuals, by wearing small wearable or portable personal recording technologies in their daily lives, can gain new perspectives about society and its institutions. Sousveillance renders the external environment and institutions objects of veillance. His well-documented life logging of personal experiences creates vivid images of surveillance practices and public reactions to sousveillance. Mann's recounting of his personal experiences reinforces the well-known claim that technologies are culturally situated and may at times bring with them their own problems.

Surveillance systems cannot be disassociated from cultural practices and symbols. In short, surveillance does not exist in a vacuum. This is the focus of Torin Monahan in 'Surveillance as Cultural Practice' (Chapter 9) as he considers how media, art and film narratives provide researchers with new avenues for studying surveillance. Surveillance, when understood within a socio-technological constructivist frame, he argues, opens up new avenues for exploring this sphere of culture and increases our consciousness about its politics. Monahan illustrates some of the situations where meanings, knowledge and experiences of individuals' engagement with surveillance can even emerge at the localized level. Surveillance tools such as loyalty cards and social networking affordances could be regarded as instances of individuals appropriating such tokens regardless of the purposes for which they were originally made publicly available.

How do we reconcile sousveillance or even surveillance as forms of cultural practice within the risk society? One hallmark of a risk society is society's desire that threats to its safety are minimized, if not eliminated. Our politicians may have tapped into the public's deep-seated psyche when justifying the installation of CCTV cameras in public spaces and the use of body scanners in airports. The risk society is also a security-conscious society with maturing surveillance tools. What are the trade-offs? Are security tools nothing more than surveillance creep? Walter Peissl, 'Surveillance and Security: A Dodgy Relationship' (Chapter 10), challenges the received political rhetoric that more surveillance will enhance the security of citizens. Mainstreaming security, he suggests, does not invariably lead to greater public safety but will lead to the emergence of a panoptic society where surveillance becomes normalized. There is a subtle point being made by Peissl. Security has become a major part of political and social discourse, and there is a perception that the resulting 'moral panic' has generated increased public surveillance of citizens without a corresponding reduction in fears for safety.2 Many will agree that an uncritical acceptance of these technologies will only embed asymmetric hierarchical power relations. These power relations do not arise only between the state and its citizens.

2 See, for example, Schedule 7 of the Terrorism Act 2000.


Increasingly, the state and its law enforcement agencies have turned to intermediaries to act as proxies for targeted surveillance activities. Is there an appropriate response to the emerging 'intelligence-industrial' complex? In 'Counter-Surveillance as Political Intervention?' (Chapter 11) Torin Monahan undertakes a review of countersurveillance strategies such as activist demonstrations and artistic displays of resistance, but fears that, despite the short-term symbolic gains in disrupting institutional mechanisms of power and control, such interventions may be counterproductive. Often institutions and agencies use these temporary displays of resistance to remove the inefficiencies in the mechanisms of control. Oliver Leistert, 'Resistance against Cyber-Surveillance within Social Movements and How Surveillance Adapts' (Chapter 12), offers a nuanced analysis. He focuses on the intelligence-gathering practices of both the watchers and the watched. He suggests that, in the age of the Internet and mobile communications, both parties paradoxically utilize telecommunications infrastructures to engage in monitoring activities while preserving their anonymity.

Not all lawyers take the view that privacy values are being subordinated to surveillance. Richard A. Posner, 'Privacy, Surveillance, and Law' (Chapter 13), argues that in the light of increased threats to society posed by terrorists and other criminals, a firm stance needs to be taken by the state and its law enforcement agencies. He asks whether those who claim that surveillance harms society are able to quantify the costs and benefits of these measures. He suggests that privacy concerns may either be exaggerated or taken out of context.

As efforts are made to develop a model that better calibrates the competing public policy interests, Joseph A. Cannataci, 'Squaring the Circle of Smart Surveillance and Privacy' (Chapter 14), regards the trend towards investing in smart surveillance as worrying. There is a sound basis for his concern. One report forecasts that the market for surveillance tags is likely to see an increase (see ReportsnReports, 2013). It is not simply the drivers and scale of the market for smart surveillance that concern privacy advocates. Smart surveillance not only occupies a broad landscape comprising both the technologies and the types of data used to gather, assemble and analyse personal data - it is also automated. The trend towards assuaging a risk-averse culture with distributed intelligent surveillance systems needs to be balanced with concrete privacy safeguards.

Privacy, Data Protection and Security

Data protection laws provide an important framework regulating the processing of an individual's personal data. Privacy is an important concern when personal data are processed. New communication technologies create opportunities to safeguard communities and citizens, but they also threaten to undermine privacy protections. In the EU, national courts and data protection authorities are not the only entities ensuring that data controllers comply with data protection laws. The Court of Justice of the European Union (CJEU) and the European Court of Human Rights have played an important role in providing invaluable case law from their interpretation of human rights provisions such as the EU Charter of Fundamental Rights and the European Convention on Human Rights (ECHR) respectively. Human rights and privacy lawyers have increasingly been interested in examining whether data protection and human rights provisions provide an adequate response to privacy concerns. Article 8 of the EU's Charter of Fundamental Rights states that data protection is a fundamental right. The respect for private and family life is regarded as a separate right in Article 7.

Lee A. Bygrave, 'Data Protection Pursuant to the Right to Privacy in Human Rights Treaties' (Chapter 15), looks at the emerging jurisprudence on Article 17 of the International Covenant on Civil and Political Rights and Article 8 of the European Convention for the Protection of Human Rights and Fundamental Freedoms. He explains how these treaties interact with each other and stresses that these provisions have sufficient flexibility to integrate data protection principles into human rights treaties. It may be mentioned here that, under the European Commission's proposed reform to Directive 95/46/EC, the protection of personal data is to be regarded as a fundamental right.3 Some circumspection is warranted. Observers have noted that the proposed regulation does not make specific reference to the importance of member states in protecting individuals' right to privacy with respect to the processing of personal data (Article 1(1) of Directive 95/46/EC). The protection of personal data under Directive 95/46/EC and privacy are not interchangeable concepts. How national courts and the European Court of Human Rights handle this aspect concerns privacy lawyers.

One policy area where the courts may have an important role to play in clarifying whether these new proposals should be read as incorporating individuals' right to privacy relates to the privatization of the security market. Increasingly, with the expansion of the security market in the private sector, questions have been raised regarding the privacy safeguards for individuals when private firms discharge public security functions. Ian Loader's contribution, 'Consumer Culture and the Commodification of Policing and Security' (Chapter 16), examines the implications of the overlap in functions. He argues that security is a public, not a private, good. The crossover of security from public to private domains, he cautions, has transformed security into a commodity. The blurring of the distinction between state and non-state security actors, he observes, could lead to policy responses being shaped by market rules and norms, and undermines the role of the state as a provider of security for the public good.

Opinions vary on how the knowledge base of personal information is to be managed. There is another approach or conceptual frame that may be of assistance in the way we unbundle the risks of personal data straddling private and public sector domains. Roger Clarke, 'Information Technology and Dataveillance' (Chapter 17), provides us with a timely introduction to the concept of 'dataveillance'. He describes this form of processing activity as involving the systematic monitoring of individuals' actions or communications through information technology. Clarke's concerns about the centralization of monitoring activity and the reactive nature of privacy and consumer protection laws are well founded. Increased access to the Internet, on-line social media and mobile communications, he notes, not only leads to an exponential increase in communications data but also provides a fertile landscape for dataveillance.

3 COM(2012) 11 final.
While these technologies can provide consumers and individuals with considerable benefits, the seamless nature of networks and communication platforms creates new opportunities for industry and the state's intelligence agencies to assert their control by collating data from multiple sources. In 'Public Assessment of New Surveillance-Oriented Security Technologies: Beyond the Trade-Off between Privacy and Security' (Chapter 18) Vincenzo Pavone and Sara Degli Esposti deal with the question of how individuals respond to security-enhancing technologies that also bring them inherent risks to privacy. Privacy management is a pragmatic enterprise influenced by the tools, information, choices and preferences at the disposal of individuals. Their empirical study underlines some of the observations raised by Camp and Nissenbaum. The essay also sheds additional light on the mosaic of reactions of individuals to surveillance-oriented security technologies. Contexts matter, and the caricature of such technologies as invariably involving a trade-off is held to be simplistic. For example, during times of intense security tensions and risks, citizens may regard the political response in developing monitoring processes as necessary and may not view privacy as an exchangeable good. However, in other contexts their attitudes to the deployment of surveillance technologies may depend on variables such as personal, economic and social factors, which cannot be exchanged.

The next two essays look at cloud computing. Should we avoid US-based cloud services providers given that EU data protection laws provide far greater safeguards for individuals' privacy? John T. Billings' essay, 'European Protectionism in Cloud Computing: Addressing Concerns over the PATRIOT Act' (Chapter 19), provides a critical examination of United States and European Union law. He concludes that one must not attach too much significance to the distinction between US- and non-US-based cloud services providers, since US jurisdictional rules and mutual legal assistance treaties permit access to EU consumer data.4 The distinction nevertheless is still important. The European Data Protection Supervisor Peter Hustinx has published an Opinion on the European Commission's Communication on cloud computing, which highlights the importance of clarifying the responsibilities and obligations of all parties involved in cloud computing in the context of Directive 95/46/EC and the proposed General Data Protection Regulation.5

New technologies allow data to be used to promote innovation or realize outcomes which benefit society. For example, geospatial data could be harnessed for critical policy responses to situations such as disaster management, monitoring of environmental conditions and tracking of infectious diseases. Geospatial data could also be used for purposes that subordinate the privacy interests of individuals to business or political goals. Complex issues arise when geospatial data include personal data. The collection of personal data can provide opportunities for innovation but can also result in curbing competition (see also Almunia, 2012). Lisa Madelon Campbell's essay, 'Internet Intermediaries, Cloud Computing and Geospatial Data: How Competition and Privacy Converge in the Mobile Environment' (Chapter 20), explores the legal effects of the convergence between competition and privacy law issues on innovation and competition. Finally, in 'Stolen Identities' (Chapter 21), Jennifer Whitson and Kevin D. Haggerty argue that companies' zest for customer data and the huge growth in e-commerce are exacerbating the problem of identity theft. In the process, informational security measures are poised to become more elaborate and intrusive as they simultaneously reproduce the institutional reliance on personal information that has ultimately made identity theft possible.

3 COM(2012) 11 final.
4 Article 29 Working Party Opinion 05/2012 on Cloud Computing, clarifying the rules in the cloud computing context. Reference should now be made to the document, 'Clarifications Regarding the U.S.-EU Safe Harbor Framework and Cloud Computing', issued by the Department of Commerce's International Trade Administration regarding the transfer of personal data from the European Union to the United States, at: http://export.gov/static/Safe%20Harbor%20and%20Cloud%20Computing%20Clarification_April%2012%202013_Latest_eg_main_060351.pdf. Information on the European Commission strategy on cloud computing, entitled 'Unleashing the Potential of Cloud Computing in Europe', can be found at: http://ec.europa.eu/digital-agenda/en/european-cloud-computing-strategy.
5 At: http://www.edps.europa.eu/EDPSWEB/webdav/site/mySite/shared/Documents/Consultation/Opinions/2012/12-11-16_Cloud_Computing_EN.pdf.


Smart Technologies, Social Control and Human Rights

An aspect of managing risks to individual security concerns the way IT and new technologies are employed to maintain social control. Smart technologies can, almost by definition, normalize surveillance. An important privacy impact assessment issue is whether living under the overarching umbrella of surveillance technologies will lead to more proactive social control activities emerging as a consequence. Privacy scholars are already beginning to come to grips with smart technologies - radio-frequency identification (RFID), ambient intelligence, smart meters, sensor technologies, augmented reality and so on. Social control potentially becomes pervasive and invisible, since interests and power relations are embedded in software algorithms; software algorithms automate control. How will we wrestle with the uncertainties smart technologies create for our privacy? We may find some clues in the following essays. In 'The Body as Data? Biobank Regulation via the "Back Door" of Data Protection Law' (Chapter 22) Lee A. Bygrave warns of the dangers of conceptual seepage between data and information and suggests that we must resist our continued reliance on data protection law. William Webster, 'CCTV Policy in the UK: Reconsidering the Evidence Base' (Chapter 23), notes that insights can be derived from adopting a policy perspective on the adoption of particular technologies. The essay posits that a 'policy perspective' approach to understanding the CCTV revolution is illuminating as it highlights the complex, intertwined interactions between government, policy-makers, the media and other stakeholders, and that CCTV does not necessarily have to 'work' if it meets other purposes. Barrie Sheldon, 'Camera Surveillance within the UK: Enhancing Public Safety or a Social Threat?' (Chapter 24), suggests that we should base policies and government interventions in relation to CCTV cameras on empirical evidence. He questions whether there is evidence justifying their proliferation in public spaces on the assumption that the use of surveillance cameras significantly contributes to public safety and prevents crime and terrorist activity. A not dissimilar set of questions is raised in the remainder of the essays. Personalized health care policies, for example, hold out the prospect of placing patients at the centre of decision making and the delivery of health care services. Health care accounting may create new power elites. The Health and Social Care legislation may provide a blueprint for new social relations which may perpetuate asymmetrical power relations. At the moment the dilemma for the law is to create a robust technological infrastructure maintaining the security of health information. There is a role for policy-makers and regulators. Governments must be sensitive to concerns about privacy, security and the logistics of implementation. Compare and contrast these concerns, however, with those raised with regard to social networks. In the age of modernity, information is collected seamlessly. The networked terrain of social networking communication platforms surpasses the windowless cells in George Orwell's Ministry of Love and Bentham's involuntary penal servitude. Digital convergence creates a host of problems for individuals seeking to retain not only some control over the processing of their personal data but also over the way these same data are subsequently used to create profiles and inform decision-making.


Shara Monteleone argues in 'Privacy and Data Protection at the Time of Facial Recognition: Towards a New Right to Digital Identity?' (Chapter 25) that the current data protection frameworks are wanting in their ability to safeguard users' privacy. She uses the social networking platform Facebook to highlight the shortcomings in current privacy protection and calls for a policy that empowers users and, in particular, acknowledges the individual's right to digital identity. She suggests that without empowering individuals, industry will continue to utilize individuals' personal data with impunity. Jeffrey Rosen, 'The Deciders: The Future of Privacy and Free Speech in the Age of Facebook and Google' (Chapter 26), identifies the constitutional dilemmas placed on law. He argues that in the age of Google and Facebook, corporations should take the lead and adopt a version of enlightened stakeholder value that will neutralize the information pathologies of the digital age. Is there a third way? While law strives to respond to progress through appeals to Enlightenment ideals such as human reason and the quest for progress, affordances may help release us from our self-imposed immaturity. Donald A. Norman suggests in 'Affordance, Conventions, and Design' (Chapter 27) that if designers make affordances visible this may empower users and help them better constitute their social relations and manage the freedoms and opportunities technology makes possible. There is a deeper point pursued by Norman, which is the absence of a coherent and understandable conceptual model that informs design solutions. We can infer from the tenor of his observations that designers need to make transparent the necessity for security and make clear the choices open to users in managing their personal data. How does this relate to the health IT infrastructure? The design of centralized databases is motivated not least by the need to leverage exponential storage space to hold patient data and the opportunities this creates for promoting innovation and efficiency in the delivery of health care. This is a well-discussed area of public policy. In 'eHealth: Roadmap to Finding a Successful Cure for Privacy Issues' (Chapter 28) Deborah Peel provides an accessible account of the likely impact of digitalizing the health sector for patients. She draws on her considerable experience working with patients' rights organizations to find ways of harnessing the potential of health-related information for the benefit of individuals, communities and the health care industry. For Deborah, access to aggregated health information can assist planning and the pre-emptive targeting of resources to meet the needs of communities and vulnerable individuals. However, the electronic health information infrastructure needs to become more mainstream to enable economies of scale to be realized. More importantly, there is some scepticism about whether the security protocols are sufficiently robust, and ongoing concerns that health information may be used for purposes other than providing medical care or treatment. Deborah's essay provides us with an opportunity to reflect on how best we can design and deploy trusted electronic systems in a way that coheres with Article 8 ECHR while remaining cost-effective. Those who remember the 'National Programme for IT' in the NHS in 2002 will recall the massive expense the public-sector disaster imposed on the British taxpayer.
Furthermore, questions continue to be raised about the effectiveness of security protocols in preserving the confidentiality of patient data. The coherent conceptual model theme is pursued by Daniel L. Pieringer in 'There's No App for That: Protecting Users from Mobile Service Providers and Developers of Location-Based Applications' (Chapter 29). He argues that the proliferation of communication service providers, application developers and location-based services has found the law wanting. He fears that legislators, the IT industry and app developers tend to have different approaches to the intersection between usability, security and privacy. He suggests that these parties need to understand the end user better and adopt a comprehensive model that standardizes the philosophies, policies and design solutions. Will a Privacy Bill of Rights help steer us towards a coherent, effective and sustainable framework balancing the interests of all parties? Christopher Wolf, 'The Privacy Bill of Rights: What Are the Expectations for 2013?' (Chapter 30), is clearly optimistic and suggests that we will move towards this solution. He compares the current status quo in privacy law to the position in relation to environmental law. The Consumer Privacy Bill of Rights proposed by the United States administration is, he suggests, a move towards developing a conceptual model that makes explicit the values, norms and goals regarding the collection and use of personal data.

Conclusion

The essays in this volume illustrate the challenges posed by new technologies and the issues that require attention. Two goals are intended to be served. The first is to identify the role and value of trust in addressing the concern, felt by many, that the data protection framework is failing them. The second is to provide a snapshot of how we can begin to shape a 'privacy agenda' that can be reconciled with an environment that is diverse, rich and complex, so that it works for individuals, society, industry and governments. More broadly, the essays chosen are intended to provide a starting point for thinking about how issues relating to identity, surveillance and dependence on smart technologies challenge orthodox conceptions of autonomy, identity and privacy. Why adopt multidisciplinary coverage in this volume? Even though privacy laws and regulations have a critical role, smart technologies and the distributed mobile computing environment create an additional layer of complexity. Insights from criminology, sociology and anthropology can assist lawyers, scholars, activists, industry and policy-makers to embrace these perspectives when thinking creatively about developing democratic technological, regulatory and institutional responses. It should be clear from the selected essays in this volume that as new technologies become pervasive we need to think creatively about how we integrate orthodox conceptions of the public and private sphere - it cannot be right to state that we have 'zero privacy'. Nor, it should be added, can we insist on total anonymity. These are some of the issues currently occupying our policy-makers. For example, the European Parliament has recently called on member states to produce an acceptable outcome to the data protection package. The spread of information sharing between the public and private sector and the exponential rise in the use of surveillance technologies to collect personal information have resulted in the Home Office issuing a code of practice, following concerns expressed by the public and the media about the violation of citizens' privacy (Home Office, 2013). The City of London Corporation has asked a company, Renew London, to stop using recycling bins to track the smartphones of passers-by (Datoo, 2013). These examples illustrate the continued tensions between control and access encountered in privacy debates. Although there is very little agreement, for example, on how the concept of privacy is to be understood, the essays illustrate that individuals in society have become a little more sensitive to the contexts in which battles are fought over the ideological and policy arguments about which data processing and surveillance activities cohere with democratic ideals. A timely reminder


perhaps for the need to be cautious and circumspect about any uncritical acceptance of claims for policy fixes.


References

Almunia, J. (2012), 'Competition and Personal Data Protection', SPEECH/12/860, 26 November, at: http://europa.eu/rapid/press-release_SPEECH-12-860_en.htm.
Datoo, S. (2013), 'This Recycling Bin Is Following You', 8 August, at: http://qz.com/112873/this-recycling-bin-is-following-you/.
Deleuze, G. and Guattari, F. (1987), A Thousand Plateaus, Minneapolis, MN: University of Minnesota Press.
Haggerty, K.D. and Samatas, M. (eds) (2010), Surveillance and Democracy, London: Routledge.
Home Office (2013), Surveillance Camera Code of Practice, London: Stationery Office.
Intelligence and Security Committee of Parliament (2013a), 'Statement on GCHQ's Alleged Interception of Communications under the US PRISM Programme', at: http://isc.independent.gov.uk/files/20130717_ISC_statement_GCHQ.pdf.
Intelligence and Security Committee of Parliament (2013b), Report: Access to Communications Data by the Intelligence and Security Agencies, Cm. 8514, London: HMSO.
Lyon, D. (2001), Surveillance Society: Monitoring Everyday Life, Buckingham: Open University.
Marx, Gary T. (1998), 'Ethics for the New Surveillance', The Information Society, 14, pp. 171-85.
ReportsnReports (2013), 'Global Electronic Article Surveillance Tags Market 2012-2016', November, at: http://www.reportsnreports.com/reports/270724-global-electronic-article-surveillance-tags-market-2012-2016.html.


Part I Identity, Security and Privacy in Context


[1] Designing for Trust


L. Jean Camp
Associate Professor of Public Policy
Kennedy School of Government, Harvard University

[email protected]

Abstract. Designing for trust requires identification of the sometimes subtle trust assumptions embedded in systems. Defining trust as the intersection of privacy, security and reliability can simplify the identification of trust as embedded in a technical design. Yet while this definition simplifies, it also illuminates a sometimes overlooked problem. Because privacy is an element of trust, purely operational definitions of trust are inadequate for developing systems to enable humans to extend trust across the network. Privacy is both operational (in the sharing of data) and internal (based on user perception of privacy). Designing trust metrics for the next generation Internet, and indeed implementing designs that embed trust for any digital environment, requires an understanding of not only the technical nuances of security but also the human subtleties of trust perception. What is needed is a greater understanding of how individuals interact with computers with respect to the extension of trust, and how those extensions can be addressed by design.

1 Introduction

Trust is built into all systems, even those without security. Trust assumptions are included when data are collected, or coordination is enabled. Trust is embedded when resources are reserved (as shown by denial of service attacks). If trust is an element of all systems, what does it mean to design for trust? Trust is a complex word with multiple dimensions. There has been much work and progress on trust since the first crystallization of this concept. Combining the three-dimensional trust perspective with studies of humans, I conclude that a new approach to understanding and designing mechanisms for peer to peer trust is critically needed. The first section of this work gives a quick overview of the alternative perspectives on trust: rational trust exhibited through behavior and internal trust which cannot be directly observed. The second section revisits the definition of trust offered in Camp 2001, by considering privacy, security, and reliability. At the end of that second section is an examination of how trust has evolved in whois. Thus at the beginning of the third section there is a clearly defined concept of trust. Using that definition, the third section argues for a trust system that allows users to aggregate trust, make transitive trust decisions, and manage their own electronic domains. This leads to the conclusion - that current trust management systems are hampered by designing for computers rather than humans. Trust systems for the next generation Internet must be built on solid conceptions of human trust drawn from the social sciences.


2 Alternative Perspective on Trust

Multiple authors have offered distinct perspectives on trust. In this section the three-dimensional concept of trust is contrasted with other selected concepts of trust. Trust is a concept that crosses disciplines as well as domains, so the focus of the definition differs. There are two dominant definitions of trust: operational and internal. Operational definitions of trust require a party to make a rational decision based on knowledge of possible rewards for trusting and not trusting. Trust enables higher gains while distrust avoids potential loss. Risk aversion is a critical parameter in defining trust in operational terms. In game theory-based analyses of operational trust (e.g., Axelrod, 1994) competence is not at issue. A person is perfectly capable of implementing decisions made in a prisoner's dilemma without hiring a graduate of Carnegie Mellon or MIT. In the case of trust on the Internet, operational trust must include both evaluation of intent and competence. Particularly in the case of intent, the information available in an equivalent physical interaction is absent. Cultural as well as individual clues are difficult to discern on the Internet, as the face of most web pages is impersonal almost by definition. In the three-dimensional definition of trust, privacy, reliability, and security are based neither entirely on intention nor entirely on competence. Both good intent and technical competence are required to ensure security. The result for the user (fraudulent use of data, usually to charge services) from a failure in either intention or competence is the same. Thus an operational approach arguably supports a focus on the types of harms resulting from trust betrayed.1

One operational definition of trust is reliance (Goldberg, Hill and Shostack, 2001). In this case reliance is considered a result of belief in the integrity or authority of the party to be trusted. Reliance is based on the concept of mutual self-interest. In that way, reliance is built upon the assumption of human beings as homo economicus (Olson, 1965). Therefore the creation of trust requires structures to provide information about the trusted party to ensure that the self-interest of the trusted party is aligned with the interest of the trusting party. Reliance-based trust requires that the trusted party be motivated to ensure the security of the site and protect the privacy of the user. Under this conception the final placement of trust is illustrated by a willingness to share personal information. Another definition of trust, popular among social psychologists, assumes that trust is an internal state (e.g., Tyler, 1990; Fukuyama, 1999). From this perspective, trust is a state of belief in the motivations of others. The operational concept of trust is considered confidence. Based on this argument, social psychologists measure trust using structured interviews and surveys. The results of the interviews often illustrate that trust underlies exhibited behavior, finding high correlations between trust and a willingness to cooperate. Yet trust is not defined as, but rather correlated with, an exhibited willingness to cooperate. The difference between these perspectives is a difference in the conception of trust as a foundation for behavior rather than the behavior itself. To some degree this can be modeled operationally as the difference between perceived (e.g., internal sense of) versus measurable risk (statistical or deterministic) (e.g., Morgan et al., 2002). Is willingness to share information based on the risk of secondary use of information rather than a psychological sensitivity to information exposure? Consider the case of medical information. Risks in the United States include loss of employment or medical insurance. Risks in the United Kingdom include loss of employment. In both nations medical issues are considered private. An internalized definition of trust would assume roughly equivalent sensitivity to information exposure in both nations, assuming both had the same cultural sensitivity to medical privacy. An operational perspective would argue that medical privacy is more important in the US because the risks are greater.2 Yet should there be differences it would be impossible to distinguish exactly the elements of risk and the elements of culture that are the foundation of that risk. These definitions of trust will merge only when observed behavior can be explained by internal state. Yet without understanding trust behaviors, designs for enabling peer to peer trust over the digital network will be flawed.

1 Betrayal is used in operational definitions in part because to choose not to cooperate is always a function of intent. The same ill intent or moral implications are not appropriate in failures of technical competence; however, the word is still useful for the results of trust ill-placed.
2 This question is an element of the dissertation currently being completed by Sara Wilford at the Kennedy School (contact: [email protected]).
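Returning to the operational definition above, the rational-decision framing can be made concrete with a toy calculation. The sketch below is a minimal illustration under stated assumptions (the function, parameter names and numbers are invented for this example, not drawn from the essay): trust is extended when the expected gain from cooperation outweighs a risk-weighted expected loss from betrayal.

```python
# Toy model of the operational view of trust: extend trust when the
# expected gain from cooperation outweighs the risk-weighted expected
# loss from betrayal. All names and numbers are illustrative assumptions.

def extend_trust(p_honest: float, gain: float, loss: float,
                 risk_aversion: float = 1.0) -> bool:
    """Return True if a rational actor would extend trust.

    p_honest      -- estimated probability the other party cooperates
    gain          -- payoff if trust is honored
    loss          -- payoff forfeited if trust is betrayed
    risk_aversion -- weights potential loss; > 1 models a risk-averse actor
    """
    expected_gain = p_honest * gain
    expected_loss = (1.0 - p_honest) * loss * risk_aversion
    return expected_gain > expected_loss

# The same odds, but risk aversion flips the decision.
print(extend_trust(0.8, gain=10, loss=30))                      # True
print(extend_trust(0.8, gain=10, loss=30, risk_aversion=2.0))   # False
```

The point of the toy model is precisely its limitation: it captures the operational perspective but says nothing about the internal state that the social psychologists measure.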

3 The Three Dimensions of Trust: Privacy, Security, Reliability

The definition of trust offered in (Camp, 2000) is operational when privacy is ensured by anonymity. Absent that assurance, the definition of privacy inevitably included internal considerations. The earlier definition of trust as a function of privacy, security and reliability is operational. It is based on risks rather than user perception of risk. In the operational sense, anonymity offers a definition for privacy that focuses on the existence of risk rather than quantifying the risk. In that way it is not stochastic but rather Boolean. Yet with the removal of anonymity, granular issues of privacy arise. There still remains the operational perspective, where privacy is a measure of willingness to share information. Understanding elements of rationality and elements of internal state requires a finer delineation of privacy than is available with a discussion of anonymity. In order to further the discussion of trust in operational and internal terms, this section offers three definitions of privacy. The first, the right to autonomy, is based on fear of state action. The second, a right to seclusion, is based on an internal right to define contact as unwanted. The third, data as property, is based on a strictly rational view of privacy as a market good. A common approach to the examination of privacy is based on jurisdiction. As travelers cross jurisdictional boundaries their privacy rights, indeed basic human rights, are altered. Any consideration of privacy on the Internet based on jurisdiction must be sufficiently flexible to describe any legal regime of privacy. Yet an exhaustive examination of privacy in the jurisdictions of the member states of the United Nations would provide little guidance, as well as exceeding the patience of the reader. A second concept of privacy is based on cultural concepts of space. Spatial privacy is of particular interest on the Internet because of the lack of cultural or social


clues in virtual spaces. Virtual spaces differ from physical spaces with respect to simultaneity, permeability and exclusivity (Camp and Chien, 2000). Permeability is the ability to move seamlessly between spaces (Shapiro, 1998). Simultaneity is the ability to move into one space without moving out of another - even when there is no overlap. For example, one may have multiple threads in discrete email lists, or view multiple news sources from a single framed browser. Exclusivity refers to the ability to create spaces that are not only private, but also invisible from the outside (Nissenbaum and Introna, 2000). Clearly different privacy rules and expectations are appropriate for the marketplace, the avant-garde theater, and the home. Yet there is no single analysis that offers a coherent theory of spatial privacy across the globe, despite some progress on this track. The goal of this paper is not to move the frontier of the understanding of cultural and spatial concepts of privacy across the planet. A third approach is to consider identifiable data as the issue, and to govern data. The privacy regimes of Europe are designed to provide protection against violations of data protection. The data protection regimes can fit well within the taxonomy presented here if data are addressed under privacy as a human right and privacy as a property right. The data elements prohibited from collection (e.g., orientation) by the data collector would fall under privacy as autonomy. Beginning with an operational approach, I necessarily fall back on process and structure to define privacy. The American federalist legal system provides an effective parsing of privacy into those issues that are criminal and civil, corresponding with Federal and state law. Thus my operational framing and the carefully structured (if not particularly rational in outcome) American legal system offer a conception of personal data as a property right, a Federal right of autonomy and a civil right of seclusion. At the risk of self-plagiarism I review the concepts of privacy as embedded in United States law. Any design addressing privacy requires some definition of privacy that states clearly the perception of privacy built into the code. If all is included, then nothing is defined, and the definition is without worth. Definitions of privacy such as those provided by iPrivacy, in which transactions are said to be as private "as in the off-line world", are meaningless. The off-line world of political action, idle gossip or commercial transactions? As private as cash transactions or credit card transactions? By including the whole world in the definition, no limit is placed on the concept of privacy. There is no guidance provided for system design. (See iPrivacy.com for that organization's definitions.)

3.1 Privacy as Autonomy - The Human Right

Privacy is the right to act without being subject to external observation. People under constant surveillance are not free. Arguments against privacy on the basis of autonomy often imply that the ability to act freely and without surveillance offers only the ability to commit those acts normally subject to social sanction. Privacy is sometimes presented as a moral good only to the sinner and the criminal. Yet privacy as an element of autonomy also enhances the public good. The right to privacy granted to the National Association for the Advancement of Colored People (NAACP) by the Supreme Court was the "right of members to pursue their lawful private interests privately and to associate freely with


others." In 1956 this was a right to pursue justice. At the time the members of the NAACP were seen by law enforcement as troublesome at best and subversive at worst. Those left bereaved by the murder of members of the NAACP did not seek justice from the state in the American South in 1956. In addition to the historical arguments for privacy as autonomy for the greater good there are empirical arguments. Making this argument on the basis of empirical research requires three assumptions. The essence of these assumptions is contained in the second sentence of the first paragraph of this section. First, assume that the opposite of privacy is recorded surveillance. That is, not only is some act observed via real-time surveillance but there is also a record of the act created. Second, assume that when privacy is violated the user is aware of that fact. (Whether this is true is the subject of some debate. Certainly some data compilations are obvious, while some technical mechanisms to obtain user information are devious.) Lastly, assume that the existence of the record implies some ability to coerce, either by rewarding good behavior or punishing bad behavior. (In this case good or bad can be defined by the party with surveillance capacities.) Based on the three assumptions above, homo economicus would increase his or her good behavior. Yet the arguments that individuals respond in a strictly rational way when faced with rewards (Kahan, 2001) or punishment (Lawler, 1988) are not reflected in empirical studies. When individuals are paid, required, or recorded in some "good" act the motivation to do that act decreases. A well-documented example of this is the drop in blood donations when individuals are paid (Titmuss, 1971). Privacy as autonomy offers free people the right to act freely. It enhances not only the power to choose socially prohibited acts, but also the power and tendency to choose socially optimal acts. Surveillance alters action. The constraint on action created by observation is the basis of the autonomy right of privacy. The American Constitutional right to privacy is grounded in the First, Third, Fourth, Fifth, Ninth and Fourteenth Amendments (Compaine, 1988; Trublow, 1991). The First Amendment states: "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the government for a redress of grievances." The right to read is the right to read anonymously (Cohen, 1996). The arguments above suggest that people are not only less likely to go to assemblies and support organizations subject to official sanction, but also that people are less likely to offer their efforts to those socially sanctioned public actions. If every appearance at a social function is marked and credited, then the internal motivation is diminished. People are less free, less autonomous, and less active. The Third Amendment states: "No soldier shall, in time of peace be quartered in any house, without the consent of the owner, nor in time of war, but in a manner to be prescribed by law." The Fourth Amendment states: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized."
Certainly no person argues for law enforcement or military personnel to be placed in the homes of those they police or control. Yet this Amendment is not only a


reminder of the progress of global concepts of property and human rights, but also a statement about the limits of government's reach. (The Third Amendment is also a personal favorite, and can be used as a reminder against nostalgia.) Combined with the Fourth Amendment, this creates a space safe from direct government intervention or even casual surveillance. The element of the Fifth Amendment that is relevant to privacy states: "No person shall ... be compelled in any criminal case to be a witness against himself, nor be deprived of life, liberty, or property, without due process of law; nor shall private property be taken for public use, without just compensation." In terms of privacy the limits on forced testimony are of greatest interest. One cannot be required to testify against oneself. The implications for wiretaps, keystroke tapping programs, and lie detectors remain in dispute. Yet it is certain that while some technology and all possible wiles can be used against a suspect, compelling testimony is simply not acceptable. Neither an innocent person's movements nor his thoughts can be constrained by governmental force. The Ninth Amendment states that the set of Constitutional rights is neither exclusive nor exhaustive. The Ninth Amendment allows the right to privacy to exist in a Constitutional sense. The Fourteenth Amendment (which primarily implemented a punitive and destructive approach to nation-building for the American South) states that the rights given by the Federal government cannot be abridged by the states. The questions with respect to rights of autonomy on the Internet are questions of economic and corporate power. The coercive power of the state was well recognized by the eighteenth century. Yet the modern corporation did not yet exist. The ability to gather information and violate privacy was held by the state alone until the rise of the popular press in the nineteenth century. Because of the First Amendment, weak privacy rights were necessarily trumped by strong speech rights. Yet the debate on the Fourteenth Amendment asks if the state has a positive responsibility to guarantee those rights, or simply the responsibility not to violate them directly. When building a system specific to digital government, an understanding of autonomy is required. Yet the legal understanding of autonomy in the commercial corporate world is as yet inchoate. Because of the uncertainty of the policy outcome, and the reality of the risk faced by a median worker, technical designs that promise a level of trust appropriate for the person concerned with autonomy must meet a higher standard than designs based on seclusion or property concepts (Camp and Osorio, 2002).

3.2 Privacy as Seclusion - The Right to Be Let Alone

"The right to be let alone." Warren and Brandies' alliteration of privacy has come to be a definitive work. A century and half later the work was either refined (Prosser, 1941) or destroyed (Bloustein, 1968) by determining that right to be free from intrusions consists of four possible torts: intrusion upon seclusion, appropriation of name and likeness, false light, and public disclosure of private facts. Each of these torts is framed by the technology of the printing press. Understanding their meaning in a networked digital world requires a reach across an economic and technological chasm. In fact, the work singled out the then-emerging popular press for reprobation: "Gossip is no longer the resource of the idle and of the vicious, but has become a trade which is pursued with industry as well as effrontery." (Warren


and Brandeis, 1890). Now gossip is not only the vocation of journalists but also the avocation of many with a modem. Appropriation of name and likeness may include names in meta-data in order to associate with a more successful site. It may include the use of domain names to obtain the attention of those seeking a related site, as when Colonial Williamsburg was used by the service employee unions (Mueller, 2001). Or it may include the use of a person's name or publications to attract those interested in related materials. Yet such appropriations are treated very differently. Meta-data is not generally actionable, while domain names have been subject to action based on the expansion of the rights of trademark holders. Visibility of meta-data allows detection of mis-appropriation. Trust systems that implement ratings, meta-moderating and ordering of sites can address misleading and appropriative practices. False light is so common on the web that making it actionable seems impossible. When everyone is a journalist, everyone has the right to frame content. Private persons need show only falsehood, yet how can one be an active participant in networked conversations and remain a private participant? False light is entirely content based. Again, implementation of content-ratings systems can address false light, assuming that the majority do not choose falsity over truth. Public disclosure of private facts implies individual control over information. Registration systems that send information about the user (also known as spyware) violate this concept of privacy. Spyware is used in browsers and peer-to-peer systems including Kazaa and Limewire. The sharing of this information and targeting of ads provides the financial incentive for the systems to continue to function. Arguably the networks would not exist without the spyware. Yet the design-for-trust perspective would allow such designs only if the systems were easy to delete, and adequate notice were part of the design. Adequate notice may be as simple as allowing an add-on to be disabled during use rather than asking for a one-time installation permission.

3.3 Privacy as Data Ownership - The Property Right

For those who believe that privacy is property, what is required is a fair trade for private data. Much of the legislative debate about privacy concerns the existence and intensity of concerns about privacy. Observations of the diffusion of Internet commerce are in contrast with surveys identifying increasing privacy concerns. The privacy-as-property argument is enhanced by considering private information as a form of intellectual property (Mell, 1996). In that case the transfer of data from data subject to data owner is fairly conceptually simple. The concept of privacy as property can explain this conflict. Individuals are ready to provide information to Amazon. Amazon decided that legal risk prevented the personalization and affinity marketing provided by user data. Therefore Amazon issued a privacy policy removing all possible expectation of privacy from users. The Free Software Foundation and Computer Professionals for Social Responsibility issued a call for a boycott. Amazon was only marginally affected. Amazon used consumer information for consumer benefit. In contrast, Geocities used consumer information only for the benefit of Geocities. Geocities, like Amazon, depends entirely on customer relationships. After the Federal Trade Commission announced that Geocities had substantially violated the


privacy of its users, the total value of Geocities fell nearly $1,000,000 for each minute that the stock market remained open. Geocities never recovered the value. If privacy is property, then programs that send personal information or trap personal information are theft. In that case the most basic market frameworks are all that is required in designing for privacy.

3.4 Privacy and Security

Security is not privacy. Confidentiality allows a person to communicate with another without eavesdroppers. As confidentiality is a function of security and an enabler of privacy, security and privacy are sometimes confused. Yet in the general case, the control of information enabled by security does not imply privacy. Security enables the control of digital information, while social and organizational forces determine who exercises the power of that control. Privacy requires that a person be able to control information about himself or herself. Security provides to privacy the ability to generate privacy in a specific case (as with confidentiality of communication). Security also provides the capacity for cryptography. Cryptography is the art of hiding information. When the information that is hidden is identifying information, then security can be said to provide anonymity. Anonymity is a technical guarantee of privacy. Thus, unlike many social values, the concept of privacy has an excellent mapping into implementation because of anonymity. Yet the simplicity of removing individual names is misleading. For example, inclusion of date of birth, current residence and place of birth will uniquely identify most Americans.
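The re-identification point is easy to demonstrate with a toy data set. In the sketch below (the records and field names are invented for illustration, not drawn from any real data), removing names still leaves quasi-identifiers whose combination can shrink the anonymity set to a single person:

```python
# Illustrative only: combining quasi-identifiers (birth date, ZIP code,
# place of birth) can single out an individual even with no name present.

records = [  # a toy "anonymized" data set
    {"birth_date": "1970-03-12", "zip": "02138", "birthplace": "Akron"},
    {"birth_date": "1970-03-12", "zip": "02138", "birthplace": "Boston"},
    {"birth_date": "1981-07-04", "zip": "02138", "birthplace": "Akron"},
]

def anonymity_set(data, **quasi_identifiers):
    """Return the records matching every given quasi-identifier."""
    return [r for r in data
            if all(r[k] == v for k, v in quasi_identifiers.items())]

print(len(anonymity_set(records, zip="02138")))              # 3: still hidden
print(len(anonymity_set(records, zip="02138",
                        birth_date="1970-03-12",
                        birthplace="Akron")))                 # 1: identified
```

Each added attribute partitions the population further, which is why removing names alone is a misleading guarantee.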

3.5 Trust as Reliability

Trust implies more than secure endpoints - it requires that such security not come at the expense of survivability. Two of the greatest strengths of the Internet Protocol are that it is distributed, and that it exhibits graceful degradation. Graceful degradation means any person can connect to the network without altering others' access, and the loss of one machine does not affect those not using its services. Even during the most effective assault to date on the Internet, the Morris worm incident, staying connected proved to be the best strategy for recovery. Obtaining defenses against the worm, and information regarding these defenses, required remaining connected. Those who disconnected were isolated, with only their own resources to develop defenses. The ability of any network - the Internet or an intranet - to degrade gracefully rather than suffering catastrophic failure is survivability. Trust architectures have developed significantly in the past decade. Yet despite that innovation, security can come at the cost of reliability and survivability. Both security systems and the lack of them can enable denial of service attacks. Security systems that are computationally intensive or intolerant of user input increase the likelihood of a user experiencing the system as unreliable. An element of design for trust should be designing the survivability of distributed trust mechanisms. Proposals for trust include short-lived attribute-specific certificates (Blaze, Feigenbaum, Ioannidis and Keromytis, 1999); long-lived multipurpose certificates (e.g., Anderson, 2001); certificates signed by multiple parties (Visa, 1995); a


Web of Trust (Garfinkel, 1994); or a combination of these into a Web of Hierarchies. Yet other than the Web of Trust, few of the distributed trust mechanisms have been evaluated with respect to their ability to recognize an attack, reduce the damage of any attack, and subsequently recover. To design for trust, it is necessary to determine if, and under what conditions, trust mechanisms are brittle.

4 A Design for Trust Application: The Case of Whois

Were whois to function as designed there would be no privacy considerations. Recall that the design goal of whois is to provide technical information in the case of technical errors or malicious action. Yet the Internet has changed, and the administrative structures of the Internet have changed as well. whois is an example of a technology currently in use which was designed at a point in time with vastly different economics, norms, and politics. whois was designed for the purpose of containing narrow technical contact information. whois was built for a relatively small Internet community consisting of predominantly technical users. Additional fields were added to whois, and the expansion of the function of whois occurred when the trust assumptions about the Internet began to fail. The additional fields include administrative and billing contacts. Had the trust model implicit in whois been recognized, the lack of wisdom in adding the additional fields would have been obvious. A technical contact would be appropriately contacted if a server were taking part in a DDoS attack. Yet the webmaster or billing contact would be appropriately contacted if content in a web site were under dispute. The additional fields in whois are useful primarily to content enforcement authorities. A significant problem with the traditional approaches to obtaining law enforcement information is that web sites cross jurisdictions. There already exist treaties and cooperation for obtaining subscriber information from telephone companies across the borders of jurisdictions. Such policies, worked out over more than a century, provide a basis for law enforcement to obtain information. These policies were worked out in a complex trust network that included issues of sovereignty and imbalances of power. As the Internet and traditional network services converge, the possible business and legal arrangements between a network service provider and a content provider explode. The trust environment becomes more similar to the politicized environment of global competition and cooperation reflected in the governance of telephony. By limiting whois information to the technical contact and the appropriate registrar, motivation for incorrect contact information would be significantly decreased. Default automated access to whois information could reasonably be limited to those with network responsibilities. Feasible limitation of automated access to whois, and thus the ability to increase the integrity of the information, requires technical coordination at a level the holders of whois information have yet to achieve. A necessary first step for cooperation is trust. Trust may be enabled by removing the functionality that brought the enforcement spotlight to bear on whois. Reversing the unwise expansion of whois, and thus decreasing the resulting focus of intellectual property and other enforcement authorities on whois, could enable the trust necessary for cooperation. In addition to the changes in community, the domain name itself has changed. Originally simply a mnemonic, the domain name is now commercial property,


political speech, personal expression or artistic moniker. As a result very different models of privacy apply. It is these differences in privacy models that are a core cause of the failure of the trust models in whois. It is unlikely that IBM.com considers the contact information in the domain registration as constraining institutional autonomy in the political domain. etoys.org was notoriously noncommercial (Mueller, 2002). The trust failure is a function of the expansion of whois to include billing and administrative fields without reconsidering the core trust assumption: that all Internet users are created equally powerful. Billing and administrative contacts became necessary as the use and users of the Internet, and thus the trust relationships on the Internet, were changing. The increased diversity of Internet users and the resulting decrease in trust were exacerbated by the alterations of whois. In this case the original design was narrow and suitable for the initial environment. Declining to expand the function and fields of whois beyond the minimal necessary technical requirements would both have served the whois system more effectively and allowed the trust assumptions to remain valid in the rapidly changing realm of the Internet. This is because the trust framing was for technical individuals empowered over some small section of the network. By limiting the fields to technical information, that trust model would have been more likely to remain consistent, and therefore the service would have been more likely to remain effective.
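The design-for-trust argument can be pictured as a filtering rule: publish only the fields that the original technical trust model requires. The sketch below is illustrative; the field names and addresses are hypothetical, chosen for the example rather than taken from any registry schema.

```python
# Sketch of the whois argument: strip a record back to its original,
# narrowly technical purpose. Field names and values are hypothetical.

EXPANDED_RECORD = {
    "domain": "example.org",
    "technical_contact": "noc@example.org",      # needed to report, e.g., a DDoS
    "registrar": "Example Registrar Inc.",
    "administrative_contact": "admin@example.org",  # useful mainly in content disputes
    "billing_contact": "billing@example.org",       # useful mainly to enforcement
}

# Only the fields consistent with the original trust model.
TECHNICAL_FIELDS = {"domain", "technical_contact", "registrar"}

def minimal_whois(record: dict) -> dict:
    """Reduce a whois record to technical contact and registrar."""
    return {k: v for k, v in record.items() if k in TECHNICAL_FIELDS}

print(minimal_whois(EXPANDED_RECORD))
# {'domain': 'example.org', 'technical_contact': 'noc@example.org',
#  'registrar': 'Example Registrar Inc.'}
```

On this view, the fields that are dropped are exactly the ones that attracted enforcement attention and undermined the incentive to keep the remaining data accurate.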

5 Design for Trust

At this point I have offered a concept of trust as consisting of privacy, reliability and security. There has also been one small example, arguing that design for trust would have resulted in a more limited and possibly more reliable whois. In this section that modest core is expanded to a broad call for trust systems that are multidimensional, transitive, and aggregate. Trust in today's Internet is based on all-or-nothing trust relationships. A network resource request is not trusted before authentication, and after authentication it is granted the full credentials of the corresponding user. Executable content from within a protected network is completely trusted, but content from outside the firewall is strictly disallowed. A network connection, once established, has equal priority with all other network connections on the system. These all-or-nothing trust relationships fail to match the expectations of users and the needs of next generation network applications. This mismatch promotes security breaches among users, as users undermine simplified trust models to meet their own complex resource-sharing needs. As for the specific example of executable content, it is one of the keys to providing advanced functionality in network applications, but is typically disallowed by firewalls. The firewall model of trust is too simple to distinguish secure sources of executable content. When sophisticated users find this exclusion unacceptable and use methods like tunneling to work around it, the security of the entire protected network can be compromised. There is a need for distributed trust models that will allow distinctions to be made in the trustworthiness of network entities. In order to do this it is necessary to provide a better match between people's intuitive notion of trust and the needs of next generation applications. Security in today's Internet is focused on a centralized model where strong security requires a firewall. The firewall may be a formidable obstacle, but once it has been


compromised, the entire network that it protects is compromised, making the firewall a single point of failure. The tunneling example demonstrates how this centralized approach can allow a single breach of security to compromise the security of the entire protected network. The Microsoft/Verisign approach to regulating executable content is to centralize trust. In this approach, a presumably trustworthy third party uses a digital signature to verify the identity of an executable module. Although there is some commonality in purpose, their security model is the antithesis of most human approaches. It assumes that the same level of trust is appropriate for all approved content and gives a right of approval to some developers. Further, it requires users to manually examine the source of executable content to provide more subtle variations of trust. The parallels to the firewall example are clear. Currently proposed cross-domain trust mechanisms seek to minimize computational costs and management overhead. For example, commerce systems minimize key generation by linking all attributes and rights to a single commerce-enabling certificate. These keys are validated by a single root. This creates a single point of failure for the entire system (the root) as well as a single point of failure for the consumer (the key). The only similar system in the United States is the currency system, where the failure of the US Treasury would yield complete collapse. In family systems, individual businesses, and even religions there are multiple levels and power points. In physical security, any key is part of a key ring, so that the failure of the validity of one key does not destroy the strength of all electronic locks. .Net ("dot net") or Passport exacerbates this problem by allowing cross-domain failure from a single lost pass phrase. SSH and SSL are used for securing Internet connections. SSH is commonly used to provide secure terminal connections, whereas SSL is commonly used to implement secure HTTP connections. The endpoints of these connections have to be ready to extend trust before the mechanisms are called into play. They are extremely useful technologies for the prevention of snooping but are not useful for implementing organizational or individual trust across many dimensions (including time). Yet in real life and in social networks the "security models" (including driver's licenses, check clearing, credit cards, etc.) distribute the resources that implement authentication and authorization. In network security there are still single roots, and control is often held at a centralized point of control. A network service can be rendered unusable when the number of requests it receives exceeds the rate at which they can be served. This creates an important relationship between performance and security for Internet servers. Although users on today's Internet are accustomed to server failures due to overload, the next generation of Internet-based applications will require better service guarantees. Decentralization is necessary to provide stable peak performance, even when a site as a whole is experiencing overload, until network capacity becomes the limiting factor. Decentralization provides defense against a large class of denial of service attacks. In contrast, overload in conventional systems typically results in thrashing behavior such as paging, leading to significant performance loss. Decentralization requires utilizing processing power at the endpoints more effectively.
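The single-root problem can be sketched in a few lines. The toy model below is not a description of any deployed PKI; the names are illustrative. It contrasts a scheme where every credential chains to one root with one where independent endorsers let trust degrade gracefully:

```python
# Toy contrast between a single trust root and multiple independent
# validators. Not a model of any deployed PKI; names are illustrative.

def valid_under_single_root(compromised: set) -> bool:
    """Every credential chains to one root: one compromise breaks all."""
    return "root" not in compromised

def trust_with_validators(endorsers: list, compromised: set) -> float:
    """Independent endorsers degrade gracefully: survivors still count."""
    surviving = [v for v in endorsers if v not in compromised]
    return len(surviving) / len(endorsers)

print(valid_under_single_root(compromised={"root"}))           # False
print(trust_with_validators(["bank", "employer", "isp"],
                            compromised={"bank"}))             # 0.666...
```

The key-ring analogy in the text is exactly the second function: losing one key weakens, but does not destroy, the whole arrangement.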
Decentralized trust requires enabling users to be their own trust managers. There is a need for a peer-to-peer distributed trust mechanism that implements trust effectively in the ever-increasing scale of the network. The network needs to scale not only to an increasing number of devices but also in terms of complexity of tasks.


Yet as there are increasingly complex interactions and tasks on the network, simplicity is critical to user-managed, resource-specific security. In order to allow users to share information it is necessary both to communicate trust states and to enable users to manipulate their own trust states. Trust must support the complexity of life, in that users function in multiple dimensions. For example, a spouse will have access to all shared family and personal information. Yet a spouse should not have access to all company and employer information. Trust in these two dimensions is managed off-line because of the reality of physical space.

In addition to having multiple dimensions, users should be able to aggregate trust within a dimension. With aggregate trust the initial extension of trust is based on some introduction, which is provided by any entity or security mechanism. Any additional extension of trust is then based on aggregating different mechanisms (e.g., attaching value to different attribute-based certificates and summing) and/or extending trust to a machine based on interactions over time. Such a mechanism would be modeled more on observed social networks than on the strengths of cryptography. Users who find multiple independent paths to another user would increase their trust in that person accordingly, in a more generous manner than proposed in (Beth, Borcherding, and Klein, 1994).

An early example of a user-centered approach to distributed trust, the UNIX philosophy gives users responsibility for setting security controls on their own resources. For example, UNIX systems allow users to set file protection levels. Yet this approach is not adequate for a number of reasons. First, for those using UNIX-based systems the security mechanism is hampered by its lack of simple mechanisms for authentication and resource sharing across domains. Second, the UNIX security system requires understanding the operating system and the distinction between listing, executing, and reading a file. Third, the interface violates the rules of good human-computer interaction (HCI) design. Truncated commands (e.g., chmod), a text line interface, and obscure error codes make this interface flawed. In addition the function has too many parameters, and these parameters are not clearly specified. For these reasons, even if there were well-implemented cross-domain UNIX file protection mechanisms, this implementation would fail to meet the needs of the modern Internet user.

Similarly, peer-to-peer systems allow users to determine which files are to be shared. Peer-to-peer systems are built to implement coordination and trust across administrative domains. They allow for sharing trust across domains, yet are notoriously hampered by problems of accountability (e.g., Oram, 2001). Peer-to-peer systems allow users control over their own files in a more transparent manner than UNIX controls, but the P2P code itself is often untrustworthy (e.g., Borland, 2002).

Any optimal trust approach would benefit from experience with Pretty Good Privacy (PGP), which lets users increase trust in a transitive manner. Transitivity means that users select their own sources of validation; e.g., if A trusts B and B validates C, then A trusts C. There is no central server or tree-like hierarchy that validates users. PGP also lets users select their own sources of trust, and select a key length appropriate for the situation. PGP is specific to a single application, electronic mail.
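To make the aggregation idea concrete, here is a minimal sketch (an editorial illustration, not Camp's design; the names, trust values, and combining rule are invented) of multi-path trust: each path contributes transitively, and separate introductions combine so that trust only grows.

# Illustrative multi-path trust aggregation (an editorial sketch, not
# Camp's design). Edge weights are subjective trust values in [0, 1];
# all names and numbers are invented.
TRUST = {
    "alice": {"bob": 0.9, "dana": 0.7},
    "bob": {"carol": 0.8},
    "dana": {"carol": 0.6},
}

def paths(src, dst, graph, seen=()):
    """Enumerate simple paths from src to dst in the trust graph."""
    if src == dst:
        yield [dst]
        return
    for nxt in graph.get(src, {}):
        if nxt not in seen:
            for rest in paths(nxt, dst, graph, seen + (src,)):
                yield [src] + rest

def path_trust(path, graph):
    """Transitive trust along one path: the product of edge weights."""
    t = 1.0
    for a, b in zip(path, path[1:]):
        t *= graph[a][b]
    return t

def aggregate_trust(src, dst, graph):
    """Combine paths as independent evidence, 1 - prod(1 - t_i), so
    each extra introduction increases rather than decreases trust.
    (This toy treats all paths as independent; here the two paths
    share no edges, so the assumption holds.)"""
    distrust = 1.0
    for p in paths(src, dst, graph):
        distrust *= 1.0 - path_trust(p, graph)
    return 1.0 - distrust

# Two introductions of carol (via bob and via dana) yield more trust
# than either chain alone: 1 - (1 - 0.72) * (1 - 0.42) ~= 0.84.
print(round(aggregate_trust("alice", "carol", TRUST), 2))

The combining rule used here is only one of many possible; the valuation proposed by Beth, Borcherding, and Klein (1994) is more conservative.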
In PGP users select specific individuals to trust based on their ability to verify the identity/key carried in a PGP certificate. This research extends and enhances the distributed security model of PGP to the more generic problem of sharing resources. PGP is weak in that there is a single dimension of trust. Regardless of the definition of trust, it is certain that there are different dimensions of trust. Social admonitions not to mix friendship and money illustrate this, as do concepts of family trust versus trust in a business transaction. Trusting one's sister and trusting IBM are very different matters indeed. Users should be able to express trust in more dimensions, more richly, than with PGP. Yet unlike whois, PGP has maintained its efficacy by refusing to expand beyond its design base of email.

The attempt to minimize system management by concentration of trust management is a fundamental error, doomed to fail in a world of increasingly complex trust arrangements. Oversimplified security paradigms which limit implementations will result in user subversion. Security management should be distributed and simplified by automation, rather than simplified by the administrative assumption of a single trusted entity. Humans are capable of managing quite complex tasks (consider in the abstract the task of driving an automobile) if enabled by an interface that provides adequate and useful feedback. Rather than minimizing computational and management costs, future trust designs ideally will recognize the high value of distributed security and empower the resource owner to be a security manager. Security management must become more complex because peer-to-peer, international resource sharing is more complex than intranetwork sharing. Peer-to-peer systems recognize the need to share resources, yet the trust problems in peer-to-peer systems have not been solved. In fact, in 2002 most trust systems require users to trust a central software distributor or administrator. The trust problem has only begun to be solved.

In order to provide simple mechanisms to enable users to take responsibility for their own resources, the design must implement an understanding of trust based on an understanding of trust among human users and social networks. While such a design basis may appear initially too complex for implementation, such a model would inherently provide better scalability and better resistance to attacks than the current, popular, centralized model. In short, trends in distributed systems security are on a collision course with system survivability through the construction of brittle trust mechanisms. The lack of understanding of the human interface exacerbates this problem. If trust extensions are not effectively communicated to the very human users, those users cannot react effectively when and if the trust system fails.

6 Conclusions on Design for Trust

Experts focus on the considerable technological challenges of securing networks, building trust mechanisms, and devising security policies. Although these efforts are essential, trust and security would be even better served if designs more systematically addressed the (sometimes irrational) people and institutions served by networked information systems. In order to address human concepts of trust, privacy must be a consideration and not an enemy or afterthought of the implementation. Efforts at securing systems should involve not only attention to machines, networks, protocols and policies, but also a systematic understanding of how social agents (individuals and institutions) participate in and contribute to trust. Security is not a separable element of trust. An interdisciplinary perspective will enable protocols for trust over the network to be optimized for human trust. That the human is a critical element in security systems has been recognized both from a usability point of view (Tygar and Whitten, 1999) and from the analysis of systematic failures of security (Anderson, 1994). However, little work integrates methods from the social sciences, philosophy, and computer science to evaluate mechanisms for trust on-line. Previous work on integrating privacy and security (Friedman, Howe and Felten, 2002) has been complicated by the lack of a definition that can be used across disciplines. Efforts have been made to find a single definition of trust that can be used effectively within philosophy, computer security, and by those social scientists embracing an operational definition of trust, as shown in (Camp, McGrath and Nissenbaum, 2001).

Design for trust requires examining all assumptions about a system and the user of the system. Sometimes those assumptions are based on class (e.g., the user has a credit card). Sometimes those assumptions are based on the capacities of the human (e.g., the user must select a large number of context-free random passwords). Sometimes the assumptions are necessary to enable a functioning design. Design for trust requires enumerating the social assumptions and examining how those assumptions can function to put some user of the system at risk. In order to understand and design trust systems, acknowledgment of the social and human elements is required.

References

Anderson, R.: Security Engineering. Wiley, New York (2001).
Axelrod, R.: The Evolution of Cooperation. Harper Collins, USA (1994).
Beth, T., Borcherding, M., Klein, B.: Valuation of Trust in Open Networks. In D. Gollmann, ed., Computer Security - ESORICS '94, Lecture Notes in Computer Science. Springer-Verlag, Berlin (1994) 3-18.
Blaze, M., Feigenbaum, J., Ioannidis, J., and Keromytis, A.: The role of trust management in distributed systems security. In Secure Internet Programming, Vol. 1603, Lecture Notes in Computer Science. Springer-Verlag, Berlin (1999) 185-210.
Bloustein, E.: Privacy as an aspect of human dignity: an answer to Dean Prosser. New York University Law Review 39 (1964) 962-1007.
Borland, J.: Stealth P2P network hides inside Kazaa. CNET Tech News, April 2002. http://news.com.com/2100-1023-873181.html (2002).
Camp, L. J.: Trust and Risk in Internet Commerce. MIT Press, Cambridge, MA (2001).
Camp, L. J. and Chien, Y. T.: The Internet as Public Space: Concepts, Issues and Implications in Public Policy. In R. Spinello and H. Tavani, eds, Readings in Cyberethics. Jones and Bartlett, Sudbury, MA (January 2001). Previously published in ACM Computers and Society, September (2000).
Camp, L. J., McGrath, C. and Nissenbaum, H.: Trust: A Collision of Paradigms. Proceedings of Financial Cryptography, Lecture Notes in Computer Science. Springer-Verlag, Berlin (2001).
Camp, L. J. and Osorio, C.: Privacy Enhancing Technologies for Internet Commerce. In Trust in the Network Economy. Springer-Verlag, Berlin (2002).
Cohen, J.: A Right to Read Anonymously: A Closer Look at Copyright Management in Cyberspace. Connecticut Law Review, Vol. 28 (1996).
Compaine, B. J.: Issues in New Information Technology. Ablex Publishing, Norwood, NJ (1998).
Friedman, B., Howe, D. C., and Felten, E.: Informed Consent in the Mozilla Browser: Implementing Value-Sensitive Design. Proceedings of the Thirty-Fifth Annual Hawaii International Conference on System Sciences. IEEE Computer Society, Los Alamitos, CA (2002).
Fukuyama, F.: Trust: The Social Virtues and the Creation of Prosperity. Free Press, New York, NY (1996).
Garfinkel, S.: Pretty Good Privacy. O'Reilly, Cambridge, MA (1994).
Goldberg, I., Hill, A. and Shostack, A.: Privacy Ethics and Trust. Boston University Law Review, Vol. 81, No. 2 (2001) 407-422.
Kahan, D.: Trust, Collective Action, and Law. Boston University Law Review, Vol. 81, No. 2 (2001) 333-347.
Lawler, E. J.: Coercive Capability in Conflict: A Test of Bilateral versus Conflict Spiral Theory. Social Psychology Quarterly, Vol. 50 (1988) 93-96.
Mell, P.: Seeking Shade in a Land of Perpetual Sunlight: Privacy as Property in the Electronic Wilderness. Berkeley Technology Law Journal 11(1) (1996). http://www.law.berkeley.edu/journals/btlj/index.html
Morgan, M. G., Bostrom, A., Fischhoff, B., Atman, C. J.: Risk Communication: A Mental Models Approach. Cambridge University Press, Cambridge, UK (2002).
Mueller, M.: Ruling the Root. MIT Press, Cambridge, MA (2002).
Nissenbaum, H. and Introna, L.: Sustaining the Public Good Vision of the Internet: The Politics of Search Engines. The Information Society, Vol. 16, No. 3 (2000).
Olson, M.: The Logic of Collective Action: Public Goods and the Theory of Groups. Harvard University Press, Cambridge, MA (1965).
Oram, A., ed.: Peer-to-Peer: Harnessing the Power of Disruptive Technologies. O'Reilly and Associates, Cambridge, MA (2001).
Prosser, W. L.: Handbook of the Law of Torts. West Publishing Co., St. Paul, MN (1941).
Shapiro, S.: Places and Spaces: The Historical Interaction of Technology, Home, and Privacy. The Information Society, Vol. 14, No. 4 (1998) 275-284.
Titmuss, R. M.: The Gift Relationship: From Human Blood to Social Policy, expanded and revised edition. Ann Oakley and John Ashton, eds. The New Press, New York (1997).
Trubow, G.: Privacy Law and Practice. Times Mirror Books, New York (1991).
Tygar, J. D. and Whitten, A.: WWW Electronic Commerce and Java Trojan Horses. Second USENIX Electronic Commerce Workshop, Berkeley, CA (1996).
Tyler, T.: Why People Obey the Law. Yale University Press, New Haven, CT (1990).
Visa: Secure Transaction Technology Specifications. Version 1.1, Visa International, New York (1995).
Warren, S. and Brandeis, L.: The Right to Privacy. Harvard Law Review, Vol. 4 (1890) 193-220.

[2]

The Digital Persona and Its Application to Data Surveillance

ROGER CLARKE
Australian National University, Department of Commerce, Canberra, Australia

The digital persona is a model of the individual established through the collection, storage, and analysis of data about that person. It is a useful and even necessary concept for developing an understanding of the behavior of the new, networked world. This paper introduces the model, traces its origins, and provides examples of its application. It is suggested that an understanding of many aspects of network behavior will be enabled or enhanced by applying this concept. The digital persona is also a potentially threatening, demeaning, and perhaps socially dangerous phenomenon. One area in which its more threatening aspects require consideration is in data surveillance, the monitoring of people through their data. Data surveillance provides an economically efficient means of exercising control over the behavior of individuals and societies. The manner in which the digital persona contributes to an understanding of particular dataveillance techniques such as computer matching and profiling is discussed, and risks inherent in monitoring of digital personae are outlined.

Keywords: Internet, agent, network behavior, behavior monitoring, personality projection, computer matching, profiling, data quality

(Received 20 August 1993; accepted 20 January 1994. Address correspondence to Roger Clarke, The Australian National University, Dept. of Commerce, Canberra, ACT 0200, Australia. Email: [email protected].)

The marriage of computing and telecommunications brought us networks. This led to connections among networks, most importantly the Internet. With the networks has come a new working environment, popularly called "the net," "cyberspace," or "the matrix." Individuals communicate by addressing electronic messages to one another and by storing messages that other, previously unknown people can find and access. For a review of applications of the Internet to the practice of research, see Clarke (1994).

People exhibit behavior on the network that other people recognize. Some of it is based on name; for example, people who use a pseudonym like "Blackbeard" create different expectations in their readers than people who identify themselves using a name like "Roger Clarke." Other aspects of the profile of people on the net are based on the promptness, frequency, and nature of their contributions, and the style in which they are written. Over a period of time, the cumulative effect of these signals results in the development of something that approximates personality. It is a restricted form of personality, because the communications medium is generally restricted to standard text; at this early stage of developments, correspondents generally do not see pictures or sketches, or even handwriting, and do not hear one another's voices. The limitations of the bare 26 letters, 10 digits, and supplementary special characters of the ASCII character set have spawned some embellishments, such as the commonly used "smiley" symbol :-), the frowning symbol :-(, and the wink ;-). Some variations are possible, of course; for example, (:-1} could imply a boring, bald person with a beard, and {&-() could be someone with a hangover. Generally, however, the symbol set is anything but expressively rich. Moreover, its use originated in and is by and large limited to particular net subcultures.

Net-based communications give rise to images of the people involved. These images could be conceptualized in many different ways, such as the individual's data shadow, or his or her alter ego, or as the "digital individual." For reasons explained below, the term "digital persona" has some advantages over the other contenders. This paper's purpose is to introduce and examine the notion of the digital persona.

Introduction to the Digital Persona

In Jungian psychology, the anima is the inner personality, turned toward the unconscious, and the persona is the public personality that is presented to the world. The persona that Jung knew was that based on physical appearance and behavior. With the increased data intensity of the second half of the twentieth century, Jung's persona has been supplemented, and to some extent even replaced, by the summation of the data available about an individual. The digital persona is a construct, i.e., a rich cluster of interrelated concepts and implications. As a working definition, this paper adopts the following meaning:

The digital persona is a model of an individual's public personality based on data and maintained by transactions, and intended for use as a proxy for the individual.

The ability to create a persona may be vested in the individual, in other people or organizations, or in both. The individual has some degree of control over a projected persona, but it is harder to influence imposed personae created by others. Each observer is likely to gather a different set of data about each individual, and hence to have a different gestalt impression of that person. In any case, the meaning of a digital persona is determined by the receiver based on his or her own processing rules. Individuals who are aware of the use of data may of course project data selectively in order to influence the imposed digital persona that is formed (e.g., on arriving in the United States, immigrants may take out an unnecessary loan simply to create a credit record).

It is useful to distinguish between informal digital personae based on human perceptions, and formal digital personae constructed on the basis of accumulations of structured data. The data intensity of contemporary business and government administration results in vast quantities of data being captured and maintained, and hence considerable opportunity to build formal digital personae. These data range from credit and insurance details, through health, education, and welfare, to taxation and licensing. The extent of interchange of data holdings among organizations is increasing, both concerning groups (e.g., census and other statistical collections) and about identified individuals (e.g., credit reference data and ratings, insurance claims databases, consumer marketing mailing lists, telephone, fax and email address directories, electoral rolls, and license registers). Later sections of this paper also distinguish between passive, active, and autonomous digital personae. A schematic representation of the formal, passive digital persona is shown in Figure 1.

There is something innately threatening about a persona constructed from data and used as a proxy for the real person.

[Figure 1. The passive digital persona. Diagram omitted: transactions relate to the individual and cumulate into the passive digital persona, which represents the individual.]

It is reminiscent of the popular image of the voodoo doll, a (mythical) physical or iconic model, used to place a magical curse on a person from a distance. Similar ideas have surfaced in "cyberpunk" science fiction, in which a "construct" is "a hardwired ROM cassette replicating a... man's skills, obsessions, kneejerk responses" (Gibson, 1984, p. 97). Some people may feel that such a construct is demeaning, because it involves an image rather than a reality. Others may regard it as socially dangerous. This is because the person's action is remote from the action's outcome. This frees the individual's behavior from his or her conscience, and hence undermines the social constraints that keep the peace.

The digital persona offers, on the other hand, some significant potential benefits. Unlike a real human personality, it is digitally sense-able, and can therefore play a role in a network, in real time, and without the individual being interrupted from work, play, or sleep. Leaving aside the normative questions, the notion has descriptive power: Whether we like it or not, digital personae are coming into existence, and we need the construct as an element in our understanding of the emerging network-enhanced world. The following sections investigate the nature of the digital persona, commencing with a simple model and progressively adding further complexities in order to build up a composite picture of the notion.

The Passive Digital Persona

A digital persona is a model of an individual and hence a simplified representation of only some aspects of the reality. The efficacy of the model depends on the extent to which it captures those features of the reality that are relevant to the model's use. As with any modeling activity, it suffers the weaknesses of the reductionist approach: Individuals are treated not holistically, but as though a relatively simple set of data structures were adequate to represent their pertinent characteristics. Some aspects of the person's digital persona and of the transactions that create and maintain it can be represented by structurable data, such as the times of day when the person is on the net, the frequency and promptness with which he or she communicates, and the topics he or she discusses (a rough sketch of such a record appears at the end of this section). Other aspects are more subjective and depend on interpretation by message recipients of such factors as the degree of patience or tolerance shown, the steadiness of expression, the appreciation of the views of others, and the consistency of outlook. There is a trade-off between the syntactic consistency with which structured data can be processed and the semantic depth and tolerance of unusual cases associated with less formal communications.

An individual may choose to use more than one projected digital persona. People may present themselves differently to different individuals or groups on the net, or at different times to the same people, or at the same time to the same people. One projection may reflect and provide close insight into the person's "real personality," while other personae may exaggerate aspects of the person, add features, omit features, or misrepresent the personality entirely. Reasons why people may wish to adopt multiple personae include the following:

• Maintenance of a distinction between multiple roles (e.g., prison warder, psychiatrist or social worker, and spouse/parent; employed professional and spokesperson for a professional body; and scoutmaster and spy)
• Exercise of artistic freedom
• Experimental stimulation of responses (e.g., the intentional provocation of criminal acts, but also the recent instance of a male impersonating a physically impaired female)
• Willing fantasy (as in role-playing in multiuser dungeons and dragons, or MUDDs)
• Paranoia (i.e., to protect against unidentified and unlikely risks)
• Fraud and other types of criminal behavior

There are many instances in which multiple projected personae may be used constructively. In conventional e-mail, recipients may be unaware that multiple user names are actually projections of the same person, and the sender may thereby feel free to express a variety of ideas, including mutually contradictory ones. Anonymity is particularly useful in alleviating problems associated with power differentials, such as the fear of retribution by one's superiors or of derision by one's peers. Even where the mapping of digital personae to person is known to the recipients, the sender's choice of persona enhances the semantic richness of the conversation. In contexts beyond e-mail, the idea has even greater power. In the decision support literature, techniques such as brain-storming and Delphi encourage the pooling of know-how and the stimulation of new ideas. The existing choice among comments being anonymous, temporarily anonymous, or identified can be supplemented by participation in the event of more personae than people. This enables advantage to be taken of the power and complexity of intellectually prodigious individuals.
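As a rough illustration of how such structurable data cumulate into a formal, passive persona (an editorial sketch, not drawn from the paper; the record fields are invented), consider:

from collections import Counter
from dataclasses import dataclass, field

@dataclass
class PassivePersona:
    """A reductionist model of a person: only what the observer's
    transactions happen to capture. Field names are invented."""
    identifier: str
    hours_active: Counter = field(default_factory=Counter)  # hour of day -> count
    topics: Counter = field(default_factory=Counter)        # topic -> count
    message_count: int = 0

    def observe(self, hour, topic):
        """Cumulate one observed transaction into the persona."""
        self.hours_active[hour] += 1
        self.topics[topic] += 1
        self.message_count += 1

# The persona is maintained by transactions, not by the individual;
# the subject need not even know that the record exists.
p = PassivePersona("user-42")
p.observe(23, "cryptography")
p.observe(23, "privacy")
print(p.hours_active.most_common(1), p.topics.most_common(1))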

The Active Digital Persona

In the preceding section, the digital persona was described as a passive notion, comprising data alone. It is important to relax that simplification. The concept of an "agent" has been current in computer science circles for some years. This process acts on behalf of the individual, and runs in the individual's workstation and/or elsewhere in the net. A trivial implementation of this idea is the "vacation" feature in some e-mail servers, which returns a message such as "I'm away on holiday until " to the senders of messages. (Where the sender is a mailing list, this may result in broadcast of the message to hundreds or thousands of list members.) More useful applications of projected active digital personae are mail filterers (to intercept incoming mail and sort it into groups and priority sequences), news gatherers (to search news groups, bulletin boards, and electronic journals and newsletters in order to identify items of interest to the individual and compile them into personal news bulletins), and "knowbots" (to undertake relatively intelligent searches of network sources in order to locate and fetch documents on a nominated topic). For a review of some of these capabilities, see Loch and Terry (1992).

The agent concept derives from two ideas. One is the long-standing, spookily named "daemon," i.e., a program that runs permanently in order to perform housekeeping functions (such as noticing the availability of files to be printed and passing them in an orderly manner to the printer). The other ancestor idea is the "object," which refers to the combination of data with closely associated processing code. Although this term should be understood in its technical sense, it is unavoidable that people who are fearful of the impacts of ubiquitous networking will draw attention to how the very word underlines the mechanistic dangers inherent in the idea.

The active digital persona has all of the characteristics of the passive: It can be projected by the individual or imposed by others, and it can be used for good or ill. The difference is in the power that the notion brings with it. It enables individuals to implement filters around themselves, whereby they can cope with the increasing bombardment of data in the networked world. These need not be fixed barriers, because they can self-modify according to feedback provided by the person or compiled from other sources; and they can contain built-in serendipity, by letting through occasional lowly weighted communications, and hence provide the network equivalent of bookshop browsing (a toy sketch follows below).

People's digital behavior may be monitored (e.g., their access to their mail and the location they accessed it from, and their usage of particular databases or records). This may be done with the agreement of the individual, as a contribution to community welfare and efficiency (see, for example, Hill & Hollan, 1994), or without the individual's knowledge or consent, in which case it may be used sympathetically or aggressively. In the extreme case, an active agent may be capable of autonomous behavior. It may be unrecallable by its originator (as was the case with the Cornell worm). It may, by accident or design, be very difficult to trace to its originator. A familiar analogy is to short-duration nuisance telephone and fax calls.
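The self-modifying filter with built-in serendipity can be sketched as follows (an illustrative toy: the keyword weights and pass-through probability are invented, and a real agent would adapt its weights from the owner's feedback):

import random

# Illustrative interest weights; a real agent would adjust these from
# the owner's feedback rather than hard-coding them.
WEIGHTS = {"privacy": 2.0, "trust": 1.5, "lottery": -3.0}
THRESHOLD = 1.0
SERENDIPITY = 0.05  # chance of admitting a lowly weighted item anyway

def score(message):
    """Sum the weights of known keywords appearing in the message."""
    words = message.lower().split()
    return sum(w for kw, w in WEIGHTS.items() if kw in words)

def admit(message):
    """Admit high-scoring mail; rarely admit low-scoring mail too,
    the network equivalent of bookshop browsing."""
    if score(message) >= THRESHOLD:
        return True
    return random.random() < SERENDIPITY

inbox = ["Workshop on privacy and trust", "You won the lottery"]
print([m for m in inbox if admit(m)])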

Public Personae

Individuals can exercise a degree of control over their projected passive and active personae, but much less influence over those personae imposed by others upon them. Although there are likely to be considerable differences among the various personae associated with an individual, there are also commonalities. With some individuals, there is so much in common among the images that it is reasonable to abstract a shared or public persona from the many individual personae. Examples abound of public personae developed through conventional media. The public images of Zsa Zsa Gabor, Elizabeth Taylor, Pierre Trudeau, Donald Trump, and Ross Perot are public property. The idea of any of them successfully suing in defamation a person who criticized their public image, on the grounds that this misrepresented their real personality, seems ludicrous. Similar limitations confront personalities of the Internet, such as Cliff Stoll, Peter Neumann, Richard Stallman, and Phiber Optik.

A public persona may arise in and be restricted to a particular context. For example, a person's digital shadow may be well known within an electronic community such as that associated with a mailing list or bulletin board. Archetypal public personae include the inveterate sender of worn-out jokes and cliches, the practical joker, the sucker who always takes practical jokers' bait, the wild idea generator, the moral crusader, and the steadying influence who calls for calm and propriety when the going gets rough. With the immediacy of the net, many people play these roles without the realization that they are predictable. But they can be adopted quite consciously and constructively, e.g., where a respected persona reinforces the need for appropriate behavior and, conversely, where a normally placid respondent replies vigorously, implying that his or her patience is stretched to the limit. In such contexts, there is once again no reason why an individual should be restricted to a single public persona.

Potential Applications

There is a range of uses to which individuals might put projected, passive digital personae. It may simply be to express their personality, or a facet of it, or an exaggeration of some feature of it, or a feature that the person would like to have. It may be a desire to free themselves of normal constraints in order to express different thoughts or the same thoughts differently. There is the well-known activity of "flaming" on the net, in which people express themselves to others with a vigor that would be socially and perhaps physically risky if done on a face-to-face basis. The freedom to project a digital persona can be used creatively, constructively, entertainingly, intemperately, in a defamatory manner, or criminally.

Projected, active digital personae are, on the other hand, a relatively recent development, and it is too early to be able to appreciate and analyze the scope of the potential applications. There is no reason, however, why an agent has to run on one's own workstation or be limited to input filtering. For example, so-called "program trading" agents can issue buy/sell orders if the price of nominated commodities falls/rises beyond a nominated (or computed) threshold (see the sketch below); and updates to key records in remotely maintained statistical databases can be monitored. It is also possible to conceive of the "active" role being extended to, for example, conducting a nuisance campaign against an opponent, by bombarding his or her e-mail and/or fax letterbox, or countering such a campaign that has been directed against oneself (e.g., by diverting the calls to another address).

Passive digital personae may be imposed by other individuals and organizations for a variety of reasons. Typical among these is the construction of a consumer profile in order to judge whether to promote the sale of goods, services, or ideas to each particular individual, and if so, to indicate which of a palette of promotional media and devices should be used. Similarly, imposed, active digital personae offer considerable prospects. People's interests or proclivities could be inferred from their recent actions, and appropriate goods or services offered to them by the supplier's computer program using program-selected promotional means. Another application might be a network "help desk" program to detect weak or inefficient database search strategies, and to offer advice as a service to network users. A network control mechanism could provide warnings to subscribers when they use foul language or exceed traffic or storage quotas. As with other forms of monitoring of the workplace and the public, questions of law, contract, image, and morality arise.
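The "program trading" agent mentioned above reduces to a simple threshold rule; a toy sketch (all symbols, prices, and thresholds are invented for illustration) might read:

# Toy threshold agent in the spirit of the "program trading" example:
# issue an order when a nominated price crosses a nominated bound.
BUY_BELOW, SELL_ABOVE = 95.0, 105.0

def agent(ticker, price):
    """Return an order string, or None to hold."""
    if price <= BUY_BELOW:
        return f"BUY {ticker} @ {price}"
    if price >= SELL_ABOVE:
        return f"SELL {ticker} @ {price}"
    return None  # hold: the agent stays silent

for price in (101.2, 94.3, 106.8):
    order = agent("XYZ", price)
    if order:
        print(order)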


Aficionados of science fiction are aware of ample sources of inspiration for more futuristic uses of the concept. In John Brunner's The Shockwave Rider (1975), personae are used primarily by the State as an instrument of repression, but also secondarily by the few individuals capable of turning features of the net against the ruling clique, as a means of liberation. In "cyberpunk" literature, people adopt preprogrammed personae in a manner analogical to their usage of psychotropic drugs (see, in particular, Gibson's Neuromancer [1984], and the collection of short stories edited by Sterling [1986]). In Bear's Eon (1985), digital personae have become so comprehensive that they are routinely detached from individuals: Disembodied "partials" are created to perform specific tasks on their owners' behalf, and "ghosts" of biologically dead people are rejuvenated from the city databank. As with all imaginative fiction, plugging into the net, partials, and ghosts should not be understood as predictions, but as investigations of extreme cases of contemporary ideas, as speculations of what might be, and as inspiration for more practicable, restricted applications. As virtual reality graduates from the laboratory, the digital persona idea will doubtless be embodied in some of its applications. To provide a deeper appreciation of the power of the digital persona, the remaining sections of the paper investigate its application to one specific area: the monitoring of people.

Dataveillance

Data surveillance, usefully abbreviated to dataveillance, is the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons. In the past, the monitoring of people's behavior involved physical means, such as guards atop observation towers adjacent to prison yards. In recent times, various forms of enhancement of physical surveillance have become available, such as telescopes, cameras, telephoto lenses, audio recording, and directional microphones. In addition, electronic surveillance has emerged in its own right, in such forms as telephone bugging devices and the interception and analysis of telex traffic. Dataveillance differs from physical and electronic surveillance in that it involves monitoring not of individuals and their actions, but of data about them. Two classes need to be distinguished:

• Personal dataveillance, in which a previously identified person is monitored, generally for a specific reason
• Mass dataveillance, which is of groups of people, generally to identify individuals of interest to the surveillance organization

Dataveillance is much cheaper than conventional physical and electronic surveillance. The expense involved in physical and even electronic monitoring of the populace acted as a constraint on the extent of use. This important natural control has been undermined by the application of information technology to the monitoring of data about large populations. The development is perceived by philosophers and sociologists as very threatening to freedom and democracy. The increasing information intensity of modern administrative practices has been well described by Rule and colleagues (Rule, 1974; Rule et al., 1980). Foucault (1975) used Bentham's concept of the "panopticon" to argue that a prison mentality is emerging in contemporary societies. Smith (1974 et seq.), Laudon (1986b), OTA (1986), and Flaherty (1989) deal with dataveillance generally. The role of information technology in dataveillance is discussed in detail in Clarke (1988). A political history of dataveillance measures in one country is in Clarke (1987, 1992a).


The Digital Persona and Dataveillance

To be useful for social control, data must be able to be related to a specific, locatable human being. Organizations that pursue relationships with individuals generally establish an identifier for each client, store it on a master file, and contrive to have it recorded on transactions with, or relating to, the client. The role of human identity and identification in record systems is little discussed, even in the information systems literature (see, however, Clarke, 1989).

The notion of the digital persona is valuable in understanding the process of dataveillance. The data that are monitored are implicitly assumed by the monitoring organization to provide a model of the individual that is accurate in all material respects. Organizations' data collections typically comprise basic data provided by clients at the time the relationship with the organization is established, supplemented by data arising as a byproduct of transactions between them. This gives rise to a digital persona that is far from complete, but generally adequate for the purposes implied by the relationship. Secondary uses not contemplated within that relationship result in greater risk of misinterpretation of the limited model inherent in the data, e.g., through misunderstanding of the varied meanings of such data items as marital status, number of dependents, and income. The following two sections discuss the role of the digital persona in two particular dataveillance techniques that involve secondary use of data and transcend organizational boundaries.

Computer Matching

Computer matching is a computer-supported process in which personal data records relating to many people are compared in order to identify cases of interest. Since it became economically feasible in the early 1970s, the technique has been progressively developed and is now widely used, particularly in government administration in the United States, Canada, Australia, and New Zealand. A description and analysis are in Clarke (1992b, pp. 24-41). See Figure 2 for an overview of the process.

[Figure 2. The computer matching process. Diagram omitted: data flow from source organization(s) to the matching organization, whose steps include record creation, amendment, and extension, and case analysis, decision, and action, with outputs to client organization(s).]

Computer matching brings together data that would otherwise be isolated. It has the capacity to assist in the detection of error, abuse, and fraud in large-scale systems (Clarke, 1992b, pp. 41-46). It may, in the process, jeopardize the information privacy of everyone whose data are involved and even significantly alter the balance of power between consumers and corporations (see Larsen, 1992; Gandy, 1993) and citizens and the State (see Rule, 1974; Laudon, 1986b). Of particular concern is the extent to which the digital persona that arises from the matching process may be a misleading image of the individual and his or her behavior. There are three means whereby a digital persona can be constructed from multiple sources:

• A common identifier
• Correlation among multiple identifiers
• Multiattributive matching

Virtually all computer matching undertaken by agencies of the U.S. Federal Government appears to be based on a common identifier, the Social Security number or SSN (Clarke, 1993, pp. 9-19). In Canada the Social Insurance number (SIN) plays a similar role. In European countries it has been the practice for many years for a single identifier to be used for a limited range of purposes, in particular, taxation, social security, health insurance, and national superannuation. In Australia, an originally single-purpose identifier (the Tax File number) has recently been appropriated to serve as a social security identifier as well (Clarke, 1992a). These codes are used because they are widely available and their integrity is regarded by the agencies concerned as being adequate. Some agencies, in order to address acknowledged quality problems, use additional data items to confirm matches.

There are alternatives to a government-assigned number. Physiologically based identifiers (sometimes referred to as "positive" identifiers) have the advantage of being more reliably relatable to the person concerned. Many forms have been proposed, including thumbprint, fingerprints, voiceprints, retinal prints, and earlobe capillary patterns. There is also the possibility of a nonnatural identifier being imposed on people, such as the brands and implanted chips already used on animals, the collars on pets, and the anklets on prisoners on day-release schemes and on institutionalized patients.

Where a single common identifier is not available, two or more organizations can establish cross-references between or among separate identifiers. This can be achieved by the supply by each individual of his or her identifier under one scheme to the operator of one or more other schemes. This may be mandated by law, or sanctioned under law (i.e., not prohibited) and required under contract, which, if applied consistently by all operators in an industry such as credit or insurance, is tantamount to mandating.

An alternative approach is to construct a matching algorithm based on pairs of similarly defined fields. Typically this involves names, date of birth, some component(s) of address, and any available identifiers (such as driver's license number). Data collection, validation, storage, and transmission practices are such that dependence on equality of content of such fields is impracticable (Laudon, 1986a). Instead, the data generally need to be reformatted and massaged ("scrubbed") and/or algorithms need to be devised to identify similarity. Considerable progress has been made in supporting technologies for multiattributive matching, including sophisticated algorithms, high-speed processors, storage, vector and array processing, associative processing (such as CAFS), and software development tools expressly designed to enable multiattributive matching (such as INDEPOL) (Clarke, 1992b, pp. 39-40).
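A minimal illustration of such multiattributive matching (an editorial sketch, not Clarke's algorithm; the field weights, scrubbing rule, threshold, and records are invented) scores similarity across scrubbed fields rather than demanding exact equality:

from difflib import SequenceMatcher

def scrub(s):
    """Crude "scrubbing": case-fold and strip punctuation and spaces."""
    return "".join(ch for ch in s.lower() if ch.isalnum())

def similarity(a, b):
    """Similarity of two scrubbed strings, in [0, 1]."""
    return SequenceMatcher(None, scrub(a), scrub(b)).ratio()

def match_score(rec_a, rec_b):
    """Weighted combination of per-field similarities; exact equality
    is not required, reflecting dirty multi-source data."""
    return (0.5 * similarity(rec_a["name"], rec_b["name"])
            + 0.3 * (1.0 if rec_a["dob"] == rec_b["dob"] else 0.0)
            + 0.2 * similarity(rec_a["address"], rec_b["address"]))

tax = {"name": "Smith, Janet A.", "dob": "1960-01-31", "address": "12 High St"}
welfare = {"name": "J. A. SMITH", "dob": "1960-01-31", "address": "12 High Street"}

# Declare a match above an (invented) threshold. A false match at
# this step is how a pseudo-persona, a composite of several people,
# comes into being.
print(match_score(tax, welfare) > 0.7)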

Profiling Profiling is another dataveillance technique that is attracting increasing usage. A set of characteristics of a particular class of person is inferred from past experience, and data holdings are then searched for digital personae with a close fit to that set of characteristics. The steps in the profiling process can be abstracted as follows: • • • • • •

Describe the class of person sought. Use existing experience to define a profile of that class of person. Express the profile formally. Acquire data concerning a relevant population. Search the data for digital personae whose characteristics comply with the profile. Take action in relation to those individuals.
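Expressed in code, these abstracted steps might look like the following sketch (entirely illustrative: the profile criteria, field names, and records are invented):

# Steps 2-3: a formally expressed profile. Each field maps to a
# predicate a candidate's digital persona must satisfy; the criteria
# here are invented solely for illustration.
PROFILE = {
    "age": lambda v: 18 <= v <= 25,
    "cash_deposits_per_month": lambda v: v >= 10,
    "declared_income": lambda v: v < 20_000,
}

def fits(persona):
    """Step 5: does this digital persona comply with the profile?"""
    return all(test(persona[key]) for key, test in PROFILE.items())

# Step 4: data concerning a relevant population (invented records).
population = [
    {"id": 1, "age": 22, "cash_deposits_per_month": 14, "declared_income": 12_000},
    {"id": 2, "age": 40, "cash_deposits_per_month": 1, "declared_income": 55_000},
]

# Step 6 is "take action"; here we merely list the flagged personae.
# Note that an atypical but innocent person can fit a profile.
print([p["id"] for p in population if fits(p)])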

Profiling is used by government agencies to construct models of classes of persons in whom they are interested, such as terrorists, drug couriers, people likely to commit crimes of violence, tax evaders, social welfare frauds, adolescents likely to commit suicide, and children with special gifts. It is also used by corporations, particularly to identify consumers likely to be susceptible to offers of goods or services, but also to identify staff members and job applicants relevant to vacant positions (Clarke, 1993).


Flaws in the Dataveillance of the Digital Persona

There are substantial weaknesses in the digital persona used in computer matching and profiling. These arise in respect of the identification basis used and the data and processing characteristics.

As regards human identification schemes, the deficiencies of universal schemes such as the SSN as a basis for digital identification have been well documented (FACFI, 1976; Clarke, 1989; Hibbert, 1992). All such identifiers depend on a seed document, most commonly a birth certificate. Such documents generally have no direct association with the persons to whose birth they attest. They are therefore capable of appropriation by multiple people as a nominal basis for identities. Other documentary evidence of identity (driver's license, passport, credit card, club membership, etc.) derives from the primary document and from one another. Such a pattern of interlocking documents is therefore a highly insecure means of identifying people, especially where they have incentives to avoid identification or to otherwise mislead. Yet this is the dominant means of identification used in administrative systems, and people act as though it were reliable, as attested to by the prevalence of the term "proof of identity" to refer to documents. As a result of the unreliability of identification, it is inevitable that for some proportion of the data subjects, matching creates pseudo-personae, which purport to relate to a specific individual, but may be a composite of several.

In relation to data and process quality, Neumann (1976) has catalogued manifold problems. An analysis in Clarke (1992b, pp. 52-61) distinguishes a number of different areas in which problems arise:

• Data sources
• Data meaning
• Data quality
• Data sensitivity and privileges
• Matching quality
• Context
• Oppressive use of the results

The composite digital persona that computer matching produces comprises data from different sources, gathered and maintained for different purposes. The definitions of such apparently common data items as marital status, number of dependents, and income vary a great deal among the various government agencies and private sector corporations that use them. In addition to definitional problems, the care taken to assure quality of data reflects the circumstances of use. This applies to the precision of the data, its completeness, and the extent to which it is amended to reflect change. Substantial efforts are necessary to control quality and ensure comparability of multisourced items. In the absence of adequate controls, the composite image is a melange, and drawing inferences from the pooled data is fraught with risks. Who bears the risks depends on the circumstances. In some cases, the agency or corporation makes many mistakes and expends significantly greater resources than it anticipated. In others, the individual suffers through delays, confusion, and denial of services.

Profiling may be based on data arising from matching procedures, in which case it is subject to the same risks. Alternatively, it may be undertaken on the basis of the holdings of a single personal data system, in which case a greater degree of confidence in the data quality may be justifiable. Even here, however, risks arise. The model against which the data are compared is derived from the experience of specialists and reflects their biases. It also involves trade-offs between different factors that may be understood by the designers but not by the users. The result is that atypical, idiosyncratic, and eccentric people, and extenuating circumstances, tend not to be provided for. Inferences are readily drawn that, on careful review, may prove quite unreasonable. The first risk is that organizations may invest far too much effort in cases that provide them with no benefits; the second and much more serious risk is that individuals may be subjected to unreasonable treatment and face significant difficulties in even understanding what is happening to them, let alone coping with the difficulties.

More Sophisticated Applications of the Digital Persona to Dataveillance

Beyond computer matching and profiling, there are many further ways in which the digital persona can be applied. In a networked world, individuals' behavior can be monitored and their attitudes inferred on the basis of what they do on the net. The data sources they access, the individuals they communicate with, and the contents of their messages will be of considerable interest to many organizations, ranging from law enforcement agencies to consumer marketing companies. Some individuals are seeking ways to confound the attempts to monitor them, through, for example, message encryption and the adoption of multiple identities. Law enforcement agencies, meanwhile, are seeking to preclude the deployment of encryption tools other than those that they can crack, and will doubtless seek legal limitations on the freedom of individuals to express themselves through multiple network personae. When that fails, it is reasonable to expect that they will invest considerable sums in technology to establish and maintain mappings of personae onto persons. In the normal manner of things, the mere exercise of the freedom may be treated as sufficient cause for the person to be subjected to a higher than normal level of surveillance. Another plausible approach would be to match multiple personae in order to generate suspicions of error, misdemeanor, or crime.

There is evidence that real-time monitoring of electronic transactions is already being undertaken. For example, in 1989 an extortionist in the United Kingdom was arrested at an automated teller machine (ATM) while withdrawing money from an account into which the proceeds of his crime had been paid. He had made a succession of withdrawals from ATMs throughout the country, and this was the first occasion on which he had used that particular machine. The most likely means whereby the arrest could have been effected was through monitoring of all transactions at ATMs for the use of a particular card, with immediate communication to police on the beat (a sketch of such watch-list monitoring appears at the end of this section). As with any privacy-intrusive technique, the first use is unarguably in the public interest. Regrettably, there must be considerable doubt about whether it will be subsequently constrained to such universally acceptable uses.

Profiles or templates of sought-after classes of people can be assembled, and databases of persona behavior compared with them in order to identify potential terrorists, drug runners, hackers, and suicide-prone adolescents. Until now, this has been done occasionally, on the basis of existing data holdings. The scope now exists for it to be done in real time, with not only suspicious individual transactions being sought, but also with the accumulation of transactions being monitored for suspicious combinations.

In the marketing arena, the promotion of goods, services, and ideas could be automated. Active agents operating on behalf of organizations could contain profiles, monitor transactions, recognize opportunities, and initiate contact. Law enforcement agencies could extend beyond mere dataveillance to automated actions, such as preprogrammed warnings to individuals who are approaching the boundaries of the law or even directly implemented punishment, such as the levying of fines directly against the infringer's bank account or the suspension of network privileges. There may be some circumstances in which such uses might be considered inappropriate, and many in which controls would be essential, but none of them would appear to be repugnant in their own right, and all of them seem likely to be put to use in some circumstances.

There are many further potential applications that are less salubrious. For example, the increased visibility of people's habits and movements creates opportunities for thieves seeking to enter premises when they are unattended, and for extortionists, kidnappers, and assassins to be in the right place at the right time to perform their deeds with a minimum of risk to themselves. False data can be infiltrated into the network with the intent of rendering a digital persona misleading. This might be done by individuals seeking to escape detection or to be interpreted as falling into some particular category. It might alternatively be done by some other person or organization seeking to assist or to harm the individual concerned.
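Returning to the ATM example above: real-time watch-list monitoring of a transaction stream can be sketched as follows (a hypothetical illustration; the event format and card numbers are invented):

# Hypothetical real-time monitor in the spirit of the ATM example:
# scan a stream of transaction events for cards on a watch list and
# raise an alert immediately. Fields and numbers are invented.
WATCH_LIST = {"4929-0000-1111-2222"}

def monitor(events):
    """Yield an alert for every transaction by a watched card."""
    for event in events:
        if event["card"] in WATCH_LIST:
            yield f"ALERT: card {event['card']} used at ATM {event['atm']}"

stream = [
    {"card": "4532-9999-8888-7777", "atm": "Leeds 04"},
    {"card": "4929-0000-1111-2222", "atm": "London 17"},
]
for alert in monitor(stream):
    print(alert)  # in 1989 this would have been a radio call to police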

Implications

Broader social impacts of dataveillance are identified and discussed in Clarke (1988), as follows (see also Rule, 1974; Laudon, 1986b; Flaherty, 1989):

Personal dataveillance
• Low data quality decisions
• Lack of subject knowledge of, and consent to, data flows
• Blacklisting
• Denial of redemption

Mass dataveillance
Dangers to the individual
• Arbitrariness
• Acontextual data merger
• Complexity and incomprehensibility of data
• Witch hunts
• Ex-ante discrimination and guilt prediction
• Selective advertising
• Inversion of the onus of proof
• Covert operations
• Unknown accusations and accusers
• Denial of due process

Dangers to society
• Prevailing climate of suspicion
• Adversarial relationships
• Focus of law enforcement on easily detectable and provable offenses
• Inequitable application of the law
• Decreased respect for the law and law enforcers
• Reduction in the meaningfulness of individual actions
• Reduction in self-reliance and self-determination
• Stultification of originality
• Increased tendency to opt out of the official level of society
• Weakening of society's moral fiber and cohesion
• Destabilization of the strategic balance of power
• Repressive potential for a totalitarian government


Clearly, many of these concerns are diffuse. On the other hand, there is a critical economic difference between conventional forms of surveillance and dataveillance. Physical surveillance is expensive because it requires the application of considerable resources. With a few exceptions (such as Romania, East Germany under the Stasi, and China during its more extreme phases), this expense has been sufficient to restrict the use of surveillance. Admittedly, the selection criteria used by surveillance agencies have not always accorded with what the citizenry might have preferred, but at least the extent was limited. The effect was that in most countries the abuses affected particular individuals who had attracted the attention of the State, but were not so pervasive that artistic and political freedoms were widely constrained.

Dataveillance changes all that. It is relatively inexpensive and getting cheaper all the time thanks to progress in information technology. When the economic limitations are overcome, the digital persona can be monitored with thoroughness and frequency and surveillance can be extended to whole populations. To date, particular populations have attracted the bulk of the attention, because the State already possessed substantial data holdings about them, viz., social welfare recipients and employees of the State. Now that the techniques have been refined, they are being pressed into more general usage in the private as well as the public sector.

The primary focus of government matching programs has been "evildoers." This is not intended in a sarcastic or cynical sense. The media releases do indeed play on the heartstrings, but the fact is that publicly known matching programs have been mostly aimed at classes of individuals who are abusing a government program and thereby denying more needy individuals the benefits of a limited pool of resources. Nonetheless, these programs have a chilling effect on the population they monitor. Moreover, they have educated many employees in techniques that are capable of much more general application.

Conclusions

In basing its analysis on models and their incompletenesses, this paper has adopted a "materialist" ontological perspective and a "critical realism" standpoint. Alternative philosophical standpoints might lead to a rather different analysis, but this paper has argued that widespread networking is bringing with it many new developments. The ability for individuals to project one or more models of themselves outward, and for individuals and organizations to impose digital personae upon others, has the potential to create valuable new opportunities and to impinge upon established and important values. Application and articulation of the idea should enable the descriptive and explanatory power of models of network behavior to be enhanced. After the coming period of turbulence, they should play a role in improving the predictive power of those models. The digital persona raises questions about the appropriateness of various legal notions. Internetworking exacerbates the already apparent inadequacies of privacy, data protection, and intellectual property laws. The tort of appropriation provides qualified protection to Elvis Presley's heirs and Madonna against profit-making based on those public personae; but in the networked world it may be less able to protect people from having the actions and utterances of others attributed to them. Dataveillance is an inevitable outcome of the data intensity of contemporary administrative practices. The physical persona is progressively being replaced by the digital persona as the basis for social control by governments and for consumer marketing by corporations.

Even from the strictly social control and business efficiency perspectives, substantial flaws exist in this approach. In addition, major risks to individuals and society arise. If information technology continues unfettered, then use of the digital persona will inevitably result in impacts on individuals that are inequitable and oppressive, and in impacts on society that are repressive. European, North American, and Australasian legal systems have been highly permissive of the development of inequitable, oppressive, and repressive information technologies. Focused research is needed to assess the extent to which regulation will be sufficient to prevent and/or cope with these threats. If the risks are manageable, then effective lobbying of legislatures will be necessary to ensure that appropriate regulatory measures and mechanisms are imposed. If the risks are not manageable, then information technologists will be left contemplating a genie and an empty bottle.

References

Bear, G. 1985. Eon. London: Victor Gollancz.
Brunner, J. 1975. The Shockwave Rider. New York: Methuen.
Clarke, R. A. 1987. Just another piece of plastic for your wallet: the Australia card. Prometheus 5(1):29-45; republished in Computers & Society 18(1) (January 1988):7-21, with an Addendum in Computers & Society 18(3) (July 1988):10-13.
Clarke, R. A. 1988. Information technology and dataveillance. Commun. ACM 31(5):498-512.
Clarke, R. A. 1989. Human Identification and Record Systems. Working Paper, Dept. of Commerce, Australian National University.
Clarke, R. A. 1992a. The resistible rise of the Australian national personal data system. Software L. J. 5(1):29-59.
Clarke, R. A. 1992b. Computer Matching by Government Agencies: A Normative Regulatory Framework. Working Paper, Dept. of Commerce, Australian National University.
Clarke, R. A. 1993. Profiling: a hidden challenge to the regulation of data surveillance. J. L. & Inf. Sc. 4(2):405-419.
Clarke, R. A. 1994. Electronic support for the practice of research. The Information Society 10(1):25-42.
FACFI. 1976. The Criminal Use of False Identification. Washington, DC: Federal Advisory Committee on False Identification.
Flaherty, D. H. 1989. Protecting Privacy in Surveillance Societies. Chapel Hill, NC: University of North Carolina Press.
Foucault, M. 1975. Discipline and Punish: The Birth of the Prison. Originally published in French; transl. Allen Lane, London.
Gandy, O. H. 1993. The Panoptic Sort. Critical Studies in Communication and in the Cultural Industries. Boulder, CO: Westview.
Gibson, W. 1984. Neuromancer. London: Grafton/Collins.
Hibbert, C. 1992. What To Do When They Ask You for Your SSN. Computer Professionals for Social Responsibility, various electronic versions.
Hill, W. C., and J. D. Hollan. 1994. History-enriched digital objects: prototypes and policy issues. The Information Society 10(2):139-145.
Larsen, E. 1992. The Naked Consumer: How Our Private Lives Become Public Commodities. New York: Holt.
Laudon, K. C. 1986a. Data quality and due process in large interorganisational record systems. Commun. ACM 29(1):4-11.
Laudon, K. C. 1986b. Dossier Society: Value Choices in the Design of National Information Systems. New York: Columbia UP.
Loeb, S., and D. Terry, eds. 1992. Information filtering. Special section of Commun. ACM 35(12):26-81.
Neumann, P. 1976-. RISKS Forum. Software Engineering Notes, since 1(1) (1976), and in the Risks Forum on UseNet.


OTA. 1986. Federal Government Information Technology: Electronic Record Systems and Individual Privacy, OTA-CIT-296. Washington, DC: U.S. Govt Printing Office.
Rule, J. B. 1974. Private Lives and Public Surveillance: Social Control in the Computer Age. New York: Schocken.
Rule, J. B., D. McAdam, L. Stearns, and D. Uglow. 1980. The Politics of Privacy. New York: New American Library.
Smith, R. E. 1974-. Privacy Journal; monthly since November 1974.
Sterling, B. 1986. Mirrorshades: The Cyberpunk Anthology. New York: Ace.

[3]

Privacy, Visibility, Transparency, and Exposure

Julie E. Cohen†

INTRODUCTION

This essay considers the relationship between privacy and visibility in the networked information age. Visibility is an important determinant of harm to privacy, but a persistent tendency to conceptualize privacy harms and expectations in terms of visibility has created two problems. First, focusing on visibility diminishes the salience and obscures the operation of nonvisual mechanisms designed to render individual identity, behavior, and preferences transparent to third parties. The metaphoric mapping to visibility suggests that surveillance is simply passive observation rather than the active production of categories, narratives, and norms. Part I explores this problem and identifies some of the reasons that US privacy jurisprudence has been particularly susceptible to it. Second, even a broader conception of privacy harms as a function of informational transparency is incomplete. Privacy has a spatial dimension as well as an informational dimension. The spatial dimension of the privacy interest, which I characterize as an interest in avoiding or selectively limiting exposure, concerns the structure of experienced space. It is not negated by the fact that people in public spaces expect to be visible to others present in those spaces, and it encompasses both the arrangement of physical spaces and the design of networked communications technologies. US privacy law and theory currently do not recognize this interest at all. Part II argues that they should. Part III argues that the spatial dimension of the privacy interest extends to online conduct and considers some implications of that view for current debates about expectations of privacy online. Part IV offers some preliminary thoughts on how the privacy interest against exposure might affect thinking about privacy self-defense.

† Professor of Law, Georgetown University Law Center. Thanks to Susan Cohen, Oscar Gandy, Ian Kerr, David Phillips, Neil Richards, Rebecca Tushnet, participants in the Unblinking Workshop at UC Berkeley, and participants in The University of Chicago Law School's Surveillance Symposium for their comments on an earlier version of this paper, to Kirstie Ball for sharing her work in progress on exposure as an organizing concept for surveillance, and to Amanda Kane and Christopher Klimmek for research assistance.


I. VISIBILITY AND TRANSPARENCY

Within US legal culture, debates about privacy traditionally have reflected a relatively great concern with visibility and visual privacy issues. Over the last decade, the principal contribution of what has been dubbed the "information privacy law project"1 has been to refocus both scholarly and popular attention on the other ways in which contemporary practices of surveillance operate to render individuals and their behaviors accessible in the networked information age. Yet the information privacy law project remains more closely tied to visibility than this description would suggest; its principal concern has been with data trails made visible to others. And to the extent that the information privacy law project conceptualizes privacy interests as interests against informational accessibility, its grasp of the workings and effects of surveillance is incomplete. Surveillance is only partly about the gathering and dissemination of fixed, preexisting information about identified individuals. Designations like "at risk," "no-fly," "soccer moms," "business elite," and "shotguns and pickups" are not preexisting facts. Surveillance also depends importantly on other, information-creating activities that lie outside the frame of visibility altogether.

An implicit linkage between privacy and visibility is deeply embedded in privacy doctrine. Within the common law of privacy, harms to visual privacy and harms to information privacy are subject to different requirements of proof. Of the four privacy torts, two are primarily visual and two primarily informational. The visual torts, intrusion upon seclusion and unauthorized appropriation of name or likeness, require only a showing that the conduct (the intrusion or appropriation) violated generally accepted standards for appropriate behavior.2 The informational torts, unauthorized publication and false light, are far more stringently limited (to "embarrassing" private facts and to falsity).3 To make out a more general claim to information privacy, some have tried to characterize collections of personally identified data visually, likening them to "portraits" or "images," but courts have resisted the conflation of facts with faces.4 The body of constitutional privacy doctrine that defines unlawful "searches" regulates tools that enable law enforcement to "see" activities as they are taking place inside the home more strictly than tools for discovering information about those activities after they have occurred.5

Within the academic literature on privacy, efforts to develop an account of privacy interests in personal information have confronted great skepticism, for reasons that seem closely linked to conventions about visibility. Information privacy skeptics have argued that the information conveyed by most individual items of personal data is too banal to trigger privacy interests. They have asserted, further, that privacy interests cannot attach to information voluntarily made "visible" as part of an otherwise consensual transaction. Under the influence of the information privacy law project, privacy discourse has changed. Many new legal and philosophical theories of privacy are organized explicitly around problems of information privacy and "privacy in public." Some scholars assert a "constitutive" relationship between flows of personal information and self-development.6 Helen Nissenbaum argues that the collection and aggregation of personal information disrupts expectations of "contextual integrity" by allowing presence, appearance, and behavior in different contexts to be juxtaposed.7 Drawing upon pragmatist philosophy and phenomenology, Daniel Solove argues that "digital dossiers" threaten a varied but related set of interests that are grounded in the logic of everyday experience.8 These theories suggest that the persistent theme of visibility in privacy discourse is a distraction from the more fundamental problem of informational accessibility. Although the theories differ from each other in important respects, an implicit premise of all of them is that databases and personal profiles can communicate as much or more than images. To the extent that privacy is conceived as encompassing a more general interest against accessibility, the adage that "a picture is worth a thousand words" requires rethinking. Visibility is an important determinant of accessibility, but threats to privacy from visual surveillance become most acute when visual surveillance and databased surveillance are integrated, enabling both real-time identification of visual surveillance subjects and subsequent searches of stored visual and databased surveillance records. And, for the most part, informational accessibility does not result from a conscious decision to target particular individuals. Rather, accessibility is embedded in the design of social and technical institutions.9

Even as information privacy theorists have sought to shift the focus of the discussion about privacy interests, however, the terms of both academic and public debate continue to return inexorably to visibility, and more particularly to an understanding of surveillance as direct visual observation by centralized authority figures. Within popular privacy discourse, this metaphoric mapping tends to be organized around the anthropomorphic figure of Big Brother. Academic privacy theorists have tended to favor the motif of the Panopticon, a model prison proposed by Jeremy Bentham that consists of cells concentrically arranged around a central guard tower, from which the prison authority might see but not be seen.10 Historically and also etymologically, the Panopticon suggests that direct visual observation by a centralized authority is both the most effective means and the best exemplar of surveillance for social control. It is not particularly surprising that the paradigm cases of privacy invasion should be conceptualized in terms of sight. The cultural importance of visibility extends well beyond privacy law, and well beyond law more generally. Within Western culture, vision is linked metaphorically with both knowledge and power. The eye has served throughout history as a symbol of both secular and religious authority. The Judeo-Christian God is described as all-seeing,11 and worldly leaders as exercising "oversight" or "supervision." Cartesian philosophy of mind posits that objects and ideas exist in the "unclouded" mind, where truth is revealed by the "light of reason."12 In the language of

1 Neil M. Richards, The Information Privacy Law Project, 94 Georgetown L J 1087 (2006). I should note that I am one of the scholars identified with this project.
2 See W. Page Keeton, et al, Prosser and Keeton on the Law of Torts § 117 at 851-56 (West 5th ed 1984).
3 See id § 117 at 856-66.
4 See, for example, US News & World Report, Inc v Avrahami, 1996 WL 1065557, *6-7 (Va Cir Ct); Dwyer v American Express Co, 652 NE2d 1351, 1355-56 (Ill App Ct 1995); Castro v NYT Television, 851 A2d 88, 98 (NJ Super Ct 2004).
5 See, for example, Kyllo v United States, 533 US 27, 29 (2001); California v Greenwood, 486 US 35, 40-41 (1988). See also R v Tessling, [2004] 3 SCR 432, 434 (Can). Kyllo was thought to be a hard case precisely because it seemed to lie on the boundary between the two categories.
6 See, for example, Julie E. Cohen, Examined Lives: Informational Privacy and the Subject as Object, 52 Stan L Rev 1373, 1424-25 (2000); Paul M. Schwartz, Internet Privacy and the State, 32 Conn L Rev 815, 856-57 (2000). See also Luciano Floridi, The Ontological Interpretation of Informational Privacy, 7 Ethics & Info Tech 185, 194-99 (2005).
7 See generally Helen Nissenbaum, Privacy as Contextual Integrity, 79 Wash L Rev 119 (2004). See also generally Jeffrey Rosen, The Unwanted Gaze: The Destruction of Privacy in America (Random House 2000).
8 See generally Daniel J. Solove, Conceptualizing Privacy, 90 Cal L Rev 1087 (2002).

9 See, for example, Daniel J. Solove, The Digital Person: Technology and Privacy in the Information Age 97-101 (NYU 2004); Richards, 94 Georgetown L J at 1095-1102 (cited in note 1); Schwartz, 32 Conn L Rev at 1-131 (cited in note 6), citing Joel R. Reidenberg, Lex Informatica: The Formulation of Information Policy Rules through Technology, 76 Tex L Rev 553, 556 (1998).
10 See, for example, Sonia K. Katyal, The New Surveillance, 54 Case W Res L Rev 297, 317-19 (2003); Rosen, The Unwanted Gaze at 213-14 (cited in note 7); Schwartz, 32 Conn L Rev at 852-53 (cited in note 6). See also Michel Foucault, Discipline and Punish: The Birth of the Prison 195-209 (Vintage 1977) (Alan Sheridan, trans) (describing the Panopticon).
11 The Hebrew Bible refers to God in a number of ways, including El-Roi, or "God who sees." See, for example, Genesis 16:13. The all-seeing God figures prominently in the religious iconography of the Renaissance, and the linkages between vision, power, and knowledge continue in the subsequent secular iconography of the Enlightenment. See Astrit Schmidt-Burkhardt, The All-Seer: God's Eye as Proto-surveillance, in Thomas Y. Levin, Ursula Frohne, and Peter Weibel, eds, Ctrl [Space]: Rhetorics of Surveillance from Bentham to Big Brother 17, 18-26 (MIT 2002).
12 See René Descartes, Rules for the Direction of the Mind, in 31 Great Books of the Western World 4 (Encyclopædia Britannica).



Part II Surveillance, Security and Anonymity


[6]
HACKING THE PANOPTICON: DISTRIBUTED ONLINE SURVEILLANCE AND RESISTANCE

Benoît Dupont

ABSTRACT

Surveillance studies scholars have embraced Foucault's panopticon as a central metaphor in their analysis of online monitoring technologies, despite several architectural incompatibilities between eighteenth- and nineteenth-century prisons and twenty-first-century computer networks. I outline a number of Internet features that highlight the limits of the electronic panopticon. I examine two trends that have been considerably underestimated by surveillance scholars: (1) the democratization of surveillance, where the distributed structure of the Internet and the availability of observation technologies has blurred the distinction between those who watch and those who are being watched, allowing individuals or marginalized groups to deploy sophisticated surveillance technologies against the state or large corporations; and (2) the resistance strategies that Internet users are adopting to curb the surveillance of their online activities, through blocking moves such as the use of cryptography, or masking moves that are designed to feed meaningless data to monitoring tools. I conclude that these two trends are neglected by a majority of surveillance scholars because of biases that make them dismiss the initiative displayed by ordinary users, assess positive and negative outcomes differently, and confuse what is possible and what is probable.

The panopticon concept occupies a pivotal position in the field of surveillance studies. Michel Foucault's (1977) analysis of Bentham's total surveillance architecture has become a ubiquitous reference in the literature (Haggerty, 2006; Lyon, 2006), despite Foucault's deliberate lack of interest in the emerging technologies of his time (Haggerty & Ericson, 2000). A few years later, Thomas Mathiesen (1997) highlighted the limits of relying exclusively on the panopticon metaphor in a "viewer society" where television lets the many see what the few are up to. Although these two major contributions still partly resonate with the current state of surveillance and continue to provide useful theoretical insights, I will argue in this chapter that their hegemonic influence (Haggerty, 2006) is becoming counterproductive to the understanding of two trends related to surveillance in the online environment. The first trend can be defined (for lack of a better term) as the "democratization of surveillance", where cheap surveillance software and hardware is marketed to individual customers so that they can monitor the activities of their family, coworkers, neighbours, and even their favourite celebrity or their most despised politician. The second trend concerns resistance to surveillance, where efforts are deployed by the subjects of surveillance to understand, reveal, mock, evade, and neutralize surveillance technologies through the collaborative power of socio-technical networks. Because of their incompatibility with the dominant panoptic and synoptic conceptual frameworks, these two trends have been underestimated and sometimes even ignored by surveillance scholars.

These two facets of contemporary surveillance will be examined in a very specific context: the omnipresent network of computers, servers, software, and services that make up the Internet. The Internet is now routinely used to exchange information of personal and public interest, to conduct financial transactions, to acquire goods and services of all kinds, and to spend time (or waste it, depending on the perspective) by playing online games, downloading music and movies, and managing social networks of friends and acquaintances. Its architecture is decentralized and distributed, making it at the same time very exposed and very resilient to failures and malfeasance. Its invention is recent, and when Discipline and Punish was first published in French in 1975, ARPANET (the ancestor of the Internet) was still in its infancy (Mowery & Simcoe, 2002). At first sight, the Internet seems to embody the worst fears of a panoptic world: total surveillance can be achieved at very low cost, making all exchanges traceable and significantly altering the notion of privacy (Lessig, 2006). As the Internet penetrates every aspect of our lives and the boundaries between the physical world and the virtual world become irremediably blurred, we should be quite worried by these flows of digitized information that are used to create "data doubles" whose slightest alterations are constantly scrutinized (Haggerty & Ericson, 2000, p. 611). If one tool could manage to leverage the power of knowledge to govern the behaviour of a population, the Internet should figure among the top contenders (Graham & Wood, 2003).

However, no matter how great the dystopian potential of the Internet is, it seems that it has not yet delivered its disciplinary promise. To be entirely fair, it has not liberated people from autocratic regimes either, as some of its most naïve promoters initially believed. One of the reasons for this lies in the "openness" paradox: while the technical protocols that underpin the Internet are public and standardized, therefore making surveillance relatively easy to carry out, the very same openness empowers application writers (programmers), who are free to design and distribute new tools of surveillance and resistance. For these reasons, the Internet seems like the perfect case study to assess the contemporary relevance of the panoptic and synoptic conceptual frameworks.

I do not contest the existence and growth of pervasive surveillance programmes run by governments that seek to unmask terrorist suspects before they strike or political opponents who criticize the abuses of authoritarian regimes. Nor do I want to minimize the impact of similar efforts by corporations that want to profile their customers better in order to increase their profit margins (Gandy, 1993; O'Harrow, 2005) or ensure the compliance of their employees (Associated Press, 2007). Recent developments in the United States - where the executive branch has authorized massive antiterrorist datamining initiatives despite their dubious constitutional legality (Eggen, 2007) - and elsewhere would make such a position untenable because of its complete disconnection from reality. However, a simple transfer of the panoptic model, so eloquently delineated by Foucault and refined by Mathiesen, does not provide a more accurate description of the reality of contemporary Internet surveillance. In the following sections, I will first explain why the panoptic and synoptic approaches provide an incomplete set of conceptual tools to analyze the proliferation of surveillance capacities in the online world, before examining how these capacities have become available to a broad range of social actors and are also increasingly resisted with a certain degree of success by a growing body of activists and ordinary users.

Finally, in the conclusion, I offer a non-exhaustive list of biases that have, in my opinion, prevented a significant number of surveillance scholars from integrating the trends mentioned above into their existing work.

THE PANOPTICON: AN EXHAUSTED SURVEILLANCE METAPHOR?

Although this question might seem unnecessarily provocative, I would like to show in this section the perils of extending eighteenth-century thinking, no matter how innovative it was at the time, to twenty-first-century technologies. Foucault's work assumes a certain linearity in the development and refinement of surveillance techniques, "from a schemata of exceptional discipline to one of a generalized surveillance, [which] rests on a historical transformation: the gradual extension of the mechanisms of discipline throughout the seventeenth and eighteenth centuries" (Foucault, 1977, p. 209), ending in the formation of "the disciplinary society". This unrelenting expansion of the disciplines does not consider the possibility of disruptive technologies that would redefine how people watch each other and resist various efforts to monitor their activities.

Panoptic Features

Foucault's analysis of Bentham's panoptic prison emphasizes a number of features. The first innovation consists in the physical ordering of the cells in a ring, in the middle of which a focal point - the observation tower - affords a perfect view of all the inmates. Such a "hub-and-spoke" architecture allows a single warden to watch a large number of cells and creates a new economy of surveillance. The asymmetrical power relation created by this circular architecture is reinforced by the lighting arrangements that induce total and permanent visibility for the inmates, while the guardians are shielded behind blinds that make them invisible to the surveillance subjects. A third feature consists in the partition between cells. The solitude it creates seeks to make the inmate "a subject of information, never a subject in communication" (Foucault, 1977, p. 200), to remove the opportunities for coordination that could lead to a "collective effect". The expected result is a more effective institution, where the concentration of power facilitates the observation, classification, comparison, and ultimately, management of subjects.


Beyond an erudite description of Bentham's model, Foucault's main argument resides in the idea that the panopticon "must be understood as a generalizable model of functioning; a way of defining power relations in terms of the everyday life of men" (Foucault, 1977, p. 205). It is an ideal-type, "the diagram of a mechanism of power reduced to its ideal form; its functioning, abstracted from any obstacle, resistance or friction, must be represented as a pure architectural and optical system: it is in fact a figure of political technology that may and must be detached from any specific use" (Foucault, 1977, p. 205, my emphasis). Hospitals, military units, schools, or workshops were other places where Foucault identified panoptic mechanisms at work, in a trend that he predicted would result in the emergence of a disciplinary society. This total theory of surveillance and discipline proved very appealing and was embraced by a number of scholars, who extended its application to public spaces (where CCTV systems have become ubiquitous), to the workplace, and to the Internet, just to name a few. While their interpretations of panopticism vary greatly (Lyon, 2006; Simon, 2005), they all implicitly subscribe to the idea of a power asymmetry between a small group of elite supervisors exercising a monopoly on surveillance tools, and a large mass of unsuspecting or passive individuals whose interests seem to rarely transcend their obsession with consumption (Bauman, 2000).

This hierarchical model of surveillance was famously challenged by Thomas Mathiesen, who introduced the concept of synopticism in his article on the "viewer society" (Mathiesen, 1997). Mathiesen reminds Foucault's readers that a significant piece of the contemporary surveillance puzzle is missing from the master's account:

We have seen the development of a unique and enormously extensive system enabling the many to see and contemplate the few, so that the tendency for the few to see and supervise the many is contextualized by a highly significant counterpart. I am thinking, of course, of the development of the total system of the modern mass media. (Mathiesen, 1997, p. 219)

However, far from disagreeing with Foucault's conclusions, Mathiesen insists on the reciprocal functions of the panopticon and the synopticon, which are to control and discipline the "soul", ending his article on a very pessimistic note. Although he calls for political resistance as a moral imperative, his prognosis is very gloomy, and the Internet is merely seen as another medium reproducing a familiar pattern of domination and oppression through surveillance and preformatted choices. What is striking in this very severe judgement, which also resonates in many panoptic studies that extend Foucault's reasoning to computer technologies (Poster, 1990; Sewell & Wilkinson, 1992; Gandy, 1993), is that it transposes the rock-and-mortar architecture of the prison to the structure of the Internet, built on wires and bits. A more careful examination of the Internet's structural features should, however, introduce a dose of relativism and open up new avenues of enquiry for the study of contemporary surveillance practices. In that respect, Yochai Benkler's book on "the wealth of networks" (2006) offers one of the most detailed accounts of the Internet's structural and institutional features, as well as a consideration of their impact on political and cultural freedoms.

The Internet as an Anti-Panopticon

Where the panopticon and synopticon adopt the "one-way, hub-and-spoke structure, with unidirectional links to its ends" (the periphery in the case of the former, the centre for the latter), the Internet is built as a decentralized and "distributed architecture with multidirectional connections among all nodes in the networked information environment" (Benkler, 2006, p. 212). This distribution of ties allows members of the network (machines and individuals) to access and communicate with other members through a large number of simultaneously available paths that very rarely transit through a single central node. Centrality is by design excluded from the architecture of the Internet, precisely to avoid the vulnerability that the failure of a central node would create. In this model of information management, it is much harder for a central authority to control the flow of data than in a panoptic environment, while at the same time it becomes much easier for a myriad of actors to observe and monitor their peers, since the distribution of ties also creates a hyper-connectivity conducive to the multilateralization of surveillance. So, while the panoptic and synoptic models placed the emphasis on "the fact that the disciplines use procedures of partitioning and verticality, that they introduce, between the different elements at the same level, as solid separations as possible, that they define compact hierarchical networks, in short, that they oppose to the intrinsic, adverse force of multiplicity the technique of the continuous, individualizing pyramid" (Foucault, 1977, p. 220), the Internet functions under entirely different premises. It connects people and lets them form horizontal networks - largely independent from governments - that moderate the distribution of power instead of reinforcing its concentration (Lessig, 2006, p. 274).
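The architectural contrast is easy to make concrete. The sketch below is my own toy illustration (the five-node topologies and failure scenario are invented, not drawn from Benkler or this chapter): it counts how many pairs of nodes can still reach one another after the best-connected node fails, first in a panoptic hub-and-spoke network, then in a small distributed mesh.

```python
# Contrast the panoptic "hub-and-spoke" topology with a distributed mesh:
# count node pairs that can still communicate after the central node fails.
from itertools import combinations

def reachable_pairs(adjacency):
    """Count unordered node pairs connected by some path (graph search per pair)."""
    count = 0
    for a, b in combinations(adjacency, 2):
        seen, frontier = {a}, [a]
        while frontier:
            node = frontier.pop()
            for nxt in adjacency[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        if b in seen:
            count += 1
    return count

def remove_node(adjacency, dead):
    """Return a copy of the graph with one node (and its links) removed."""
    return {n: [m for m in links if m != dead]
            for n, links in adjacency.items() if n != dead}

# Hub-and-spoke: every cell is linked only to the watchtower "hub".
star = {"hub": ["c1", "c2", "c3", "c4"],
        "c1": ["hub"], "c2": ["hub"], "c3": ["hub"], "c4": ["hub"]}

# Distributed: the same number of peripheral nodes in a ring with one chord,
# so several paths exist between most pairs and no node is indispensable.
mesh = {"n0": ["n1", "n2", "n4"], "n1": ["n0", "n2"], "n2": ["n0", "n1", "n3"],
        "n3": ["n2", "n4"], "n4": ["n0", "n3"]}

print(reachable_pairs(remove_node(star, "hub")))  # 0 - losing the hub isolates every cell
print(reachable_pairs(remove_node(mesh, "n0")))   # 6 - all remaining pairs still connected
```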


This is not to say that the Internet is devoid of architectures of control: governments and businesses around the world spend considerable amounts of money to design surveillance systems able to tell them who is doing what, with whom, and from where on the Internet (Lessig, 2006, p. 38). But these technologies are not exclusive to a restricted group of supervisors. They are becoming increasingly accessible to individual users and fulfill a number of functions that range from the noble to the mundane, and the disciplinary to the playful. They must also contend with a number of resistance technologies and behaviours that thrive in the Internet environment because of its very un-panoptic architecture.

THE DEMOCRATIZATION OF SURVEILLANCE

The term democratization refers to the broadening accessibility of online surveillance through a plurality of tools and services that could previously only be afforded by governments and large companies. This trend reverberates both in the private and public spheres, and corresponds to a wide range of rationalities sustained by business-oriented ventures, non-governmental organizations (NGOs), and social units such as families and groups of friends. Low barriers of entry to the world of online surveillance are responsible for this democratization. Contrary to other mass media such as television or newspapers, the marginal costs for the distribution of information on the Internet are very low, because expensive proprietary infrastructure such as satellites, fibre-optic cables, printing presses, and delivery routes are not required (Benkler, 2006). All providers of Internet services share the same infrastructure and the same data transfer protocols, also known as TCP/IP (Lessig, 2006, pp. 143-146). Therefore, large investments in capital assets are not required to start disseminating information, as millions of bloggers have found out. Most of the costs incurred by new service providers are associated with the collection and sorting of data, or the development of new methods to collect and sort data more effectively or more efficiently. For example, the success of the very popular Google search engine can be attributed to the superior quality of its ranking algorithm, making the results it displays at the top of its page more relevant than those of its competitors. Once data or information has been processed, it can be distributed or accessed on a large scale at little or no additional cost. This combination of openness and cheap means of distribution constitutes a powerful incentive to innovations fuelled by entrepreneurs and social activists alike.
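The ranking algorithm alluded to here is generally identified with PageRank, the link-analysis method published by Google's founders; the chapter itself does not name it, so that identification and the four-page link graph below are my own illustrative assumptions. A minimal power-iteration sketch:

```python
# Minimal PageRank power iteration - an illustrative sketch of link-based
# ranking, not Google's production code. The four-page web graph is invented.
damping = 0.85

links = {            # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}        # start from a uniform distribution

for _ in range(50):                                # iterate to (near) convergence
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = damping * rank[page] / len(outgoing)
        for target in outgoing:
            new_rank[target] += share              # each page passes rank along its links
    rank = new_rank

# Pages with many incoming links from highly ranked pages float to the top.
print(sorted(rank.items(), key=lambda kv: -kv[1]))
```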

These innovations can be categorized in two groups. The first group merges off-line observation technologies with online dissemination tools, while the second group is entirely made up of online technologies that are used to collect and distribute data. Among the observation technologies mobilized by the first group, we find digital photography and video recording, remote sensing, geographical information systems, human input, and social engineering. The following examples will provide a better idea of the democratization processes at work.

Online Diffusion of Content Collected by Off-Line Observation

YouTube1 is probably the best-known video-sharing website, with an estimated monthly audience of 20 million people and 100 million video downloads per day. The company, whose slogan is "broadcast yourself", adds more than 65,000 videos every day to its library. Users of the site directly post these short segments with very limited interference from YouTube employees, whose number does not exceed 30 people (Reuters, 2006). Thousands of contributors find there a platform to share content produced by the explosion of video-capable consumer devices such as camcorders, computer webcams, or mobile phones. Although YouTube and other less successful video-sharing websites are primarily promoting the entertainment aspect of their services, many videos uploaded on their servers have a distinctive surveillance flavour: shopkeepers or homeowners routinely make surveillance tapes of burglars breaking into their property available in the hope that it will increase their chances of being arrested (Rodriguez, 2007); grainy videos capturing police brutality incidents or blatant instances of corruption are uploaded at regular intervals;2 and politicians uttering racial slurs or contradicting themselves shamelessly in semi-private functions are also bound to find their duplicity exposed to an audience of millions within hours, with very limited opportunities for damage control.3 The miniaturization of video recording devices and the ubiquity of Internet access points, even in conflict zones, also allow anyone with a connected computer to remotely experience the ferocity and confusion of close-quarter combat: Iraqi insurgents and US troops alike profusely post uncensored videos of their deadly encounters, providing far bleaker pictures of the conflict than the sanitized versions offered by the main television networks. YouTube and its edgier competitors LiveLeak and Dailymotion return thousands of results for search terms such as "Iraq war", "insurgency", "sniper", or "IED" (improvised explosive device).


At the other end of the spectrum, macro-observation technologies such as remote sensing and geographical information systems applied to the Internet information economy can foil the efforts deployed by governments and large corporations to conceal some of their most questionable activities. Google and Microsoft offer through their Google Earth and Virtual Earth services high-resolution geocoded satellite pictures of the planet that can be used for surveillance purposes, despite the fact that the data provided is usually a few weeks to three years old.4 These very popular tools are free to use, and Google claims that more than 100 million people have downloaded the software needed to access its imagery (Meyer, 2006). The primary use of these tools involves the first-hand observation of what past official maps deliberately omitted (Monmonier, 1991), hidden behind high walls, or too remote to be accessed by any other means. The Cryptome website offers, for example, a series of detailed "eyeball" pictures5 that expose sensitive infrastructures such as military bases, intelligence agencies' headquarters, and politicians' and company executives' residences, in an effort to dispel the myths surrounding these secretive places. Anyone with a connection to the Internet can comb the millions of satellite pictures available online in order to satisfy their idiosyncratic curiosity. Some people use this capacity to track the latest nuclear submarine launched by the Chinese navy,6 while others are just as happy having a peek at the houses of the rich and famous7 or the places they will visit during their next vacation. NGOs are also enlisting Google Earth to call attention to civil wars and humanitarian disasters such as Darfur. Amnesty International has launched a campaign called "eyes on Darfur" that uses satellite imagery to present the extent of violence committed in this inhospitable part of the world and let Internet users "monitor [12] high risk villages [to] protect them from further attack" in what the NGO describes as the "global neighbourhood watch".8 The United States Holocaust Memorial Museum offers a similar experience on its website, but on a much larger scale. It plans to use these satellite pictures to build an online "global crisis map" of emerging genocides or crimes against humanity, which would allow activists, journalists, and citizens to access and share information more quickly.9 At the illegal end of the spectrum, some terrorists have even embraced these surveillance tools to identify possible targets and their vulnerabilities (Harding, 2007), an approach explicitly acknowledged by Google on its website when it describes how homeland security agencies can leverage the power of Google Earth to conduct "critical infrastructure vulnerability assessment" and "pattern visualization of surveillance data" for $400 a year.10


A more significant outcome of these online technologies derives from the capacity to combine satellite pictures and maps with other types of digital data provided by sensors such as mobile phones, or generated by users themselves. These new applications are known as "mashups" and are made possible by open and easy-to-use programming formats and tools (Eisenberg, 2007) that fuse layers of information into a single file, adding value to the original pool of diverse data. Some businesses incorporate mashups into the affordable surveillance tools they market, such as mobile phone companies that offer handsets equipped with global positioning systems and let their customers (usually parents) track online the movements of the person carrying the phone (usually a child) (Pogue, 2006). Beyond the rise of Big Mother and Big Father, mashups also assist citizens in their efforts to gain a more detailed awareness of their immediate environment. While interactive crime maps that let online users create personalized outputs based on criteria such as type of crime, zip code, location, or even transport route11 are popular in the United States, Europeans seem more interested in monitoring the location of speed and red-light cameras. The SCDB website12 claims to maintain a database of 18,000 cameras scattered all over Europe, whose coordinates are updated by road users (Big Driver?).
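A mashup of this kind is, at bottom, a join between two layers of geocoded data. The sketch below is a hypothetical miniature of the SCDB idea; the camera coordinates, route waypoints, and 500-metre alert radius are all invented for illustration.

```python
# Illustrative mashup: fuse a crowd-sourced layer of speed-camera coordinates
# with a driver's planned route and flag any camera near that route.
from math import radians, sin, cos, asin, sqrt

def km_between(a, b):
    """Great-circle distance in km between two (lat, lon) points (haversine)."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

# Layer 1: user-maintained camera database (invented coordinates).
cameras = [("fixed", (48.8584, 2.2945)), ("red-light", (48.8606, 2.3376))]

# Layer 2: waypoints of the route the driver plans to take.
route = [(48.8566, 2.3522), (48.8600, 2.3400), (48.8738, 2.2950)]

for kind, cam in cameras:
    if any(km_between(cam, wp) < 0.5 for wp in route):  # within 500 m of the route
        print(f"warning: {kind} camera near route at {cam}")
```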

Online Surveillance of Online Activities

In the previous examples, the Internet was used as a mediator by millions of connected supervisors who access dispersed real-world data, then modify, aggregate, and disseminate it for their own benefit, for altruistic motives, or in some instances for criminal gain. The same process applies to the surveillance of online activities, which cannot structurally be monopolized by governments or large corporations. As the underlying rationale is fairly similar, I will only use three examples (two lawful, the last one criminal) to show how this works.

The first example demonstrates how travellers who book their trips online can harness the power of self-surveillance to extract cheaper airfares and hotel room rates from companies that have developed predatory pricing systems based on consumers' surveillance. This practice is known in the tourism industry and in other sectors that deal in perishable items as "yield pricing" or "yield management" (Desiraju & Shugan, 1999) and involves the dynamic allocation of discounts so that revenues are maximized for each flight or room sold (Borenstein & Rose, 1994, p. 655).

The complexity of this pricing system can only be managed by computers that constantly adjust prices to encourage purchases when sales are going slowly and maximize profits when the demand is strong, sometimes resulting in airfares that vary from one minute to another. Obviously, it creates a form of discrimination between consumers, who pay fares that vary substantially for the same service, since they do not have access to the same data and tools on which to base their decisions. The Internet resolved this informational asymmetry by creating a forecasting market that monitors the highs and lows of airfares or hotel rates. Services such as Farecast13 or Kayak14 use datamining techniques to comb the wild fluctuations of thousands of airfares over long periods of time and advise customers on the best purchasing strategy (wait or buy). Although they are applied to a fairly mundane activity, these tools should be understood as highly disruptive by nature. They bring meta-surveillance capacities to individuals, who can deploy their own sophisticated technologies to uncover the routine surveillance to which they are submitted by large corporations.
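The advisory logic can be illustrated with a toy rule. Real services like Farecast mined millions of observed fares with far more sophisticated models; the sketch below simply compares today's fare to a trailing average, and both the rule and the price series are invented.

```python
# Toy "wait or buy" advisor in the Farecast spirit: buy if today's fare sits
# below the recent average for the route, wait if it sits above.
from statistics import mean

def advise(history, window=14):
    """Return 'buy' or 'wait' by comparing the latest fare to a trailing mean."""
    today = history[-1]
    baseline = mean(history[-window:])
    return "buy" if today <= baseline else "wait"

# Invented daily fares (in dollars) for one route over roughly three weeks.
fares = [320, 318, 305, 310, 340, 355, 342, 330, 333, 329,
         341, 350, 362, 358, 349, 355, 360, 372, 365, 298]

print(advise(fares))  # 'buy' - the last observed fare is well below the trailing mean
```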

The second example also illustrates how the democratization of surveillance can be used to expose the online activities of powerful interests. Whether it represents an improvement or not, the online collaborative encyclopedia Wikipedia15 has become in a matter of years a source of reference material for millions of Internet users, who also contribute to its entries. Content accuracy is a major issue (Giles, 2005), especially for controversial issues where conflicting interpretations of an event or someone's actions can lead to defamatory or plainly dishonest comments (Kolbitsch & Maurer, 2006). Government agencies that seek to defend their record on contested policy decisions or want to obscure their mistakes are tempted, in that context of openness, to edit entries that refer to them. Large corporations and NGOs might also use Wikipedia as a public relations tool to downplay their responsibility in embarrassing scandals or inflate their contributions to society. Unfortunately for them, the same surveillance tools that are used to undermine privacy and authenticate the identity of every Internet user can also be used to identify (to a certain extent) who has made changes on any given Wikipedia entry. This capacity has always been available to computer-savvy users through what is known as an IP tracer or IP locator. The term IP stands for Internet Protocol and refers to the addressing system that allows data to be sent to the right machine on the network. IP addresses are unique identifiers, and although they are not allocated on a geographical basis, it is still fairly easy to locate a user based on publicly available IP address tables (Lessig, 2006, p. 59). Hence, talented programmers can develop an IP mapping application that integrates seamlessly with another web application. Virgil Griffith, the designer of WikiScanner,16 is one of those talented programmers. His online search engine lets users find out which organizations are the most active Wikipedia editors. Thousands of changes made by people working for government agencies such as the US Department of Homeland Security, the Pentagon, or the CIA; companies such as Wal-Mart or Exxon; NGOs such as the American Civil Liberties Union (ACLU) or the Electronic Frontier Foundation; or even religious entities such as the Vatican or the Church of Scientology are retrievable. While some of them are the result of bored employees taking a break to update a page that relates to their personal interests (in itself a form of resistance), many others are linked directly to attempts by these organizations to anonymously shape their image. The openness that characterizes the Internet's architecture renders these clandestine efforts much easier to detect, provided sufficient incentives exist for someone to provide monitoring tools and for users to take advantage of them.
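The mechanism WikiScanner exploited is straightforward to sketch: match the IP address recorded for an anonymous edit against publicly registered organizational address ranges. The following is my own toy reconstruction of that idea, not Griffith's code; the organizations are hypothetical and the addresses use reserved documentation ranges.

```python
# A minimal sketch of IP-based attribution of anonymous edits. Given the IP
# recorded for an edit, look it up in a table of registered address ranges.
import ipaddress

org_ranges = {
    "Example Federal Agency": ipaddress.ip_network("192.0.2.0/24"),
    "Example Corporation": ipaddress.ip_network("198.51.100.0/24"),
}

anonymous_edits = ["192.0.2.57", "203.0.113.9"]   # IPs logged for anonymous edits

for raw_ip in anonymous_edits:
    ip = ipaddress.ip_address(raw_ip)
    owner = next((org for org, net in org_ranges.items() if ip in net), "unknown")
    print(raw_ip, "->", owner)
# 192.0.2.57 -> Example Federal Agency
# 203.0.113.9 -> unknown
```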

Social and technical factors, such as the plurality of functions associated with the monitoring of others' online activities, regulatory frameworks, new business models, the computer skills of Internet users, and the open or faulty code of communication protocols, all play an important role in the adoption of online surveillance technologies. Unfortunately, we have barely begun to examine these variables empirically, even though they also influence the numerous resistance strategies employed by those who want to defend their privacy from the omnipresent surveillance of the state, their family and friends, or computer hackers.

RESISTANCE TO ONLINE SURVEILLANCE

In line with Foucault's lack of interest in resistance as a counteracting force to the oppressive panoptic gaze, many modern surveillance scholars have dismissed the possibility of collective neutralization and sabotage efforts, or have been ambivalent about them at best (Gandy, 1993, p. 147; Campbell & Carlson, 2002, p. 603), despite clear signs that such efforts are not isolated occurrences (Bain & Taylor, 2000; Timmons, 2003; Lyon, 2004; Poster, 2005; Bogard, 2006, p. 101). Acts of resistance in surveillance studies are often presented as individual and localized efforts (Haggerty & Ericson, 2006, p. 18) that produce partial and temporary victories (Gilliam, 2006, p. 115) and merely reinforce the effectiveness of surveillance through an escalation process. There are, however, many ways for the subjects of surveillance to reclaim their privacy and autonomy, as Gary Marx (2003) so compellingly demonstrated. Although the eleven resistance strategies he describes in his article apply more or less to online surveillance, two of them, blocking moves and masking moves, will be considered in greater detail, and from a collective rather than an individual perspective.

Cryptography as a Blocking Move

Blocking moves refer to the process that seeks "to physically block access to the communication" (Marx, 2003, p. 379). Blocking moves are inconceivable in the panoptic world, since partitions prevent subjects from contacting each other, whereas on the Internet, where messages transit through multiple paths, they become an essential tool to ensure the safety of communications.

Cryptography is perhaps one of the oldest blocking moves. It can be defined as:

A transformation of a message that makes the message incomprehensible to anyone who is not in possession of secret information that is needed to restore the message to its normal plaintext or cleartext form. The secret information is called the key, and its function is very similar to the function of a door key in a lock: it unlocks the message so that the recipient can read it. (Diffie & Landau, 1998, p. 13)
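To illustrate the key metaphor in running code, here is a minimal sketch using the Fernet symmetric scheme from the third-party Python cryptography package. The message is an invented example, and the snippet stands in for no particular tool discussed in this chapter.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # the secret 'door key' shared by sender and recipient
cipher = Fernet(key)

token = cipher.encrypt(b"meet at noon")  # ciphertext is incomprehensible without the key
assert cipher.decrypt(token) == b"meet at noon"  # the key 'unlocks' the message

# An eavesdropper who intercepts `token` without `key` sees only an opaque blob.
```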

Cryptography has a long history that dates back to the invention of writing, and it played an instrumental role in several military conflicts (Singh, 1999; Pincock, 2006). Yet its impact on Internet surveillance is rarely considered, despite the fact that the need to safeguard online financial transactions makes it one of the most widely used online privacy tools. Whereas encryption procedures were mainly used by spies and diplomats before the advent of the Internet, the computing power available in each PC today is sufficient to produce scrambled messages that would foil the most determined code breakers. Since Philip Zimmermann made his Pretty Good Privacy (PGP) encryption software available on the Internet in 1991 and won his legal battle with the US Department of Justice, even those who are neither mathematicians nor programmers can enjoy the benefits of unbreakable encryption and defeat the most sophisticated surveillance technologies (Diffie & Landau, 1998). For example, terrorist organizations, pedophiles, and computer hackers have been known to use off-the-shelf or homemade encryption tools to conceal their unlawful activities (Denning & Baugh, 2000). Encryption is sometimes used by human rights organizations that want to protect their correspondents in authoritarian regimes. Although most popular e-mail programs, such as Outlook or Thunderbird, can send and receive encrypted emails, very few people actually use this facility. An Internet user survey conducted by Garfinkel, Margrave, Schiller, Nordlander, and Miller (2005) shows that 68% of people in their sample (N = 417) were either unaware that encryption was available on their e-mail client or did not know what cryptography was. Hence, despite the fact that cryptography is widely available at virtually no charge to Internet users, resistance to online surveillance is informed by factors other than purely technical considerations. A study of political activists opposing US administration policies in the post-9/11 environment shows that users balance the need for secrecy with a reluctance to fall into what they perceive as a paranoid or abnormal state of mind (Gaw, Felten, & Fernandez-Kelly, 2006).

Systematic resistance that applies indiscriminately to mundane and highly sensitive content is experienced as a mental burden denoting an unbalanced personality, while selective resistance is described by one respondent as similar to healthy eating and exercise: people know it is the right thing to do, but they do not always do it themselves (p. 594). What these informed users tell us is that they resort to blocking moves with parsimony, maintaining a much more complex rapport to resistance than surveillance scholars initially assumed.

Distributed Masking Moves

Masking moves that allow users to surf the web anonymously are more widespread than blocking moves. One reason that might explain this difference is that the former take full advantage of the distributed architecture of the Internet by establishing virtual networks of trust (Tilly, 2005). These resistance networks thwart surveillance attempts by randomly routing the information their members want to send or receive through other members of the network, thereby making it impossible for supervisors to know who is effectively communicating with whom and about what. TOR (The Onion Router), Freenet, and Psiphon19 are examples of popular masking tools that are freely available for download and use on the Internet. Freenet's homepage claims that its software has been downloaded more than two million times, and TOR's user base is said to reach hundreds of thousands, mainly from the United States, Europe, and China (Zetter, 2007). Although these programs differ slightly at the technical level, their overall approach is similar. Once people have installed them on their computer, a portion of their hard drive is automatically encrypted, and secure connections are established with other computers running the same software when the user logs on to the Internet. All communications transit seamlessly through other nodes of the trust network before they are allowed into the more open and easily monitored part of the Internet. Attributing a particular online behaviour to a specific machine, and hence to its owner or operator, becomes a fruitless endeavour, since complex algorithms are used to blur the patterns of data that enter and exit the trust network.
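The layered routing just described can be illustrated with a toy sketch: the sender wraps a message in one layer of encryption per relay, and each relay peels off exactly one layer. This is a deliberately simplified model (real onion routers such as TOR negotiate per-hop keys with public-key cryptography and use fixed-size cells); the three-relay circuit and the Fernet scheme below are illustrative assumptions only.

```python
from cryptography.fernet import Fernet

# Hypothetical three-relay circuit; in practice each relay holds only its own key.
relay_keys = [Fernet.generate_key() for _ in range(3)]

def wrap(message: bytes) -> bytes:
    """Sender encrypts in reverse order, so relay 0 can peel the outermost layer."""
    for key in reversed(relay_keys):
        message = Fernet(key).encrypt(message)
    return message

def traverse(onion: bytes) -> bytes:
    """Each relay strips one layer; only the final hop recovers the plaintext."""
    for key in relay_keys:
        onion = Fernet(key).decrypt(onion)
    return onion

assert traverse(wrap(b"request")) == b"request"
```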

What makes this type of trust network different from the more traditional ones described by Tilly (2005) is that it is scalable and does not require its members to share the same objectives. It is scalable in the sense that the more members these masking tools can enlist, the more effective they become, whereas traditional trust networks expose themselves to failure and malfeasance when their membership grows too large and difficult to manage. The second feature of these virtual trust networks is that credentials are allocated on a technological basis (the willingness to encrypt and relay encrypted communications with no control over the contents being transmitted) more than on ethno-religious traits or shared social or political interests, making strange bedfellows in the process. Even though they are primarily intended for privacy and anti-censorship activists, diplomatic missions, intelligence agencies, and armed forces, including those of authoritarian regimes such as Iran, also make intensive use of these free masking tools (Zetter, 2007), a good indicator of the trust these surveillance organizations place in them to protect their sensitive information against their counterparts. Less drastic masking moves involve the manipulation by consumers of registration and search data in order to minimize the generation of profiles based on viewing patterns and datamatching techniques. The free online service BugMeNot20 (BMN) offers to bypass the registration process that is compulsory to enter many websites by providing its users access to a database of active accounts (usernames and passwords) obtained by submitting fake socio-demographic details. BMN also provides disposable e-mail addresses that can be used for twenty-four hours as an alternative to disclosing a real e-mail address to online merchants and data-brokers. Because the online interface allows users to directly submit new accounts and retrieve passwords from the database, there is a positive correlation between the number of users and the utility they derive from this service. As of September 2007, BMN provided accounts to more than 175,000 websites. Another interesting initiative is TrackMeNot21 (TMN), a little program written by two New York University professors.22 This application runs whenever the Firefox browser23 accesses Internet search engines such as Google, AOL, Yahoo, and MSN. These websites keep track of all the searches performed by individual users in order to return context- or location-relevant advertisements alongside search results (Barbaro & Zeller, 2006). TMN uses an obfuscation strategy to drown real search queries in a cloud of randomly generated queries, making profiling considerably more difficult and much less accurate, if not totally meaningless. The inventors of TMN actually acknowledge on their webpage that Gary Marx's article "A tack in the shoe" (2003) partly inspired their application.
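A rough sketch of the obfuscation idea, under stated assumptions: the decoy vocabulary below is invented for illustration (TrackMeNot itself seeds decoys from popular search terms and RSS feeds, and schedules them over time), so this shows the principle rather than the actual program.

```python
import random

# Invented decoy vocabulary; the real tool draws on feeds of popular queries
# so that decoys are statistically harder to tell apart from genuine searches.
DECOY_TERMS = ["weather forecast", "pasta recipes", "football scores",
               "movie times", "used cars", "train schedule", "news headlines"]

def obfuscated_stream(real_query, decoys=5):
    """Hide one real query at a random position among randomly chosen decoys."""
    queries = random.sample(DECOY_TERMS, decoys)
    queries.insert(random.randrange(decoys + 1), real_query)
    return queries  # an observer logging the stream cannot single out the intent

print(obfuscated_stream("support group anonymity"))
```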

CONCLUSION

The reified panoptic metaphor that dominates the field of surveillance studies appears increasingly detached from the complex reality of online monitoring (Boyne, 2000; Haggerty, 2006).

Through a detailed analysis of several diverse meta-surveillance and resistance technologies, I have attempted to expand the register of legitimate research questions on this issue. For example, how do some disruptive technologies concretely modify the underlying distribution of knowledge and power in the surveillant assemblage (Haggerty & Ericson, 2000)? How are expanding monitoring technologies appropriated by people and institutions for unexpected uses? What are the individual, social, political, economic, and technological factors that shape resistance or constrain the effectiveness of surveillance? Can resistance be integrated into the study of surveillance, or should it be treated as a separate subject? These questions challenge the panoptic framework, but they also have the potential to make it more relevant to twenty-first-century technological conditions. To be answered, they require a more grounded knowledge of the actual interactions between those who watch, the machines and infrastructure they design and use to carry out their surveillance, the people being watched, and the flows of data that are generated as a result. These connections involving humans, machines, and places are easier to map in high-technology environments, because they leave behind a profusion of traces or markers, but the mapping cannot be done without first abandoning the paranoid and megalomaniac tendencies the panopticon so often fuels (Latour, 2005). While compiling example upon example of distributed surveillance and widespread resistance, I could not help but wonder why so many surveillance scholars had carefully avoided this less travelled path. In an important contribution, Kevin Haggerty (2006) offers some interesting hypotheses to explain this reluctance, such as the critical thinking tradition of surveillance scholars, their simplified understanding of Foucault's integral intellectual legacy, a focus on human surveillance that neglects human/technological hybrids, and a methodological approach that overemphasizes discourse and document analysis to the detriment of more grounded empirical data. This last trait makes surveillance scholars overly dependent on the public transcripts that explain power relations between subjects and supervisors. Unfortunately, the official story is rarely the whole story, and hidden transcripts, which can be defined as "offstage speeches, gestures, and practices that confirm, contradict, or inflect what appears in the public transcripts" (Scott, 1990, p. 4), should also be studied. However, the critical posture or methodological choices made by surveillance scholars cannot entirely explain the lack of interest in the "arts of resistance" and their impact on the governance of surveillance. I offer an additional interpretation inspired by Gary Marx's (2007) techno-fallacies article and the heuristics theory of Tversky and Kahneman (1982).

Just as technophiles often succumb to the false belief that there is a technological fix for every security problem, surveillance scholars (as an epistemic community, not as individuals) are not immune to biases that lead them to assume that the monitoring technologies embedded in virtually every aspect of our lives are a clear indicator of our inexorable fall into a 1984 reality. Three biases are particularly salient in this belief system. The first is the initiative bias, which leads people to attribute less initiative and less imagination to others than to themselves (Kahneman & Tversky, 1993, p. 3), especially if those others belong to a lower socio-economic group. While surveillance scholars are able to offer elaborate narratives of the hidden power of the electronic panopticon and its significance, they frequently discount the interpretive capacities and agency of surveillance subjects and the resistance strategies that ensue. The loss aversion bias refers to the asymmetrical evaluation of positive and negative outcomes, whereby losses are systematically overestimated and gains underestimated. This bias seems particularly pronounced "when the reference point is the status quo, and when retention of the status quo is an option" (Kahneman & Tversky, 1993, p. 14). In surveillance studies, this bias corresponds to the reticence manifested toward the study of positive developments (Haggerty, 2006, p. 35), such as the accountability produced by meta-surveillance applications or the independence afforded to elderly patients by monitoring systems that let them stay at home. The tendency to predict widespread erosions of freedom has also been a prominent feature of surveillance studies, despite the lack of empirical and historical data to support this claim. Democracies have not crumbled since advanced monitoring technologies invaded our lives, and the lack of sophisticated surveillance tools has never prevented authoritarian states from enrolling thousands of informers to control internal dissent (Pfaff, 2001). Finally, the third heuristic is the probability bias, whereby what is possible is confused with what is probable (Ohm, 2007). This bias is very closely connected with the previous one, because on contentious subjects such as surveillance and privacy, people tend to focus on disastrous outcomes and neglect the role played by randomness (Taleb, 2004), complexity, and contested rationalities (Espeland, 1998) among supervisors. Surveillance scholars frequently present what may happen as what will happen, obscuring the mechanisms that so often derail the best plans. Perhaps the fact that Bentham's panopticon was never actually built, and that the British government preferred instead to deport its prisoners to Australia, an open-air prison where convict supervision was deliberately kept at a minimum (Kerr, 1989; Jackson, 1998), should serve as a reminder that dystopias are about as likely to materialize as utopias.

NOTES

1. http://www.youtube.com, accessed September 4, 2007.
2. See for example the string of videos showing Moroccan police officers receiving cash payments from truck drivers at http://www.youtube.com/watch?v=Afed8wvYwmc, accessed September 11, 2007.
3. Former US Republican senator George Allen (with presidential aspirations) lost his bid in the 2006 election after a video in which he called an aide to his opponent a 'macaca' was made available on YouTube at http://www.youtube.com/watch?v=r90zOPMnKwl, accessed September 11, 2007.
4. See the Google Earth help centre at http://earth.google.com/support/, accessed September 15, 2007.
5. http://eyeball-series.org/, accessed September 15, 2007.
6. http://www.fas.org/blog/ssp/2007/07/new_chinese_ballistic_missile.php, accessed September 16, 2007.
7. http://www.gearthhacks.com/dlcat25/Famous-Homes.htm, accessed September 16, 2007.
8. http://www.eyesondarfur.org/, accessed September 16, 2007.
9. http://www.ushmm.org/googleearth/projects/darfur/, accessed September 16, 2007.
10. http://earth.google.com/security.html, accessed September 16, 2007.
11. http://www.chicagocrime.org/; http://www.latimes.com/news/local/crime/homicidemap/; http://www.mapufacture.com/feeds/1000398-Oakland-Crime-Feed, all accessed September 16, 2007.
12. http://www.scdb.info/. It is one among others: see for example http://www.speedcameramap.co.uk/ and http://www.spod.cx/speedcameras.shtml for the United Kingdom, all accessed September 16, 2007.
13. http://www.farecast.com, accessed September 22, 2007.
14. http://www.kayak.com, accessed September 22, 2007.
15. http://www.wikipedia.org, accessed September 22, 2007.
16. http://wikiscanner.virgil.gr/, accessed September 22, 2007.
17. A practice where online advertisers are charged for clicks on banners that originate from computer software and not from legitimate users interested in their product.
18. These are known as DDoS, or distributed denial of service, attacks.
19. http://tor.eff.org/, http://freenetproject.org, and http://psiphon.civisec.org/, all accessed September 25, 2007.
20. http://www.bugmenot.com, accessed September 25, 2007.
21. http://mrl.nyu.edu/~dhowe/trackmenot/, accessed September 25, 2007.
22. Daniel C. Howe, from the Media Research Lab, and Helen Nissenbaum, from the Culture and Communication department.
23. Unfortunately, the program is not available for the most popular browser, Microsoft Internet Explorer.

REFERENCES

Abu Rajab, M., Zarfoss, J., Monrose, F., & Terzis, A. (2006). A multifaceted approach to understand the botnet phenomenon. In: P. Harford (Ed.), Proceedings of the 6th ACM SIGCOMM on Internet measurement (pp. 41-52). New York, NY: ACM Press.
Associated Press. (2007). New breed of 'compliance software' makes office computer monitoring more sophisticated. Technology Review, published August 20, 2007, retrieved August 21, 2007, from http://www.technologyreview.com/Wire/19271/page1/
Bain, P., & Taylor, P. (2000). Entrapped by the 'electronic panopticon'? Worker resistance in the call centre. New Technology, Work and Employment, 15(1), 2-18.
Barbaro, M., & Zeller, T. (2006). A face is exposed for AOL searcher no. 4417749. The New York Times, published August 9, 2006, retrieved September 25, 2007, from http://www.nytimes.com/2006/08/09/technology/09aol.html
Bauman, Z. (2000). Liquid modernity. Cambridge: Polity Press.
Benkler, Y. (2006). The wealth of networks. New Haven, CT: Yale University Press.
Bogard, W. (2006). Surveillance assemblages and lines of flight. In: D. Lyon (Ed.), Theorizing surveillance: The panopticon and beyond (pp. 97-122). Cullompton: Willan Publishing.
Borenstein, S., & Rose, N. (1994). Competition and price dispersion in the U.S. airline industry. The Journal of Political Economy, 102(4), 653-683.
Boyne, R. (2000). Post-panopticism. Economy and Society, 29(2), 285-307.
Campbell, E., & Carlson, M. (2002). Panopticon.com: Online surveillance and the commodification of privacy. Journal of Broadcasting & Electronic Media, 46(4), 586-606.
Denning, D. E., & Baugh, W. E. (2000). Hiding crimes in cyberspace. In: D. Thomas & B. D. Loader (Eds), Cybercrime: Law enforcement, security and surveillance in the information age (pp. 105-131). London: Routledge.
Desiraju, R., & Shugan, S. (1999). Strategic service pricing and yield management. Journal of Marketing, 63(1), 44-56.
Diffie, W., & Landau, S. (1998). Privacy on the line: The politics of wiretapping and encryption. Cambridge, MA: MIT Press.
Eggen, D. (2007). Lawsuits may illuminate methods of spy program. The Washington Post, August 14, p. A01.
Eisenberg, A. (2007). Do the mash (even if you don't know all the steps). The New York Times, September 2, p. 5.
Espeland, W. (1998). The struggle for water: Politics, rationality and identity in the American Southwest. Chicago, IL: The University of Chicago Press.
Foucault, M. (1977). Discipline & punish: The birth of the prison. New York, NY: Pantheon Books.
Gandy, O. H. (1993). The panoptic sort: A political economy of personal information. Boulder, CO: Westview Press.
Garfinkel, S., Margrave, D., Schiller, J., Nordlander, E., & Miller, R. (2005). How to make secure email easier to use. In: W. Kellogg & S. Zhai (Eds), Proceedings of the SIGCHI conference on human factors in computing systems (pp. 701-710). New York, NY: ACM Press.
Gaudin, S. (2007). Storm worm botnet more powerful than top supercomputers. Information Week, published September 6, 2007, retrieved September 23, 2007, from http://www.informationweek.com/story/showArticle.jhtml?articleID=201804528

Gaw, S., Felten, E., & Fernandez-Kelly, P. (2006). Secrecy, flagging and paranoia: Adoption criteria in encrypted e-mail. In: R. Grinter, T. Rodden, P. Aoki, E. Cutrell, R. Jeffries & G. Olson (Eds), Proceedings of the SIGCHI conference on human factors in computing systems (pp. 591-600). New York, NY: ACM Press.
Giles, J. (2005). Internet encyclopedias go head to head. Nature, 438(7070), 900-901.
Gilliam, J. (2006). Struggling with surveillance: Resistance, consciousness, and identity. In: K. D. Haggerty & R. V. Ericson (Eds), The new politics of surveillance and visibility (pp. 111-129). Toronto: University of Toronto Press.
Graham, S., & Wood, D. (2003). Digitizing surveillance: Categorization, space, inequality. Critical Social Policy, 23(2), 227-248.
Haggerty, K. D. (2006). Tear down the walls: On demolishing the panopticon. In: D. Lyon (Ed.), Theorizing surveillance: The panopticon and beyond (pp. 23-45). Cullompton: Willan Publishing.
Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. The British Journal of Sociology, 51(4), 605-622.
Haggerty, K. D., & Ericson, R. V. (2006). The new politics of surveillance and visibility. In: K. D. Haggerty & R. V. Ericson (Eds), The new politics of surveillance and visibility (pp. 3-25). Toronto: University of Toronto Press.
Harding, T. (2007). Terrorists 'use Google maps to hit UK troops'. The Telegraph, published January 13, 2007, retrieved September 16, 2007, from http://www.telegraph.co.uk/news/
Jackson, R. V. (1998). Jeremy Bentham and the New South Wales convicts. International Journal of Social Economics, 25(2/3/4), 370-379.
Kahneman, D., & Tversky, A. (1993). Conflict resolution: A cognitive perspective. Toronto: University of Toronto.
Kerr, J. (1989). Panopticon versus New South Wales. Fabrications: The Journal of the Society of Architectural Historians, Australia and New Zealand, 1, 4-32.
Kolbitsch, J., & Maurer, H. (2006). The transformation of the web: How emerging communities shape the information we consume. Journal of Universal Computer Science, 12(2), 186-213.
Latour, B. (2005). Reassembling the social: An introduction to actor-network-theory. Oxford: Oxford University Press.
Lessig, L. (2006). Code version 2.0. New York: Basic Books.
Lyon, D. (2004). Globalizing surveillance: Comparative and sociological perspectives. International Sociology, 19(2), 135-149.
Lyon, D. (2006). 9/11, synopticon and scopophilia: Watching and being watched. In: K. D. Haggerty & R. V. Ericson (Eds), The new politics of surveillance and visibility (pp. 35-54). Toronto: University of Toronto Press.
Marx, G. T. (2003). A tack in the shoe: Neutralizing and resisting the new surveillance. Journal of Social Issues, 59(2), 369-390.
Marx, G. T. (2007). Rocky Bottoms: Techno-fallacies of an age of information. International Political Sociology, 1(1), 83-110.
Mathiesen, T. (1997). The viewer society: Michel Foucault's 'Panopticon' revisited. Theoretical Criminology, 1(2), 215-234.
Meyer, D. (2006). Google, Microsoft vie for earth domination. CNET News.com, published September 12, 2006, retrieved September 15, 2007, from http://www.news.com/
Monmonier, M. (1991). How to lie with maps. Chicago, IL: The University of Chicago Press.

Mowery, D. C., & Simcoe, T. (2002). Is the Internet a US invention? An economic and technological history of computer networking. Research Policy, 31(8-9), 1369-1387.
O'Harrow, R. (2005). No place to hide. New York: Free Press.
Ohm, P. (2007). The myth of the superuser: Fear, risk, and harm online. University of Colorado Law Legal Studies Research Paper No. 07-14, retrieved September 28, 2007, from http://www.ssrn.com/abstract=967372
Pfaff, S. (2001). The limits of coercive surveillance: Social and penal control in the German Democratic Republic. Punishment & Society, 3(3).
Pincock, S. (2006). Codebreaker: The history of codes and ciphers, from the ancient pharaohs to quantum cryptography. New York, NY: Walker & Company.
Pogue, D. (2006). Cellphones that track the kids. The New York Times, December 21, 2006, C1.
Poster, M. (1990). The mode of information: Poststructuralism and social context. Chicago, IL: The University of Chicago Press.
Poster, M. (2005). Hardt and Negri's information empire: A critical response. Cultural Politics, 1(1), 101-118.
Reuters. (2006). YouTube serves up 100 million videos a day online. USA Today, published July 16, 2006, retrieved on September 3, 2007, from http://www.usatoday.com/tech/news/2006-07-16-youtube-views_x.htm
Rodriguez, G. (2007). YouTube vigilantes. Los Angeles Times, published August 6, 2007, retrieved September 11, 2007, from http://www.latimes.com/news/opinion/
Scott, J. C. (1990). Domination and the arts of resistance: Hidden transcripts. New Haven, CT: Yale University Press.
Sewell, G., & Wilkinson, B. (1992). 'Someone to watch over me': Surveillance, discipline and the just-in-time labour process. Sociology, 26(2), 271-286.
Simon, B. (2005). The return of panopticism: Supervision, subjection and the new surveillance. Surveillance and Society, 3(3), 1-20.
Singh, S. (1999). The code book: The science of secrecy from ancient Egypt to quantum cryptography. Toronto: Random House.
Taleb, N. N. (2004). Fooled by randomness: The hidden role of chance in life and the markets. New York, NY: Texere.
Tilly, C. (2005). Trust and rule. Cambridge: Cambridge University Press.
Timmons, S. (2003). A failed panopticon: Surveillance and nursing practices via new technology. New Technology, Work and Employment, 18(2), 143-153.
Tversky, A., & Kahneman, D. (1982). Judgement under uncertainty: Heuristics and biases. In: D. Kahneman, P. Slovic & A. Tversky (Eds), Judgement under uncertainty: Heuristics and biases (pp. 3-20). Cambridge: Cambridge University Press.
Zetter, K. (2007). Rogue nodes turn anonymizers into eavesdropper's paradise. Wired, published September 10, 2007, retrieved September 25, 2007, from http://www.wired.com/politics/security/news/2007/09/embassy_hacks

[7]
The surveillant assemblage

Kevin D. Haggerty and Richard V. Ericson

ABSTRACT

George Orwell's 'Big Brother' and Michel Foucault's 'panopticon' have dominated discussion of contemporary developments in surveillance. While such metaphors draw our attention to important attributes of surveillance, they also miss some recent dynamics in its operation. The work of Gilles Deleuze and Felix Guattari is used to analyse the convergence of once discrete surveillance systems. The resultant 'surveillant assemblage' operates by abstracting human bodies from their territorial settings, and separating them into a series of discrete flows. These flows are then reassembled in different locations as discrete and virtual 'data doubles'. The surveillant assemblage transforms the purposes of surveillance and the hierarchies of surveillance, as well as the institution of privacy.

KEYWORDS: Surveillance; assemblage; Deleuze; panopticon; social theory

INTRODUCTION

One of the most recognizable figures in cultural theory is the flâneur as analysed by Walter Benjamin (1983). A creature of nineteenth-century Paris, the flâneur absorbs himself in strolling through the metropolis where he is engaged in a form of urban detective work. Concealed in the invisibility of the crowd, he follows his fancies to investigate the streets and arcades, carving out meaning from the urban landscape. Possessing a 'sovereignty based in anonymity and observation' (Tester 1994: 5), the flâneur characterizes the urban environment and the experience of modernity. There has been an exponential multiplication of visibility on our city streets. Where the flâneur was involved in an individualistic scrutiny of the city's significations, the population itself is now increasingly transformed into signifiers for a multitude of organized surveillance systems.

Benjamin recognized the importance of even the earliest prototypes of such technologies, observing how the development of photography helped undermine the anonymity which was central to the flâneur by giving each face a single name and hence a single meaning (Benjamin 1983: 48). Surveillance has become a salient topic for theoretical reflection, and this interest coincides with the quantitative increase in surveillance in western societies. However, this paper does not propose to provide a comprehensive overview of these systems of observation. A number of other authors have documented developments in this rapidly changing area (Staples 1997; Bogard 1996; Dandeker 1990; Lyon 1994; Gandy 1993). Instead, we view surveillance as one of the main institutional components of late modernity (Giddens 1990). Our aim is to reconsider some of the more familiar theoretical preoccupations about this topic. We do so by drawing from the works of Gilles Deleuze and Felix Guattari to suggest that we are witnessing a convergence of what were once discrete surveillance systems, to the point that we can now speak of an emerging 'surveillant assemblage'. This assemblage operates by abstracting human bodies from their territorial settings and separating them into a series of discrete flows. These flows are then reassembled into distinct 'data doubles' which can be scrutinized and targeted for intervention. In the process, we are witnessing a rhizomatic leveling of the hierarchy of surveillance, such that groups which were previously exempt from routine surveillance are now increasingly being monitored.

THEORIZING SURVEILLANCE: ORWELL AND FOUCAULT

Writing well in advance of the contemporary intensification of surveillance technologies, Orwell (1949) presented a prescient vision. In his futuristic nation of Oceana, citizens are monitored in their homes by a telescreen, a device which both projects images and records behaviour in its field of vision. The 'thought police' co-ordinate this extensive monitoring effort, operating as agents of a centralized totalitarian state which uses surveillance primarily as a means to maintain social order and conformity. Not all citizens, however, are singled out for such scrutiny. The upper and middle classes are intensely monitored, while the vast majority of the population, the underclass 'proles', are simply left to their own devices. The fact that we continue to hear frequent cautions about '1984' or 'Big Brother' speaks to the continued salience of Orwell's cautionary tale. In the intervening decades, however, the abilities of surveillance technologies have surpassed even his dystopic vision. Writing at the cusp of the development of computing machines, he could not have envisioned the remarkable marriage of computers and optics which we see today. Furthermore, his emphasis on the state as the agent of surveillance now appears too restricted in a society where both state and non-state institutions are involved in massive efforts to monitor different populations.

Finally, Orwell's prediction that the 'proles' would largely be exempt from surveillance seems simply wrong in light of the extension and intensification of surveillance across all sectors of society. Michel Foucault's (1977) analysis of the panopticon provides the other dominant metaphor for understanding contemporary surveillance. In part, Foucault extends Orwell's fears, but his analysis also marks a significant departure, as it situates surveillance in the context of a distinctive theory of power. The panopticon was a prison design proposed by the eighteenth-century reformer Jeremy Bentham (1995). What distinguished this structure was an architecture designed to maximize the visibility of inmates, who were to be isolated in individual cells such that they were unaware from moment to moment whether they were being observed by guards in a central tower. More than a simple device for observation, the panopticon worked in conjunction with explicitly articulated behavioural norms, as established by the emerging social sciences, in efforts to transform the prisoner's relation to him or her self. This disciplinary aspect of panoptic observation involves a productive soul training which encourages inmates to reflect upon the minutiae of their own behaviour in subtle and ongoing efforts to transform themselves. Foucault proposed that the panopticon served as a diagram for a new model of power which extended beyond the prison to take hold in the other disciplinary institutions characteristic of this era, such as the factory, hospital, military, and school. Foucault's analysis improves on Orwell's by reminding us of the degree to which the proles have long been the subject of intense scrutiny. In fact, Foucault accentuates how it was precisely this population, which was seen to lack the self-discipline required by the emerging factory system, that was singled out for a disproportionate level of disciplinary surveillance. Foucault also encourages us to acknowledge the role surveillance can play beyond mere repression; how it can contribute to the productive development of modern selves. Unfortunately, Foucault fails to directly engage contemporary developments in surveillance technology, focusing instead on transformations to eighteenth- and nineteenth-century total institutions. This is a curious silence, as it is these technologies which give his analysis particular currency among contemporary commentators on surveillance. Even authors predisposed to embrace many of Foucault's insights believe that rapid technological developments, particularly the rise of computerized databases, require us to rethink the panoptic metaphor. For example, Mark Poster (1990: 93) believes that we must now speak of a 'superpanopticon', while Diana Gordon (1987) suggests the term 'electronic panopticon' better captures the nature of the contemporary situation. But even these authors are in line with a general tendency in the literature to offer more and more examples of total or creeping surveillance, while providing little that is theoretically novel. For our purposes, rather than trying to stretch Foucault's or Orwell's concepts beyond recognition so that they might better fit current developments, we draw from a different set of analytical tools to explore aspects of contemporary surveillance.


THE SURVEILLANT ASSEMBLAGE

The philosopher Gilles Deleuze only occasionally wrote directly on the topic of surveillance, usually in the context of his commentaries on Foucault's work (Deleuze 1986; 1992). In conjunction with his colleague Felix Guattari, however, he has provided us with a set of conceptual tools that allow us to re-think the operation of the emergent surveillance system, a system we call the 'surveillant assemblage'. While Deleuze and Guattari were prolific inventors of concepts, we embrace only a few of their ideas. Undoubtedly, this means that we are not fully representing their thought. However, our approach is entirely in keeping with their philosophy, which animates one to 'think otherwise': to approach theory not as something to genuflect before, but as a tool kit from which to draw selectively in light of the analytical task at hand (Deleuze and Foucault 1977: 208). Deleuze and Guattari introduce a radical notion of multiplicity into phenomena which we traditionally approach as being discretely bounded, structured and stable. 'Assemblages' consist of a 'multiplicity of heterogeneous objects, whose unity comes solely from the fact that these items function together, that they "work" together as a functional entity' (Patton 1994: 158). They comprise discrete flows of an essentially limitless range of other phenomena such as people, signs, chemicals, knowledge and institutions. To dig beneath the surface stability of any entity is to encounter a host of different phenomena and processes working in concert. The radical nature of this vision becomes more apparent when one realizes how any particular assemblage is itself composed of different discrete assemblages which are themselves multiple. Assemblages, for Deleuze and Guattari, are part of the state form. However, this notion of the state form should not be confused with those traditional apparatuses of governmental rule studied by political scientists. Instead, the state form is distinguished by virtue of its own characteristic set of operations: the tendency to create bounded physical and cognitive spaces, and to introduce processes designed to capture flows. The state seeks to 'striate the space over which it reigns' (Deleuze and Guattari 1987: 385), a process which involves introducing breaks and divisions into otherwise free-flowing phenomena. To do so requires the creation of both spaces of comparison, where flows can be rendered alike, and centres of appropriation, where these flows can be captured. Flows exist prior to any particular assemblage, and are fixed temporarily and spatially by the assemblage. In this distinction between flows and assemblages, Deleuze and Guattari also articulate a distinction between forces and power.

Forces consist of more primary and fluid phenomena, and it is from such phenomena that power derives as it captures and striates such flows. These processes coalesce into systems of domination when otherwise fluid and mobile states become fixed into more or less stable and asymmetrical arrangements which allow some to direct or govern the actions of others (Patton 1994: 161). It is desire which secures these flows and gives them their permanence as an assemblage. For psychoanalysts, desire is typically approached as a form of lack, as a yearning that we strive to satisfy. In contrast, Deleuze and Guattari approach desire as an active, positive force that exists only in determinate systems. Desire is a field of immanence, and is a force 'without which no social system could ever come into being' (May 1993: 4). As such, desire is the inner will of all processes and events; what Nietzsche refers to as the 'will to power'. As we demonstrate below, a range of desires now energize and serve to coalesce the surveillant assemblage, including the desires for control, governance, security, profit and entertainment. The remainder of this paper documents attributes of the surveillant assemblage. Some caution is needed, however, at this point. To speak of the surveillant assemblage risks fostering the impression that we are concerned with a stable entity with its own fixed boundaries. In contrast, to the extent that the surveillant assemblage exists, it does so as a potentiality, one that resides at the intersections of various media that can be connected for diverse purposes. Such linkages can themselves be differentiated according to the degree to which they are ad hoc or institutionalized. By accentuating the emergent and unstable characteristics of the surveillant assemblage, we also draw attention to the limitations of traditional political strategies that seek to confront the quantitative increase in surveillance. As it is multiple, unstable and lacks discernible boundaries or responsible governmental departments, the surveillant assemblage cannot be dismantled by prohibiting a particularly unpalatable technology. Nor can it be attacked by focusing criticism on a single bureaucracy or institution. In the face of multiple connections across myriad technologies and practices, struggles against particular manifestations of surveillance, as important as they might be, are akin to efforts to keep the ocean's tide back with a broom: a frantic focus on a particular unpalatable technology or practice while the general tide of surveillance washes over us all. Perhaps we risk having something still more monumental swept away in the tide. Recall Foucault's (1970: 387) controversial (and frequently misunderstood) musings at the end of The Order of Things. In this conclusion to his archaeology of how the understanding of Man has been transformed in different epochs as humanity came into contact with different forces, Foucault suggests that

If those arrangements were to disappear as they appeared, if some event of which we can at the moment do no more than sense the possibility ... were to cause them to crumble, as the ground of classical thought did, at the end of the eighteenth century, then one can certainly wager that man would be erased, like a face drawn in sand at the edge of the sea. (Foucault 1970: 387)

Among the proliferation of late-modern forces which are candidates for contributing to such a radical transformation we can include the intensification of technologized forms of observation.

COMPONENT PARTS

The analysis of surveillance tends to focus on the capabilities of a number of discrete technologies or social practices. Analysts typically highlight the proliferation of such phenomena and emphasize how they cumulatively pose a threat to civil liberties. We are only now beginning to appreciate that surveillance is driven by the desire to bring systems together, to combine practices and technologies and integrate them into a larger whole. It is this tendency which allows us to speak of surveillance as an assemblage, with such combinations providing for exponential increases in the degree of surveillance capacity. Rather than exemplifying Orwell's totalitarian state-centred Oceana, this assemblage operates across both state and extra-state institutions. Something as apparently discrete as the electronic monitoring of offenders increasingly

integrates a host of different surveillance capabilities to the point that no one is quite sure any longer what [Electronic Monitoring] is. Voice, radio, programmed contact, remote alcohol testing, and automated reporting station ('kiosk') technologies proliferate and are used both singly and in a dizzying array of combinations. (Renzeman 1998: 5)

The police are continually looking for ways to integrate their different computer systems and databases, as exemplified by ongoing efforts by the FBI forensics section to link together databases for fingerprints, ballistics and DNA (Philipkoski 1998). Still another example of such combinations is the regional police computer system in Central Scotland:

Phone conversations, reports, tip-offs, hunches, consumer and social security databases, crime data, phone bugging, audio, video and pictures, and data communications are inputted into a seamless GIS [geographic information system], allowing a relational simulation of the time-space choreography of the area to be used in investigation and monitoring by the whole force. The Chief Constable states: 'what do we class as intelligence in my new system in the force? Everything! The whole vast range of information that comes into the possession of a police force during a twenty four hour period will go on to my corporate database. Everything that every person and vehicle is associated with'. (Norris and Armstrong (1997) quoted in Graham 1998: 492)

In situations where it is not yet practicable to technologically link surveillance systems, human contact can serve to align and coalesce discrete systems. For example, various 'multi-agency' approaches to policing are institutionalized. Originally, such efforts were wedded to a welfarist ideology of service delivery, but in recent years social service agencies have been drawn into the harder edge of social control (O'Malley and Palmer 1996; Ericson and Haggerty 1999). The coming together (face-to-face, or through electronic mediation) of social workers, health professionals, police and educators to contemplate the status of an 'at risk' individual combines the cumulative knowledge derived from the risk-profiling surveillance systems particular to each of these institutions.

THE BODY

A great deal of surveillance is directed toward the human body. The observed body is of a distinctively hybrid composition. First it is broken down by being abstracted from its territorial setting. It is then reassembled in different settings through a series of data flows. The result is a decorporealized body, a 'data double' of pure virtuality. The monitored body is increasingly a cyborg; a flesh-technology-information amalgam (Haraway 1991). Surveillance now involves an interface of technology and corporeality and is comprised of those 'surfaces of contact or interfaces between organic and non-organic orders, between life forms and webs of information, or between organs/body parts and entry/projection systems (e.g., keyboards, screens)' (Bogard 1996: 33). These hybrids can involve something as direct as tagging the human body so that its movements through space can be recorded, to the more refined reconstruction of a person's habits, preferences, and lifestyle from the trails of information which have become the detritus of contemporary life. The surveillant assemblage is a visualizing device that brings into the visual register a host of heretofore opaque flows of auditory, scent, chemical, visual, ultraviolet and informational stimuli. Much of the visualization pertains to the human body, and exists beyond our normal range of perception. Rousseau opens The Social Contract with his famous proclamation that 'Man was born free, and he is everywhere in chains'. To be more in keeping with the human/machine realities of the twenty-first century, his sentiment would better read: 'Humans are born free, and are immediately electronically monitored'. If such a slogan seems unduly despairing, one might consider the new electronic ankle bracelet for infants, trademarked HUGS, which is being marketed to hospitals as a fully supervised and tamper-resistant protection system that automatically activates once secured around an infant's ankle or wrist.

Staff [are] immediately alerted at a computer console of the newly activated tag, and can enter pertinent information such as names and medical conditions. Password authorization is needed to move infants out of the designated protection area and, if an infant is not readmitted within a predetermined time limit, an alarm will sound. An alarm also sounds if an infant with a Hugs tag is brought near an open door at the perimeter of the protected area without a password being entered. The display console will then show the identification of the infant and the exit door on a facility map. Alternatively, doors may also be fitted with magnetic locks that are automatically activated. As well, Hugs can be configured to monitor the progress and direction of the abduction within the hospital. Weighing just 1/3 of an ounce, each ergonomically designed infant tag offers a number of other innovative features, including low-battery warning, the ability to easily interface with other devices such as CCTV cameras and paging systems and time and date stamping. (Canadian Security 1998)

Professor Kevin Warwick of Reading University is the self-proclaimed 'first cyborg,' having implanted a silicon chip transponder in his forearm (Bevan 1999). The surveillance potential of this technology has been rapidly embraced to monitor pets. A microchip in a pet's skin can be read with an electronic device which connects a unique identifying number on the microchip to details of the pet's history, ownership and medical record. Warwick has proposed that implanted microchips could be used to scrutinize the movement of employees, and to monitor money transfers, medical records and passport details. He also suggests that

anyone who wanted access to a gun could do so only if they had one of these implants ... Then if they actually try and enter a school or building that doesn't want them in there, the school computer would sound alarms and warn people inside or even prevent them having access. (Associated Press 1998)

These examples indicate that the surveillant assemblage relies on machines to make and record discrete observations. As such, it can be contrasted with the early forms of disciplinary panopticism analysed by Foucault, which were largely accomplished by practitioners of the emergent social sciences in the eighteenth and nineteenth centuries. On a machine/human continuum, surveillance at that time leaned more toward human observation. Today, surveillance is more in keeping with the technological future hinted at by Orwell, but augmented by technologies he could not have even had nightmares about. The surveillant assemblage does not approach the body in the first instance as a single entity to be molded, punished, or controlled. First it must be known, and to do so it is broken down into a series of discrete signifying flows. Surveillance commences with the creation of a space of comparison and the introduction of breaks in the flows that emanate from, or circulate within, the human body.

For example, drug testing striates flows of chemicals, photography captures flows of reflected lightwaves, and lie detectors align and compare assorted flows of respiration, pulse and electricity. The body is itself, then, an assemblage comprised of myriad component parts and processes which are broken down for purposes of observation. Patton (1994: 158) suggests that the concept of assemblage 'may be regarded as no more than an abstract conception of bodies of all kinds, one which does not discriminate between animate and inanimate bodies, individual or collective bodies, biological or social bodies'. It has become a commonplace among cultural theorists to acknowledge the increasing fragmentation of the human body. Such an appreciation is evidenced in Grosz's (1995: 108) schematic suggestion that we need to think about the relationship between cities and bodies

as collections of parts, capable of crossing the thresholds between substances to form linkages, machines, provisional and often temporary sub- or micro-groupings ... their interrelations involve a fundamentally disunified series of systems, a series of disparate flows, energies, events, or entities, bringing together or drawing apart their more or less temporary alignments.

Likewise, the surveillant assemblage standardizes the capture of flesh/information flows of the human body. It is not so much immediately concerned with the direct physical relocation of the human body (although this may be an ultimate consequence), but with transforming the body into pure information, such that it can be rendered more mobile and comparable. Such processes are put into operation from a host of scattered centres of calculation (Latour 1987) where ruptures are co-ordinated and toward which the subsequent information is directed. Such centres of calculation can include forensic laboratories, statistical institutions, police stations, financial institutions, and corporate and military headquarters. In these sites the information derived from flows of the surveillant assemblage is reassembled and scrutinized in the hope of developing strategies of governance, commerce and control. In the figure of a body assembled from the parts of different corpses, Mary Shelley's Frankenstein spoke to early-modern anxieties about the potential consequences of unrestrained science and technology. Contemporary fears about the implications of mass public surveillance continue to emphasize the dark side of science. Today, however, we are witnessing the formation and coalescence of a new type of body, a form of becoming which transcends human corporeality and reduces flesh to pure information. Culled from the tentacles of the surveillant assemblage, this new body is our 'data double', a double which involves 'the multiplication of the individual, the constitution of an additional self' (Poster 1990: 97). Data doubles circulate in a host of different centres of calculation and serve as markers for access to resources, services and power in ways which are often unknown to their referent. They are also increasingly the objects toward which governmental and marketing practices are directed (Turow 1997).

And while such doubles ostensibly refer back to particular individuals, they transcend a purely representational idiom. Rather than being accurate or inaccurate portrayals of real individuals, they are a form of pragmatics: differentiated according to how useful they are in allowing institutions to make discriminations among populations. Hence, while the surveillant assemblage is directed toward a particular cyborg flesh/technology amalgamation, it is productive of a new type of individual, one comprised of pure information.

RHIZOMATIC SURVEILLANCE

... situations where person(s) of higher authority (e.g. security guards, department store owners, or the like) watch over citizens, suspects, or shoppers. The higher authority has often been said to be 'Godlike' ... shoppers photographing shopkeepers, and taxicab passengers photographing cab drivers, as well as personal sousveillance (bringing cameras from the lamp posts and ceilings down to eye-level, for human-centered recording of personal experience). It should be noted that the two aspects of sousveillance (hierarchy reversal and human-centeredness) often interchange, e.g. the driver of a cab one day may be a passenger in someone else's cab the next day.

and transmission of an activity by a participant in the activity. Disclaimer on the role of the individual artist and personal passion outside the traditional academic laboratory: Because this paper describes the author's own personal experiences of inventing, designing, building, and living with a variety of body-borne computer-based visual information capture, processing, and mediation devices in everyday life, there is a necessary narrative element that would be diminished if it were forced to conform to the objectivity usually found in a scholarly article. The practice, beginning in the author's childhood, involved 30 years of wearable ... Journalism.

Packard's "Casual Capture" project also builds upon various concepts of sousveillance. Sousveillance is related (even if by inverses) to the tradition of surveillance, and to the artistic practice explored by artists such as Julia Scher, and the Surveillance Camera Players, among others, working in the medium of surveillance.

Organizations such as Future Physical are also "stretching a human adventure" and developing a "cultural program exploring boundaries between virtual and physical", e.g. "How will the human body interact with digital tools in the future?". See, for example, Wearable Computing Links, www.futurephysical.org/pages/content/wearable/links.html. In relation to the Fine Arts, the continuous nature of sousveillance (i.e. continuous archival of personal experience) is very much like the concept of "living art". Tehching Hsieh defined "living art" ... skin. The author finds that Dermabond (TM) wound closure material manufactured by Closure Medical is often useful for making, growing, or maintaining dermaplants.

[Figure: Early tools for lightvector paintings. (a) A jacket-based system, completed in the summer of 1985, used with a 2.4 kJ flashlamp in a 14 inch reflector, with long communications antennas and a jacket-based computer. (b) A user-friendly mass-produced tool for artists that provides clear step-by-step instructions on a TV screen attached to a light source; a standard (NTSC) TV attaches to the bottom, a standard electronic flash attaches to the top, and a computer at the base station eliminates the need for the cumbersome wearables of the 1970s and 1980s.]

[Figure 3: Painting with lightvectors: different light sources from various exposures of the open basement door under a cell block to light from a flash held in different positions.]
... four places on the right eye, to accommodate a break in the eyeglass frame along the right eye (the right lens being held on with two miniature bolts on either side of the break). The author then bonded fiber optic bundles, concealed by the frames, to locate the camera and aremac in back of the device.

The eyeglasses of Fig 5 were crude and simple. A more sophisticated design uses a plastic coating to completely conceal all the elements, so that even when examined closely, evidence of the EyeTap is not visible. The peer discrimination by the masses was also simply seen as a matter of education and acceptance. The author found that this form of discrimination began to decline sharply in the mid 1980s (beginning around 1984), amid the new-wave androgyny where transhumanism began to take acceptance, first in the transgender community and then in society as a whole. By the 1990s, such peer discrimination had largely disappeared, yet the organizational discrimination continued to increase and intensify. For example, recently, the author was physically assaulted by a number of security guards at the Art Gallery of Ontario. Rather than asking the author to leave, the guards simply pushed the author out of the gallery. The author later asked the Chief Curator as to the

reason for this action. The reason given was a possibility of copyright infringement. This raises an important question as to the right to fair use of one's personal environs, and personal experiences, especially in view of an acquired dependence on computerized visual memory. It seemed the author had unwittingly come to confront, explore, and understand issues concerning the ownership of space and whether such ownership should ...

[Figure caption fragment: ... provided along a fiber optic bundle into a miniature camera ... directs another fiber system on the left temple to redraw the image onto the retina. The details are provided in [7].]

and would cause more detailed recordings of each one to be made. It is therefore futile to resort to violence as a means of suppressing evidence-gathering technologies. Thus the fundamentally most difficult element of discrimination appeared to be the official discrimination based on functionality of the cyborg. The author began to understand this discrimination throughout the 1970s and early 1980s as being correlated to the degree of surveillance present in an establishment. It appeared, for example, that the establishments where official discrimination was greatest were the very same establishments where the use of video surveillance was the greatest. Therefore the author, through simply a personal desire to live in a computer-mediated world, encountered hostilities from paranoid security guards, seemingly afraid of being held accountable. It seemed that the very people who pointed cameras at citizens were the ones who were most afraid of new inventions and technologies of citizen cameras. The harsh and sometimes hostile discrimination against the author, by officials, security guards, and representatives of large organizations led the author to begin thinking mainly about official discrimination against cyborg functionality. In order to learn from these hostilities, the author wished to understand this discrimination by applying the scientific method, within an ethnomethodological sense, which evolved into using body-borne multimedia computation as a tool for social inquiry and action research [3][8] on surveillance as an emergent agenda. However the unique framework and situation did not conform to a particular aca...

... veillance, typically attained through the use of pervasive or ubiquitous computing.

Surveillance vs. sousveillance:

Surveillance: Privacy violation may go un-noticed, or un-checked. Tends to not be self-correcting. It's hard to have a heart-to-heart conversation with a lamp post on top of which is mounted a surveillance camera. When combined with computers, we get ubiquitous computing ("ubiqcomp") or pervasive computing ("pervcomp"). Perveillance (ubiq./perv. comp) tends to rely on cooperation of the infrastructure in the environments around us. With surveillant-computing, the locus of control tends to be with the authorities.

Sousveillance: Privacy violation is usually immediately evident. Tends to be self-correcting. At least there's a chance you can talk to the person behind the sousveillance camera. When combined with computers, we get wearable computing ("wearcomp"). Wearcomp usually doesn't require the cooperation of any infrastructure in the environments around us. With sousveillant-computing, it is possible for the locus of control to be more distributed and, in particular, to rest with the individual.

• ... modified sensory input of a self-constructed electric ...
• Sousveillance: ... by way of a human-borne camera (also known as reverse surveillance), i.e. the recording or monitoring of a high ranking official by a person of lower authority.
• Sousvival: A sustained existence ... To actively seek ways of ... others.

A larger symposium on sousveillance is planned for 2005. More is written on sousveillance in the paper corresponding to the keynote address at a workshop attached to this ACM Multimedia conference, entitled "Continuous Archival and Retrieval of Personal Experiences" (CARPE). In that 20 page paper, related concepts of sur/sousveillance, equiveillance, and auditor/viditor relationships are described, together with EyeTap device invention, design, and realization.

ACKNOWLEDGEMENTS

I'd like to thank my many past and present students, in particular James Fung, Corey Manders, Daniel Chen (dusting), Mark Post (sequencer), Chris Aimone, and Anurag Sehgal (keyer), who've tolerated and contributed to the growth of the artistic practice of sousveillance, as well as Alex Jaimes who has made many useful suggestions on the original and revised manuscripts. I'd also like to acknowledge our many sponsors, including Nikon, for supplying the camera systems, and Daymen Photo, for supplying the Metz Mecablitz units used in our lightvectoring art.

6. REFERENCES

[1] ...
[2] ...
[3] W. Carr and S. Kemmis. Becoming Critical: Education, Knowledge and Action Research. Falmer Press, London, 1986.
[4] ...
[5] ...
[6] ...
[7] ...
[8] ...
[9] ...
[10] ...

[9]

SURVEILLANCE AS CULTURAL PRACTICE

Torin Monahan*


Vanderbilt University

This special section of The Sociological Quarterly explores research on "surveillance as cultural practice," which indicates an orientation to surveillance that views it as embedded within, brought about by, and generative of social practices in specific cultural contexts. Such an approach is more likely to include elements of popular culture, media, art, and narrative; it is also more likely to try to comprehend people's engagement with surveillance on their own terms, stressing the production of emic over etic forms of knowledge. This introduction sketches some key developments in this area and discusses their implications for the field of "surveillance studies" as a whole.

Jorge Luis Borges (1962:25) once wrote, "A system is nothing more than the subordination of all the aspects of the universe to some one of them." As with all scholarly fields, surveillance studies has for a long time privileged certain theoretical frames over others. There have been remarkable growths and mutations in the study of surveillance as the field has engaged with, modified, and sometimes rejected influential concepts such as the panopticon, Big Brother, and privacy (Regan 1995; Haggerty and Ericson 2000; Gilliom 2001), but a focus on institutional-level power dynamics has been a gravitational force, pulling other scholarly approaches into its orbit and sometimes eclipsing promising alternative modes of inquiry. There are logical reasons for this. After all, surveillance is about exercises of power and the performance of power relationships, most of which are more evident when status and other hierarchies are pronounced. Some of the originary and most influential works in the field started with a critique of institutional power or of the activities of institutional actors. For instance, James Rule (1973) traced the ways in which data-collection practices of large bureaucracies facilitate privacy invasion and social control of individuals. Gary Marx (1988, 2002) probed the covert practices employed by police using new surveillance technologies, such as infrared cameras, to obtain intelligence on subjects without corresponding increases in legal or procedural protections. David Lyon (1994, 2001), William Staples (1997, 2000), and others drew attention to the routine, systematic, and automated collection of data on individuals by organizations, contributing to the production of surveillance societies and enforcing corresponding degrees of social control. Haggerty and Ericson (2000) developed the concept of "the surveillant assemblage" to describe the ways that the many information systems to which people are exposed translate bodies into abstract data, which are then re-assembled as decontextualized "data doubles" upon which organizations act.

*Direct all correspondence to Torin Monahan, Department of Human and Organizational Development, Vanderbilt University, Peabody #90, 230 Appleton Place, Nashville, TN 37203-5721; e-mail: torin.monahan@vanderbilt.edu

While this general emphasis on institutional or organizational power has been amazingly productive, it also set a trajectory from which it has been difficult to deviate. Nonetheless, the field's areas of interest are changing, and have been for some time. As scholars trained in different academic disciplines entered into the field and participated in its conversations, foci shifted, along with methods and theories, to be more inclusive of the full range of surveillance systems and activities throughout societies. Whereas surveillance studies may have gained considerable momentum from the early works of sociologists, it has now expanded to become a truly transdisciplinary enterprise with representatives from sociology, criminology, political science, philosophy, geography, science and technology studies, communication, media and information studies, anthropology, and other fields. This has brought about an enhancement rather than a dilution of sociological inquiry; it has fostered a sociological imagination in the deepest sense of the term, of tracing everyday practices of surveillance in local contexts to larger assemblages of power and influence. Moreover, these changes in disciplinary demographics have forced scholars to debate the direction of the field, criteria for evaluating scholarship, and definitions of surveillance, thereby requiring members to confront and defend, and oftentimes revise, their own disciplinary perspectives, subsequently advancing collective knowledge in the process. 1

The task of this special section of The Sociological Quarterly is to identify some recent developments in surveillance studies as the field undergoes what I view to be healthy expansion and redefinition. In particular, this section seeks to explore research on "surveillance as cultural practice," which indicates an orientation to surveillance that views it as embedded within, brought about by, and generative of social practices in specific cultural contexts. Rather than analyzing surveillance technologies, for instance, as exogenous tools that are mobilized by actors to deal with perceived problems or needs, studying surveillance as cultural practice would understand these technologies a priori as agential (as "actants" within a social system) and constitutive of knowledge, experience, and relationships. Such an approach is more likely to include in the field of inquiry elements of popular culture, media, art, and narrative; it is more likely to try to comprehend people's experiences of and engagement with surveillance on their own terms, stressing the production of emic over etic forms of knowledge. Studies of surveillance as cultural practice offer vital insights to surveillance studies and, as with other approaches, such studies similarly pursue critical understandings of complex systems; they just start, oftentimes, with data residing at different points within those systems.

SOCIAL STUDIES OF SURVEILLANCE

Whereas much of the accepted theoretical apparatus of surveillance studies has contended with institutional-level power dynamics, as witnessed by the influence of Michel Foucault's (1977) treatment of the panopticon, social studies of surveillance tend to concentrate on individuals in local contexts. Oftentimes, this means holding empirical data on local practices up to existing concepts to see whether the data fit those concepts, and if not, deciding how theory should be modified to account for differences (e.g., Lyon 2006).

This is effective on one level because it advances knowledge, but it can lead to diminishing intellectual returns, especially if scholars content themselves with making modifications to concepts rather than developing something altogether new. 2 The risk of this mode of knowledge production, which is of course not unique to surveillance studies, is forcing concepts upon data instead of allowing patterns to emerge in a more organic and inductive way (Clarke 2005; Charmaz 2006). It is nonetheless clear that the field has advanced rapidly because of empirical research on surveillance, which itself has expanded out from those doing surveillance, to those subjected to it, to those appropriating it for their own purposes. Some of the notable early work in this evolution was by criminologists studying police and security personnel operating public-area, closed-circuit television (CCTV) systems (Norris and Armstrong 1999; McCahill 2002; Wilson and Sutton 2003). The privileging of institutional actors like the police encouraged the development of analytic frames that tried to account for the political conditions that fuel CCTV implementation (Fussey 2007), as well as for the motivations and intentions of those behind the cameras (Goold 2004; Smith 2004). In this vein, David Lyon (2001) made the insightful observation that different forms of surveillance could be positioned along a spectrum from "care" to "control": from watching over one for purposes of protection to scrutinizing one's behavior in order to enforce discipline, respectively. This was a major contribution in that it called upon scholars to eschew simplistic critiques of surveillance as inherently negative; rather, evaluations of surveillance would have to be made on a case-by-case basis, acknowledging the reality that surveillance often operates simultaneously in both of these registers (care and control). Taken on its own terms, though, this insight also raises to the surface the limitation that such evaluations effectively lend greater validity to the intentions of surveillance subjects, while subordinating the experiences and agency of those monitored as surveillance objects. Departing from investigation into CCTV and the police, a turn to study surveillance in everyday life exploded the field, directing researchers to document the manifold instantiations of surveillance in routine activities and engagements with all organizations (Staples 2000; Monahan 2006; Deflem 2008; Aas, Gundhus, and Lomell 2009; Nelson and Garey 2009). From this perspective, researchers noted that effects and experiences of surveillance differ by population, purpose, and setting. The many surveillance systems to which people are exposed sort populations according to anticipated risk and value (Torpey 2007). Such "social sorting" (Lyon 2003) manifests in the unequal regulation of people's mobilities (Graham and Wood 2003; Adey 2006), unequal monitoring and disciplining of people accessing public services (Eubanks 2006; Willse 2008; Monahan and Torres 2010), unequal treatment of consumers (Gandy 1993; Turow 2006), and unequal handling of people in just about every other domain as well.
It is important to note that surveillance does not simply slow down or single out people considered risky; it also accelerates and augments the experiences of people considered to be of commercial value and low risk, as can be seen with dedicated toll lanes on highways (Graham and Marvin 2001), priority response from call centers (Graham and Wood 2003), or security prescreening and preapproval schemes at airports (Adey 2006; Lahav 2008).

Social sorting characterizes just about all contemporary surveillance systems, the net result being the amplification of many social inequalities (Monahan 2010b). Targets of marginalizing forms of surveillance deal with their experiences in thoughtful ways, oftentimes mitigating deleterious effects through subtle forms of resistance. John Gilliom's work exemplifies these possibilities, where in Overseers of the Poor, for instance, he unearths some of the tactics used by women welfare recipients to evade the bureaucratic surveillance of welfare systems and records their sophisticated ethical rationales for their actions (Gilliom 2001). More recently, Gilliom (2010) has studied some of the ways that public school teachers and administrators attenuate the disciplinary force of a widespread and routine form of institutional surveillance: standardized tests. Surveillance-studies scholars have documented and problematized other resistance practices too, such as Cop Watch programs where activists film police to try to reduce instances of abuse (Huey, Walby, and Doyle 2006; Wilson and Serisier 2010) or technological interventions where groups monitor state agents and use cell phone text messages to coordinate police avoidance at mass public protests (Institute for Applied Autonomy 2006). Recognizing the agency of the watched is one crucial aspect of inquiry into surveillance as cultural practice, even if resistance sometimes confirms, more than challenges, the reach of abstract systems of control. If one employs a symmetrical approach to research (Bloor 1991), however, then the set of technologies, techniques, and practices that the field calls "surveillance" should be identified and studied as such when deployed by individuals or groups operating outside government or corporate organizations (Monahan, Phillips, and Murakami Wood 2010). Surveillance can be defined as the systematic monitoring of people or groups in order to regulate or govern their behavior. This is but one possible definition, of course, but it is useful for being agnostic about the subjects and objects of scrutiny and control. Surveillance can be mobilized to repress populations or bring about conditions of collective empowerment; it can be used by people occupying positions of high institutional status or by those excluded from traditional arenas of power and influence. From this perspective, surveillance can serve democratic or empowering ends if it brings about openness, transparency, accountability, participation, and power equalization among social groups and institutions (Monahan 2010a). For example, Gwen Ottinger (2010) writes about grassroots monitoring of air quality by people living near oil refineries in Louisiana, which when coupled with some control over the criteria for deciding what constitutes a health risk has the potential to empower residents, regulate polluting industries, and make communities safer. In another example, James Walsh (2010) shows how progressive activist groups engage in technological surveillance of the U.S.-Mexico border, border agents, and vigilantes to prevent immigrant deaths, by using geographic information systems, for instance, to determine where to site water stations. Lane DeNicola (2009) investigates activists' use of earth remote sensing satellite systems, long associated with military operations, to engage in environmental forensics and counter-mapping efforts to render visible environmental disasters, overdevelopment, and even genocide, subsequently introducing a valence for community and government intervention. 3

This section sketched a rough continuum for research emerging from the social studies of surveillance: from the intentions and practices of the watchers, to the experiences and (re)actions of the watched, to the proactive mobilization of surveillance from below. One problem with this narrative, other than being partial and artificially linear, is that it does not adequately account for the thorough integration of surveillance into social worlds, not as a set of tools to be used for instrumental ends but as forms of life in their own right.

EXPLORING CULTURAL PRACTICE

People, as creative actors, constantly draw upon and reproduce cultural knowledge (de Certeau 1984). While each culture maintains itself through habitus, or through a series of logics that acquire durability and presence through practice (Bourdieu 1977), this is an evolving play that exceeds the instrumental objectives of individuals. In fact, the bulk of everyday life is comprised of unplanned events, occurring on micro-levels of human interaction, below the surface of conscious awareness or intentionality (Mauss 1973; Ortner 1994). Technological systems are clearly integral to cultural practice and important components of modern myth and ritual (Pfaffenberger 1990). Just like all technologies, then, surveillance systems attain presence as negotiated components of culture and accrete meaning by tapping a culture's immense symbolic reservoirs, which can include narrative, media, and art, among other things.

Nils Zurawski's article in this special section illustrates the power of narrative to weave surveillance artifacts and systems into webs of local meaning and signification. A mundane surveillance artifact, the customer loyalty card, is his foil for tracing the ways in which basic social activities like shopping plug people into vast, global networks of surveillance based on data collection and manipulation. Instead of positioning loyalty cards in the center of his map of surveillance relationships, which would be the expected approach if someone were strictly adhering to actor-network theory, for example, Zurawski starts with and privileges the narratives and practices of his informants. For them, loyalty cards are subordinate accessories to the shopping experience, which is primarily a social activity predicated on interaction with others and an affirmation that they are part of a community. As with all designed objects, the loyalty card absorbs meaning through its use and through what it symbolizes to its users (cf. Boradkar 2010). Whereas the honed surveillance-studies researcher might quickly conclude that loyalty cards are manipulative connections to extractive surveillance systems that diminish privacy and trust, by postponing judgment, Zurawski uncovers something more interesting and empirically accurate: Even when people are aware that they are giving away personal data, this is of little concern to them and is a trivial part of the larger shopping experience. This does not imply in the least that consumer-based surveillance systems are inconsequential or that researchers should stop analysis with the reporting of informant articulations; rather, it highlights the challenges researchers face in situating those systems in local and global contexts and mobilizing cultural critique that takes seriously the perspectives of the people being studied.

Ariane Ellerbrok, also in this special section, similarly destabilizes easy criticisms of commercial face-recognition applications by showing the ways in which these biometric systems can be enjoyable and fun for their users. Automated face-recognition systems invite users of Facebook, Picasa, iPhoto, or other applications to identify people within their digital photo libraries by training the programs to link names with faces. This can be a playful experience for users, predicated on creating varied groupings of photos and sharing within and across social networks. As with Zurawski's discussion of shopping, play is a vital cultural practice that demands theorization (Albrechtslund and Dubbeld 2005). Play may also serve as a mechanism of enrolling users in their own exploitation as they willingly generate data for the benefit of industry and government organizations. While this larger critique should be integrated into robust analysis, it can be insulting to begin from a position that presumes people are dupes and that they simply do not understand their situations as clearly as do researchers. Rather, people can and do appropriate surveillance systems for their own ends to achieve forms of recognition, independence, and empowerment (Burgin 2002; Koskela 2004), to embed themselves in social networks (Regan and Steeves 2010), and to become creative and critical producers of content that others can appreciate and enjoy (Postigo 2008).

Finally, David Barnard-Wills's article explores media discourses of surveillance and investigates their role in shaping public knowledge and debate. The media have long been recognized as fostering "moral panics" and circulating misleading information about public threats, which is something that has been well documented with discussions of terrorism and national security (Altheide 2006; Monahan 2010b). With a focus on UK print media, Barnard-Wills argues for an expansion of conceptual categories to take seriously the discursive engine that propels meaning-making practices about surveillance, as a complement to examination of sociotechnical systems and more traditional forms of politics. Whereas the concept of the surveillant assemblage stresses the "machinic" elements of the system, Deleuze and Guattari's (1987) original assemblage concept also included an enunciative dimension that is often neglected or marginalized in many treatments of surveillance. Enunciations can be understood, drawing upon linguistic theory, as requiring contextual cues to interpret meaning because, unlike statements, they neither aspire toward generalization nor contain sufficient information to be understood on their own (Barthes 1986). The enunciative dimension of surveillance, therefore, must always be grasped in local contexts, which should, in turn, push scholars to confront cultural, geographic, and other differences and be suspicious of grand generalizations about the role of surveillance or the existence of "the surveillance society" as something singular or monolithic (pace Murakami Wood 2009b). The mass media, as well as alternative forms of media, are keys to the unfolding and understanding of surveillance systems (Gates and Magnet 2007).

Engagement with media, as producers or consumers (or as "prosumers"), is a local cultural practice with global significance. There are many other possible avenues for the study of surveillance as cultural practice. The creation and study of artistic interventions are clearly fruitful in this regard, as artists provide imaginative resources that oftentimes channel latent concerns and anticipate future worlds in ways that social scientists would have difficulty doing without deviating from disciplinary norms. Artistic works or performances, which enroll others as witnesses or actors, can also serve as vital agents of social change. Because the topic of surveillance seems to lure creative minds, the field has been in a loose conversation with artists, fiction writers, and their robust material for a while (e.g., Levin, Frohne, and Weibel 2002; Nellis 2005; Murakami Wood 2009a; Veel 2010). Surveillance-themed films, novels, photographs, plays, performance pieces, installations, and the like abound, and some artists have made explicit forays into the field of surveillance studies (e.g., Levin et al. 2002; Institute for Applied Autonomy 2006; Surveillance Camera Players 2006; Luksch 2008). And while there has been one special issue of the journal Surveillance & Society devoted to the subject, there is ample room for more serious treatments of artistic works in the field. Similarly, popular culture in general presents abundant material for explorations of surveillance in societies. Notable in this regard is John McGrath's (2004) book Loving Big Brother, which interrogates how people use and understand surveillance systems and how television shows and movies contribute to cultural imaginaries. The field is rapidly coming to grips with cultural practices in this sense and working to theorize them in connection with broader political economies. Some of the work being done in these directions includes research on general interactive media (Andrejevic 2007), social networking (Albrechtslund 2008), games (Chen 2008; Steeves 2010), cell phones (Koskela 2004, 2009), and television (Trottier 2006). Studies of surveillance in popular culture open a window into the construction and interrelation of symbols that shape quotidian meaning, on one hand, and that operate as powerful truth constructs that drive ideology and policy, on the other (Monahan 2010b).

In a different register, an exciting new area of investigation for the field is on cultural differences in the use and meaning of surveillance, whether within or across national boundaries. Important new work is now being done on differential surveillance experiences by race, class, gender, sexual identity, and age (Kenner 2008; Currah and Moore 2009; Browne 2010; Eubanks 2011; Magnet Forthcoming), as well as on surveillance's role in propagating intersectional forms of oppression (Campbell 2006). National cross-cultural comparisons are now taking off as well, whether through ambitious, multi-sited individual projects (e.g., Murakami Wood 2010), large-scale team efforts (e.g., UrbanEye 2004; The New Transparency 2010), or the production of findings from heretofore understudied countries, which fill the empirical record and stimulate comparison (e.g., Samatas 2004; Arteaga Botello 2007; Kanashiro 2008; Frois Forthcoming). Research on culture and surveillance in this sense is about seeking out meanings and practices in local contexts, embracing rather than ignoring particularities, and problematizing dangerous presumptions of universality.

CONCLUSION: AN INVITATION TO REFLEXIVITY

Surveillance studies, as a field, is an evolving system of knowledge production. As a system, it necessarily subordinates certain elements of interest or ways of knowing to other ones. It has been organized in large part by an interest in institutional-level power dynamics and theoretical explanations of them. Thus, even as the field has expanded into a full-fledged, international network of scholars engaged in social studies of surveillance, empirical research on experiences in local contexts predictably circles back to conversations about macro-level, institutional forms of power. Obviously, the disciplinary origins and initial interests of the field's early practitioners guide these scholarly practices and provide the stage upon which scholars from other backgrounds have had to act. Because the field is growing rapidly and is transdisciplinary, the center is shifting and new areas of interest are challenging existing paradigms, which is ultimately a healthy and productive development. An approach to surveillance as cultural practice is one of the directions that surveillance studies is heading, or is being pulled. Cultural practice is merely another lens or a different point from which to view, organize, and understand the knowledge-production activities of the field. I am making no claims that this construct or its foci are better than traditional sociological interests in institutions and their agents. What I would assert is that disciplinary diversity is good and brings about more thoughtful scholarship. Research on cultural practice is currently providing a venue for marginalized disciplines within the field to assert themselves and inject alternative concepts and content areas into the collective conversation.

These changes also invite critical reflexivity both for the field as a whole and for its members. In many respects, research is a form of surveillance. Researchers systematically collect, organize, analyze, interpret, and disseminate data with the aim of influencing others, including those whom they study (Ball and Haggerty 2005; Haggerty 2009). Because research functions as surveillance, scholars should strive to avoid the fundamental critiques that the field's members often make of contemporary surveillance: that it affords the violent abstraction of people and their actions from their primary contexts; that it is predicated upon biased valuations of some populations or activities over others; that its governing logics are opaque, making them difficult to discern or contest; that it denies or ignores its own partiality and situatedness. Modern science aspires toward placeless knowledge, toward universal facts that do not require an explanation of their origins and that resist inquiry into the value-laden process of their construction (Latour 1987; Haraway 1988). Reflexive science, conversely, does not try to eliminate partiality and the messy particularities of knowledge construction but instead owns up to them, articulates them, and subjects them to further scrutiny (Woodhouse et al. 2002; Fisher 2011). To avoid reproducing that which they critique, surveillance-studies scholars should be pursuing reflexive science. They should try to keep their research embodied and grounded in its full context, interrogate the values and constraints of their systems of knowledge production, and be suspicious of truth claims that float above particularities.

It is crucial to note that being reflexive is not at all the same thing as being reflective. As Lynch and Woolgar (1990) explain:

The organization, sense, value, and adequacy of any representation is "reflexive" to the settings in which it is constituted and used. ... "Reflexivity" in this usage means, not self-referential nor reflective awareness of representational practice, but the inseparability of a "theory" of representation from the heterogeneous social contexts in which representations are composed and used. (Pp. 11-12)

Perhaps the emerging "cultural studies of surveillance" is better equipped to embrace reflexivity, at least as an unproblematic starting point, because its constitutive and aligned fields (literary studies, film and media studies, science and technology studies, cultural anthropology, and some stripes of communication) already prioritize local meanings, interpretations, and knowledge construction. Even if being reflexive may be an uncomfortable mode for people operating more firmly in the social studies of surveillance, which is clearly still the dominant orientation, these practitioners, as well as the field as a whole, could surely benefit from taking steps in this direction. The trend toward studying surveillance as cultural practice is encouraging in this regard because it directs attention to local, grounded meanings as the primary units of analysis, which can implicitly challenge current hegemonic organizing frames, as the articles in this special section demonstrate.

ACKNOWLEDGMENTS

I give special thanks to the contributing authors and to the journal's editors and peer reviewers who generously invested their time and energy to make this special section possible.

NOTES

1. The online journal Surveillance & Society has been one of the primary forums where these advances have occurred, along with many workshops, conferences, and edited volumes.
2. For instance, the field is awash with embellishments on the concept of the panopticon, including the superpanopticon (Poster 1990), synopticon (Mathiesen 1997), ban-opticon (Bigo 2006), and oligopticon (Latour and Hermant 2006).
3. See, for example, http://www.skytruth.org or earth.google.com/outreach/cs_darfur.html.

REFERENCES

Aas, Katja Franko, Helene Oppen Gundhus, and Heidi Mork Lomell. 2009. Technologies of InSecurity: The Surveillance of Everyday Life. New York: Routledge-Cavendish.
Adey, Peter. 2006. "'Divided We Move': The Dromologics of Airport Security and Surveillance." Pp. 195-208 in Surveillance and Security: Technological Politics and Power in Everyday Life, edited by T. Monahan. New York: Routledge.

Albrechtslund, Anders. 2008. "Online Social Networking as Participatory Surveillance." First Monday 13(3). Retrieved December 26, 2010 (http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/viewArticle/2142/1949).
Albrechtslund, Anders and Lynsey Dubbeld. 2005. "The Plays and Arts of Surveillance: Studying Surveillance as Entertainment." Surveillance & Society 3(2/3):216-21.
Altheide, David. 2006. Terrorism and the Politics of Fear. Lanham, MD: Altamira.
Andrejevic, Mark. 2007. iSpy: Surveillance and Power in the Interactive Era. Lawrence: University Press of Kansas.
Arteaga Botello, Nelson. 2007. "An Orchestration of Electronic Surveillance: A CCTV Experience in Mexico." International Criminal Justice Review 17(4):325-35.
Ball, Kirstie and Kevin D. Haggerty. 2005. "Editorial: Doing Surveillance Studies." Surveillance & Society 3(2/3):129-38.
Barthes, Roland. 1986. The Rustle of Language. New York: Hill and Wang.
Bigo, Didier. 2006. "Security, Exception, Ban and Surveillance." Pp. 46-68 in Theorizing Surveillance: The Panopticon and Beyond, edited by D. Lyon. Cullompton, England: Willan.
Bloor, David. 1991. Knowledge and Social Imagery. 2d ed. Chicago, IL: University of Chicago Press.
Boradkar, Prasad. 2010. Designing Things: A Critical Introduction to the Culture of Objects. New York: Berg.
Borges, Jorge Luis. 1962. Ficciones. Translated by E. Editores. New York: Grove.
Bourdieu, Pierre. 1977. Outline of a Theory of Practice. Translated by R. Nice. Cambridge, England: Cambridge University Press.
Browne, Simone. 2010. "Digital Epidermalization: Race, Identity and Biometrics." Critical Sociology 36(1):131-50.
Burgin, Victor. 2002. "Jenni's Room." Pp. 228-35 in CTRL (Space): Rhetorics of Surveillance from Bentham to Big Brother, edited by Thomas Y. Levin, Ursula Frohne, and Peter Weibel. Cambridge, MA: MIT Press.
Campbell, Nancy D. 2006. "Everyday Insecurities: The Micro-Behavioral Politics of Intrusive Surveillance." Pp. 57-75 in Surveillance and Security: Technological Politics and Power in Everyday Life, edited by T. Monahan. New York: Routledge.
Charmaz, Kathy. 2006. Constructing Grounded Theory: A Practical Guide through Qualitative Analysis. 2d ed. Thousand Oaks, CA: Sage.
Chen, Judy. 2008. "Playing with Surveillance." Presented at the CHI 2008 Workshop: Interaction After Dark. Retrieved December 26, 2010 (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.150.4771&rep=rep1&type=pdf).
Clarke, Adele. 2005. Situational Analysis: Grounded Theory after the Postmodern Turn. Thousand Oaks, CA: Sage.
Currah, Paisley and Lisa Jean Moore. 2009. "'We Won't Know Who You Are': Contesting Sex Designations in New York City Birth Certificates." Hypatia 24(3):113-35.
de Certeau, Michel. 1984. The Practice of Everyday Life. Translated by S. Rendall. Berkeley: University of California Press.
Deflem, Mathieu, ed. 2008. Surveillance and Governance: Crime Control and Beyond. Bingley, England: Emerald.
Deleuze, Gilles and Felix Guattari. 1987. A Thousand Plateaus: Capitalism and Schizophrenia. Translated by B. Massumi. Minneapolis: University of Minnesota Press.

DeNicola, Lane. 2009. "Civil Surveillance of State and Corporate Activity Using Remote Sensing and Geographic Information Systems." Presented at Workshop on Surveillance and Empowerment, March, Vanderbilt University, Nashville, TN.
Eubanks, Virginia. 2006. "Technologies of Citizenship: Surveillance and Political Learning in the Welfare System." Pp. 89-107 in Surveillance and Security: Technological Politics and Power in Everyday Life, edited by T. Monahan. New York: Routledge.
--. 2011. Digital Dead End: Fighting for Social Justice in the Information Age. Cambridge, MA: MIT Press.
Fisher, Jill A., ed. 2011. Gender and the Science of Difference: Cultural Politics of Contemporary Science and Medicine. New Brunswick, NJ: Rutgers University Press.
Foucault, Michel. 1977. Discipline and Punish: The Birth of the Prison. New York: Vintage.
Frois, Catarina. Forthcoming. A Sociedade Vigilante: Ensaios sobre Identificação, Privacidade e Vigilância. Lisbon, Portugal: Imprensa de Ciências Sociais.
Fussey, Pete. 2007. "An Interrupted Transmission? Processes of CCTV Implementation and the Impact of Human Agency." Surveillance & Society 4(3):229-56.
Gandy, Oscar H. 1993. The Panoptic Sort: A Political Economy of Personal Information. Boulder, CO: Westview.
Gates, Kelly and Shoshana Magnet. 2007. "Communication Research and the Study of Surveillance." The Communication Review 10(4):277-93.
Gilliom, John. 2001. Overseers of the Poor: Surveillance, Resistance, and the Limits of Privacy. Chicago, IL: University of Chicago Press.
--. 2010. "Lying, Cheating and Teaching to the Test: The Politics of Surveillance under No Child Left Behind." Pp. 194-209 in Schools under Surveillance: Cultures of Control in Public Education, edited by T. Monahan and R. D. Torres. New Brunswick, NJ: Rutgers University Press.
Goold, Benjamin J. 2004. CCTV and Policing: Public Area Surveillance and Police Practices in Britain. Oxford, England: Oxford University Press.
Graham, Stephen and Simon Marvin. 2001. Splintering Urbanism: Networked Infrastructures, Technological Mobilities and the Urban Condition. New York: Routledge.
Graham, Stephen and David Wood. 2003. "Digitizing Surveillance: Categorization, Space, Inequality." Critical Social Policy 23(2):227-48.
Haggerty, Kevin D. 2009. "Methodology as a Knife Fight: The Process, Politics and Paradox of Evaluating Surveillance." Critical Criminology 17(4):277-91.
Haggerty, Kevin D. and Richard V. Ericson. 2000. "The Surveillant Assemblage." British Journal of Sociology 51(4):605-22.
Haraway, Donna. 1988. "Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective." Feminist Studies 14(3):575-99.
Huey, Laura, Kevin Walby, and Aaron Doyle. 2006. "Cop Watching in the Downtown Eastside: Exploring the Use of (Counter) Surveillance as a Tool of Resistance." Pp. 149-65 in Surveillance and Security: Technological Politics and Power in Everyday Life, edited by T. Monahan. New York: Routledge.
Institute for Applied Autonomy. 2006. "Defensive Surveillance: Lessons from the Republican National Convention." Pp. 167-74 in Surveillance and Security: Technological Politics and Power in Everyday Life, edited by T. Monahan. New York: Routledge.
Kanashiro, Marta Mourao. 2008. "Surveillance Cameras in Brazil: Exclusion, Mobility Regulation, and the New Meanings of Security." Surveillance & Society 5(3):270-89.

Kenner, Alison Marie. 2008. "Securing the Elderly Body: Dementia, Surveillance, and the Politics of 'Aging in Place'." Surveillance & Society 5(3):252-69.
Koskela, Hille. 2004. "Webcams, TV Shows and Mobile Phones: Empowering Exhibitionism." Surveillance & Society 2(2/3):199-215.
--. 2009. "Hijacking Surveillance? The New Moral Landscapes of Amateur Photographing." Pp. 147-67 in Technologies of InSecurity: The Surveillance of Everyday Life, edited by K. F. Aas, H. O. Gundhus, and H. M. Lomell. New York: Routledge-Cavendish.
Lahav, Gallya. 2008. "Mobility and Border Security: The U.S. Aviation System, the State, and the Rise of Public-Private Partnerships." Pp. 77-103 in Politics at the Airport, edited by M. B. Salter. Minneapolis: University of Minnesota Press.
Latour, Bruno. 1987. Science in Action: How to Follow Scientists and Engineers through Society. Cambridge, MA: Harvard University Press.
Latour, Bruno and Emilie Hermant. 2006. "Paris: Invisible City." Retrieved December 26, 2010 (http://www.bruno-latour.fr/livres/viii_paris-city-gb.pdf).
Levin, Thomas Y., Ursula Frohne, and Peter Weibel, eds. 2002. CTRL (Space): Rhetorics of Surveillance from Bentham to Big Brother. Cambridge, MA: MIT Press.
Luksch, Manu. 2008. "The Faceless Project." Retrieved December 26, 2010 (http://www.ambienttv.net/pdf/facelessproject.pdf).
Lynch, Michael and Steve Woolgar. 1990. "Introduction: Sociological Orientations to Representational Practices in Science." Pp. 1-18 in Representation in Scientific Practice, edited by M. Lynch and S. Woolgar. Cambridge, MA: MIT Press.
Lyon, David. 1994. The Electronic Eye: The Rise of Surveillance Society. Minneapolis: University of Minnesota Press.
--. 2001. Surveillance Society: Monitoring Everyday Life. Buckingham, England: Open University Press.
--, ed. 2003. Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination. New York: Routledge.
--, ed. 2006. Theorizing Surveillance: The Panopticon and Beyond. Cullompton, England: Willan.
Magnet, Shoshana. Forthcoming. When Biometrics Fail. Durham, NC: Duke University Press.
Marx, Gary T. 1988. Undercover: Police Surveillance in America. Berkeley: University of California Press.
--. 2002. "What's New about the 'New Surveillance'? Classifying for Change and Continuity." Surveillance & Society 1(1):9-29.
Mathiesen, Thomas. 1997. "The Viewer Society." Theoretical Criminology 1(2):215-34.
Mauss, Marcel. 1973. "Techniques of the Body." Economy and Society 2:70-88.
McCahill, Michael. 2002. The Surveillance Web: The Rise of Visual Surveillance in an English City. Cullompton, England: Willan.
McGrath, John E. 2004. Loving Big Brother: Performance, Privacy and Surveillance Space. New York: Routledge.
Monahan, Torin, ed. 2006. Surveillance and Security: Technological Politics and Power in Everyday Life. New York: Routledge.
Monahan, Torin. 2010a. "Surveillance as Governance: Social Inequality and the Pursuit of Democratic Surveillance." Pp. 91-110 in Surveillance and Democracy, edited by K. D. Haggerty and M. Samatas. New York: Routledge.
--. 2010b. Surveillance in the Time of Insecurity. New Brunswick, NJ: Rutgers University Press.

Monahan, Torin, David J. Phillips, and David Murakami Wood. 2010. "Editorial: Surveillance and Empowerment." Surveillance & Society 8(2):106-12.
Monahan, Torin and Rodolfo D. Torres, eds. 2010. Schools under Surveillance: Cultures of Control in Public Education. New Brunswick, NJ: Rutgers University Press.
Murakami Wood, David. 2009a. "Can a Scanner See the Soul? Philip K. Dick against the Surveillance Society." Review of International American Studies 3.3-4.1:46-59.
--. 2009b. "The 'Surveillance Society': Questions of History, Place and Culture." European Journal of Criminology 6(2):179-94.
--. 2010. "Cultures of Urban Surveillance (Research Project)." Retrieved December 26, 2010 (http://ubisurv.wordpress.com/about-my-project/).
Nellis, Mike. 2005. "Future Punishment in American Science Fiction Movies." Pp. 210-28 in Captured by the Media: Prison Discourse and Popular Culture, edited by P. Mason. Cullompton, England: Willan.
Nelson, Margaret K. and Anita Ilta Garey, eds. 2009. Who's Watching?: Daily Practices of Surveillance among Contemporary Families. Nashville, TN: Vanderbilt University Press.
Norris, Clive and Gary Armstrong. 1999. The Maximum Surveillance Society: The Rise of CCTV. Oxford, England: Berg.
Ortner, Sherry B. 1994. "Theory in Anthropology since the Sixties." Pp. 372-411 in Culture/Power/History: A Reader in Contemporary Social Theory, edited by N. B. Dirks, Geoff Eley, and Sherry B. Ortner. Princeton, NJ: Princeton University Press.
Ottinger, Gwen. 2010. "Constructing Empowerment through Interpretations of Environmental Surveillance Data." Surveillance & Society 8(2):221-34.
Pfaffenberger, Bryan. 1990. "The Hindu Temple as a Machine, or, the Western Machine as a Temple." Techniques et Culture 16:183-202.
Poster, Mark. 1990. The Mode of Information: Poststructuralism and Social Context. Chicago, IL: University of Chicago Press.
Postigo, Hector R. 2008. "Video Game Appropriation through Modifications: Attitudes Concerning Intellectual Property among Modders and Fans." Convergence: The International Journal of Research into New Media Technologies 14(1):59-74.
Regan, Priscilla M. 1995. Legislating Privacy: Technology, Social Values, and Public Policy. Chapel Hill: University of North Carolina Press.
Regan, Priscilla and Valerie Steeves. 2010. "Kids R Us: Online Social Networking and the Potential for Empowerment." Surveillance & Society 8(2):151-65.
Rule, James B. 1973. Private Lives and Public Surveillance: Social Control in the Computer Age. London, England: Allen Lane.
Samatas, Minas. 2004. Surveillance in Greece: From Anticommunist to Consumer Surveillance. New York: Pella.
Smith, Gavin J. D. 2004. "Behind the Screens: Examining Constructions of Deviance and Informal Practices among CCTV Control Room Operators in the UK." Surveillance & Society 2(2/3):376-95.
Staples, William G. 1997. The Culture of Surveillance: Discipline and Social Control in the United States. New York: St. Martin's.
--. 2000. Everyday Surveillance: Vigilance and Visibility in Postmodern Life. Lanham, MD: Rowman & Littlefield.

Steeves, Valerie. 2010. "Online Surveillance in Canadian Schools." Pp. 87-103 in Schools under Surveillance: Cultures of Control in Public Education, edited by T. Monahan and R. D. Torres. New Brunswick, NJ: Rutgers University Press.
Surveillance Camera Players. 2006. We Know You Are Watching. N.p.: Southpaw Culture Factory School.
The New Transparency. 2010. "The New Transparency: Surveillance and Social Sorting." Retrieved December 26, 2010 (http://www.sscqueens.org/projects/the-new-transparency/).
Torpey, John. 2007. "Through Thick and Thin: Surveillance after 9/11." Contemporary Sociology 36(2):116-19.
Trottier, Daniel. 2006. "Watching Yourself, Watching Others: Popular Representations of Panoptic Surveillance in Reality TV Programs." Pp. 259-76 in How Real Is Reality TV?: Representations and Reality Television, edited by D. S. Escoffery. Jefferson, NC: McFarland.
Turow, Joseph. 2006. Niche Envy: Marketing Discrimination in the Digital Age. Cambridge, MA: MIT Press.
UrbanEye. 2004. "The UrbanEye Project." Retrieved December 26, 2010 (http://www.urbaneye.net/index.html).
Veel, Kristin. 2010. "Surveillance Narratives: From Lack to Overload." Pp. 3-12 in Humanity in Cybernetic Environments, edited by D. Riha. Oxford, England: Inter-Disciplinary Press.
Walsh, James P. 2010. "From Border Control to Border Care: The Political and Ethical Potential of Surveillance." Surveillance & Society 8(2):113-30.
Willse, Craig. 2008. "'Universal Data Elements,' or the Biopolitical Life of Homeless Populations." Surveillance & Society 5(3):227-51.
Wilson, Dean and Tanya Serisier. 2010. "Video Activism and the Ambiguities of Counter-Surveillance." Surveillance & Society 8(2):166-80.
Wilson, Dean and Adam Sutton. 2003. "Open Street CCTV in Australia: A Comparative Study of Establishment and Operation." A Report to the Australian Criminology Research Council (CRC Grant 26/01-02), Melbourne, Australia.
Woodhouse, Edward, David Hess, Steve Breyman, and Brian Martin. 2002. "Science Studies and Activism: Possibilities and Problems for Reconstructivist Agendas." Social Studies of Science 32(2):297-319.


[10]

Surveillance and Security: A Dodgy Relationship 1

Walter Peissl

Modern societies are vulnerable. We had known this long before the attacks of 11 September 2001, but those attacks made it clear to everyone. The second lesson learned from the attacks was that it is impossible to foresee such events. Although these attacks on the real world were "low-tech", there are now attempts around the globe to control especially the electronic or virtual world. However, does more surveillance really lead to more security? If so, what will be the price we have to pay? National states try to provide their citizens with a high level of security, but the effort for better security often gets mixed up with the claim for more surveillance. This is one reason why, over the past few months, governmental activities have seemed to jeopardise the internationally acknowledged fundamental right of privacy. Societal security versus personal freedom is an old and well-known area of conflict. In the light of the incidents of 11 September 2001, some old ideas for surveillance and for measures restricting privacy got on the agenda again, and new ones keep emerging. This article will give an overview of what happened on a governmental level after 11 September 2001 in the EU, in some EU member states and in the USA. Apart from political actions, we already face direct socio-economic implications, as some anonymiser services were shut down. They empowered Internet users to protect their right of privacy, and they were the first targets of investigation and suspicion. Shutting down these services reduces the potential room for users to protect their privacy by using Privacy Enhancing Technologies (PETs). This is an indicator of a serious societal problem: democracy has already changed. In the second part I will analyse the relationship between surveillance and security, and I will argue that these international over-reactions will not lead to the intended effects, and give reasons why. Rather, they will have long-term implications for the respective societies. So in the end this has to be acknowledged in a necessary appreciation of values.
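The anonymiser services mentioned above typically worked by relaying traffic through several intermediaries so that no single party could link a sender to a destination. The toy sketch below, which uses the third-party Python cryptography package and a hypothetical three-relay chain, illustrates only the core layered-encryption idea behind such Privacy Enhancing Technologies; real services (mix networks, onion routing) involve far more than this.

```python
# Toy model of an anonymising relay chain: the sender wraps a message in
# one encryption layer per relay, so each relay can strip exactly one
# layer and learns only the next hop. Hypothetical three-relay chain;
# this is a conceptual sketch, not any actual service's protocol.

from cryptography.fernet import Fernet

# Each relay holds its own key. A real system would use the relays'
# public keys; symmetric Fernet keys keep the illustration short.
relays = [Fernet(Fernet.generate_key()) for _ in range(3)]

payload = b"GET http://example.org/"

# Wrap for the last relay first, so relay 0 peels the outermost layer.
onion = payload
for relay in reversed(relays):
    onion = relay.encrypt(onion)

# Each relay in turn strips one layer and forwards the remainder.
# No single relay sees both the sender and the plaintext request.
message = onion
for relay in relays:
    message = relay.decrypt(message)

assert message == payload
print("delivered:", message.decode())
```

Shutting down the relays, as described above, removes exactly this layer of protection: the user's request then travels in a form that any single observer can attribute to them.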

Overview of International Reactions 2

The following overview does not claim to fully describe what happened after 11 September 2001; moreover, it is not possible to give a comprehensive assessment of the impact of the different governmental activities. This short overview only aims to give an idea of the hectic activities and to indicate the direction in which politics is heading. Some of the measures mentioned are already being enforced while others are in the phase of implementation, so they may still be watered down when passing through the legislative bodies. I will first have a brief look at those nations that were mainly affected, restricting myself to a short summary of the main legal actions. The other contributions in this special issue will provide us with a more profound analysis of the ongoing developments.

The United States of America

One of the main activities was the enactment of the Patriot Act: 'Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism' (US Congress, 2001). This law substantially lowers the standard of judicial control over law enforcement activities and gives a broad range of new powers to law enforcement agencies. 3

The United Kingdom

The United Kingdom (UK) enacted a handful of anti-terrorism laws under which certain suspects can be indefinitely detained without trial. In order to pass the legislation, the government derogated from - i.e., opted out of - Article 5(1) of the European Convention on Human Rights (ECHR), which guarantees the right to liberty (Dyer, 2001).


Furthermore, the Freedom of Information Act that was passed in 2000 will be suspended until 2005 (Medosch, 2001). Telecommunications service providers are forced to retain communications data for 12 months under a "voluntary code". In demanding such a practice, the government dismisses the 1997 EU Directive on data protection and privacy, which permits data to be retained for billing purposes only (i.e., for the benefit of the customer). Just as the indefinite detention of terrorist "suspects" derogates from the ECHR, under the new law the UK government violates one of the fundamental privacy rights established by the EU (Statewatch, 2001b). 4

Germany

In Germany the Minister for the Interior, Otto Schily, put forward two legislative proposals (Sicherheitspaket I and Sicherheitspaket II). 5 Cornerstones of these proposals are enlarged competencies for various law enforcement and military agencies as well as considerable infringements of several existing laws. All this severely threatens basic human rights (including the confidentiality of the mail).

France

France also has its anti-terror package, with new rights for the police and for judicial authorities. In the future, it will be easier for them to acquire warrants for searching private houses and cars. Service providers will have to store transmission data for up to 12 months, and providers of security services may be forced to reveal secret keys to decrypt data (Roller, 2001). 6

Austria

The Austrian government did not immediately undertake extraordinary activities. However, in the light of the attacks of 11 September 2001 it was easier for the Austrian government to change the status of a new law on electronic eavesdropping and "Rasterfahndung" from temporary to permanent. Additionally, as in some other European countries, there was a debate on taking the fingerprints of all inhabitants. On 1 December 2001 an order on interception was issued, which obliges telecommunications service providers to provide a technical interface for electronic eavesdropping. In June 2002 two pieces of legislation made interception easier and created new criminal offences. Finally, Austria ratified the Council of Europe's Cybercrime Convention in November 2002.

The European Union

On the European level, an Extraordinary European Council held on 21 September 2001 approved a plan of action. It provides for enhanced co-operation between national police and judicial forces through the introduction of a European arrest warrant and the adoption of a common definition of terrorism, as well as a better exchange of information between the different intelligence services of the Union. Furthermore, Member State authorities will share with Europol, systematically and without delay, all useful data regarding terrorism. A specialist anti-terrorist team, which will co-operate closely with its US counterparts, will be set up within Europol as soon as possible. Additionally, the Council stated the necessity to develop international legal instruments in order to put an end to the funding of terrorism, to strengthen air security and to coordinate the European Union's global actions (EC, 2001). 7 In the long run, the most negative impact on the privacy of European citizens may emanate from a new Directive on privacy and electronic communications 8, which will allow Member States to adopt legislative measures restricting privacy-ensuring rules 'when such restriction constitutes a necessary, appropriate and proportionate measure within a democratic society to safeguard national security (i.e., State security), defence, public security, the prevention, investigation, detection and prosecution of criminal offences or of unauthorised use of the electronic communication system'. In particular, the permission for Member States to 'adopt legislative measures providing for the retention of data for a limited period' marks a fundamental change in European privacy policy. Furthermore, a recently published draft Framework Decision (Statewatch, 2002) shows that there are intentions to make data retention compulsory across Europe.

Canada

The new anti-terror law intends to 'criminalize terrorist financing, establish a procedure to freeze, seize, and forfeit proceeds for and proceeds of terrorist activities/terrorist groups, enhance our ability to protect sensitive information, create new investigative tools and allow for preventative arrest when needed to address the serious threat posed by terrorist groups and those who would carry out terrorist activity, establish a means to identify and list terrorist groups' (Zaccardelli, 2001 and RCMP, 2001). 9

Australia

In Australia, federal elections have delayed specific legislative changes from being implemented. However, the Federal Cabinet has announced some extraordinary measures. Some of these announcements are extremely concerning, such as the Attorney General stating that arrested people could face up to five years in jail for refusing to answer questions, which abolishes the right to refuse to make a statement. People could be held incommunicado with no right to see a lawyer. Such powers would not be restricted to those suspected of terrorism, but could be applied against anyone who might have information regarding politically motivated violence. The Attorney General has stated that this could include lawyers and journalists (Statewatch, 2001a).

Other Socio-Economic Impacts

Apart from the governmental activities mentioned above, there are already socio-economic impacts on the everyday lives of very different people. The erroneous arrest of Pierre Boulez, the famous conductor, provides an impressive example of, on the one hand, the hectic policy activities during the first weeks following the attacks and, on the other, the very long memory of computer systems and databases. Some time during the 1960s, the now 75-year-old musician had said that all opera houses should be blown up. This metaphorical sentence brought him into trouble some 30 years later (Coomarasamy, 2001). Thousands of kilometres away, in Somalia, all Internet traffic was shut down. 'Somalia's only Internet company and a key telecoms business have been forced to close because the United States suspect them of terrorist links. The two firms, Somalia Internet Company and al-Barakaat, both appear on a US list of organisations accused of funnelling money to the al-Qaeda network. Hassan Barise in Mogadishu told the BBC's Network Africa programme more than 80% of Somalis depended on money they receive from relatives outside the country. He said all Internet cafes had now shut down and international phone lines run by two other companies were failing to cope with the extra pressure of calls' (BBC, 2001). In the US, an anonymiser service was shut down. 'ZeroKnowledge, providers of Freedom.net and Freedom privacy software, have abruptly decided to stop providing anonymous web browsing and private, encrypted, untraceable email for its customers. They give users 7 days before the system is shut down and all untraceable email addresses are disabled. They also say that your "secret" identity may not remain a secret for long.' (Wouters, 2001)

Anonymiser services, like other Privacy Enhancing Technologies (PETs), empower Internet users to protect their right to privacy. They were among the first targets of investigation and suspicion. Shutting down these services reduces the potential scope of action for users to protect their privacy by using PETs. This is another indicator of a serious societal problem: it demonstrates that the quality of democracy has already profoundly changed for the worse. The list of examples is of course incomplete; there are many other developments all around the world. This short overview just flagged up some important ones that are directly connected to the privacy issue. Additionally, in many countries measures are being undertaken that restrict other basic human rights and undermine minimum standards of judicial control.

The Relationship between Surveillance and Security

All the measures mentioned above could easily be enacted because citizens have been presented with a simple equation: more surveillance means more security. Still in shock from the attacks of 11 September 2001, many people were willing to waive a little of their personal freedom in order to gain more security in exchange. To be able to profoundly appraise the different measures, on the one hand, and the basic human rights, on the other, it is necessary to look into the matter in more depth. The question is whether more surveillance really leads to more security, and if so, at what price? Before answering this question, we need to define the relationship between two key terms that are often used synonymously, namely surveillance and control. Control is about the comparison of a target value with an actual value; control may trigger a correcting action if necessary. Hence control is necessarily based on a normative idea and the will to realise this order. By contrast, surveillance may be seen as a sequence of acts of control. Furthermore, surveillance implies an asymmetric, hierarchical relationship between those who observe and those who are observed. Thus surveillance is to a great extent conservative and oriented towards retaining the status quo (Nogala, 2000: 141).

Coming back to our core question of whether more surveillance will bring about a higher degree of security, it seems useful to distinguish further between an ex-ante and an ex-post perspective. If new surveillance technologies could contribute to identifying potential terrorists before they can perform their criminal activities, this would increase security. But as stated above, surveillance is a sequence of acts of control, and control is about comparing actual behaviour to standardised behaviour - however it may be defined. This is true for telecommunications wiretapping and eavesdropping as well as for video surveillance of public places and for the biometric methods that are in discussion all over the world. Electronic fingerprints or iris scans can only tell you something about the authenticity of a person. However, if a person has not yet attracted attention, he or she will not be in a database and therefore cannot be detected as "suspicious". If there is one thing to be learned from the attacks of 11 September 2001, it is that the persons involved lived "normal" lives for years. In other words: their behaviour appeared to be standardised. Hence they could not be detected by intelligence agencies. And even if all citizens were in a database, authentication would not provide any information with a surplus value. Thus we can conclude from an ex-ante perspective that, in general, surveillance does not raise security. From an ex-post perspective, data sets gained from huge databases and widespread surveillance may certainly facilitate the prosecution of criminals. Hence their contribution to a higher level of security is of an indirect order only. Deterrence, i.e. the positive impact of high rates of detection and high penalties, may be enhanced. However, deterrence does not seem to be a successful instrument to keep people from performing terrorist attacks, especially when they are willing to risk their lives anyway.

Another problem is whether widespread surveillance is possible at all. There are at least two boundaries: technical and social. Any comprehensive surveillance of telecommunication networks depends on technical measures. Systems have to be programmed to react to specific "catchwords" or other parameters, i.e. to behaviour that deviates from a pre-defined "norm behaviour". As the attacks of 11 September 2001 showed, the terrorists had led fully adapted "mainstream lives" for a considerable time and thus escaped observation. The second problem occurs as soon as surveillance is no longer just a theoretical concept accepted in exchange for "more security". As soon as technical means like video systems in public places or wiretapping of telecommunications systems are perceived by ordinary people in their everyday lives, they will try to circumvent those surveillance systems. Two different ways of doing so are conceivable: (i) strategies of avoidance and (ii) use of preventive technologies. Strategies of avoidance may be the use of personal communication and face-to-face meetings rather than digital communication and virtual meetings like phone calls, e-mail or chat-rooms. Preventive technologies, or so-called Privacy Enhancing Technologies (PETs), use cryptographic tools, steganography or other newly developed systems like onion routing (sketched below). Although some of these means are difficult to apply, he who has the resources, the education and the determination to do so will be able to remain incognito. Most probably, criminals will do so, and it will rather be "ordinary citizens" who will be traced and restricted in their basic human right to privacy. All this shows that more surveillance does not necessarily lead to a higher level of societal security. Hence it is highly questionable whether massive constraints on human rights are justified.
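To make the layered-encryption idea behind PETs such as onion routing concrete, consider the following minimal sketch. It assumes Python with the third-party cryptography package; the three-relay setup, key handling and message are purely illustrative, and real onion-routing networks add key exchange, routing and traffic padding that are omitted here.

```python
# A sender wraps a message under one key per relay; each relay peels
# exactly one layer, so no single relay sees both the sender and the
# plaintext. Illustrative sketch only, not a deployable system.
from cryptography.fernet import Fernet

relay_keys = [Fernet.generate_key() for _ in range(3)]  # entry, middle, exit

def wrap(message: bytes, keys) -> bytes:
    # Encrypt for the last relay first, so the first relay's layer
    # ends up outermost.
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

onion = wrap(b"meet at the usual place", relay_keys)
for key in relay_keys:  # each relay removes its own layer in turn
    onion = Fernet(key).decrypt(onion)
print(onion)  # b'meet at the usual place'
```

The sketch illustrates the point made above: circumvention is technically feasible for anyone with the resources and skills, which is why restrictions on PETs tend to burden ordinary citizens rather than determined criminals.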

Impacts of Widespread Surveillance

Why are basic human rights like privacy, freedom of speech and freedom of assembly that important? What are the possible impacts of constraints? There are short- and long-term effects. In a short-term perspective, we can see that surveillance leads to adapted behaviour of human beings. We will see not their "real", "own" behaviour, but rather the behaviour the individuals think they are supposed to show. This leads to a loss of autonomy. This development is critical from a democratic point of view, because liberal democratic societies are built on the idea of self-conscious and autonomous citizens. The more surveillance (governmental and private) we tolerate, the more we are heading towards a so-called "panoptic society". The panoptic society traces back to the well-known Panopticon of Jeremy Bentham, who modelled a prison in which the prisoners could be observed from a central point but were themselves not able to see their observers. Foucault (1994: 258ff.) pointed out that the specific meaning of this model lies in the creation of a permanent awareness of being observed, which ensures that power takes effect automatically. In other words, the effects of surveillance are permanent even if actual surveillance is not performed. In this respect, power is automated and de-individualised. Foucault transferred the mechanisms of the Panopticon to describe modern societies. Today, it is used to describe the new kind of transparency in the so-called information society (Rossler, 2001: 219). In the long run, a much greater problem is arising: the short-term "mainstreaming" of citizens' behaviour, i.e. restricting variation and avoiding what would be deemed deviation, may turn out to prevent any "driving momentum" in societal, cultural and economic terms. Dissenting or variant behaviour is considered to be one of the most important driving forces of economic and societal development. When losing this force, Western democratic societies may soon get into trouble.

In the social sciences, it is beyond dispute that non-conformist behaviour is a necessary driving force for societal development. One of the most prominent examples is the concept of "charismatic leadership" by Max Weber. Charismatic leadership is based on the "non-ordinary" claims of an individual or a group. These claims suffice in themselves and do not need tradition or law. Charismatic leadership is revolutionary by definition; it is destructive and deliberately different from existing rules and tradition (Berger et al. 1976: 227). Charismatic leadership may be transformed into a traditional or rational kind of governance; but as soon as these actions become an everyday habit, this directly leads to a decline. For the second generation all the famous actions of the charismatic revolution are perceived as an "old hat" - what was non-ordinary in the beginning becomes part of everyday life (Berger et al. 1976: 229). Even Talcott Parsons, who denied the possibility of establishing a general theory of social change, may be cited as a witness for the thesis stated above. He defined social change as a product of the system's endeavours to react to external interferences. One important reason for social change is a failure in the socialisation of individuals or groups: the latter did not learn to adapt to societal needs and therefore generate disturbances of the societal equilibrium (Berger et al. 1976: 234). But structural-functional theory did not deal consistently enough with social change and, hence, cannot deal adequately with conflict. However, conflict is the great and creative power that promotes social change (Dahrendorf, 1965: 109 in Endruweit, 1989: 803). Mainstreaming is not a danger for societies in abstract models of social change only. We can see the same kind of problem in the economic world, too. The best example is the notion of the "entrepreneur" in Schumpeter's writing. Schumpeter's entrepreneur is not placed within a static theory of equilibrium or disequilibrium. According to Schumpeter, entrepreneurs are economic agents whose function is 'the carrying out of new combinations, the creative destruction of equilibria, thereby preparing the ground for a superior state of equilibrium' (Swoboda, 1984: 17). This is not the place to go into the details of Schumpeter's theory, but a telling quote stands for this part of it: 'While an economic subject swims with the stream in the well-known cycle, it swims against the stream if it wants to change its way.' (Schumpeter, 1952: 118, transl. by the author) If our societies stop developing 10 - even in the very long term - they will perish. This means the terrorists will have achieved what they aimed at after all.

Conclusions

In this article, I presented an overview of legal actions as well as socio-economic impacts of the attacks of 11 September 2001. Furthermore, I argued that more surveillance does not necessarily lead to more security. Rather, this equation seems to be a brilliant marketing trick of law enforcement authorities around the world to push through measures they had had on their agenda for many years already. Finally, I tried to sketch a picture of the possible short- and long-term impacts of widespread surveillance on democratic societies. If one agrees that the discussed possible impacts of widespread surveillance are undesirable, and if one recognises that the discussed restrictions on basic human rights (producing these unwanted side-effects) may even be the wrong drug for the necessary therapy, I cannot but conclude that the drug should be discontinued.

Notes

1. Special thanks to Monika Bergmann at ITA, who did most of the data collection for the survey of international legal action after 11 September.
2. More information on recent activities may be found at http://www.statewatch.org/observatory.htm and within the privacy link list of the Institute of Technology Assessment of the Austrian Academy of Sciences (http://www.oeaw.ac.at/ita/privacylinks.htm).
3. For further information on this, see the article by Barry Steinhardt.
4. An analysis of recent developments is given by Charles Raab in his paper. More information can be found at http://www.wood.ccta.gov.uk/homeoffice/hopress.nsf; http://politics.guardian.co.uk and http://www.statewatch.org/news2001/nov/17ukdata.htm.
5. For the respective press releases see http://www.eng.bmi.bund.de/services/externalViews.
6. Parliamentary report: http://www.senat.fr/rap/l01-007/l01-0071.pdf; changes to laws: http://www.senat.fr/pl/5-0102.pdf and http://www.statewatch.org/news/2001/oct/10france.htm.
7. Updated information will be provided at http://europa.eu.int/news/110901/index.htm.
8. 2002/58/EC of 12 July 2002.
9. For a substantial analysis of the situation in Canada see the article by Colin Bennett.
10. This, of course, does not mean the author is blind to possible negative impacts of change. This is just being said to highlight the possible long-term negative impacts of stagnation.


References

BBC (2001), US shuts down Somalia internet; Last update: 23 November 2001 [Accessed on: 02-01-12] http://news.bbc.co.uk/hi/english/world/africa/newsid_1672000/1672220.stm.
Berger, P.L. and Berger, B. (1976), Wir und die Gesellschaft - Eine Einführung in die Soziologie entwickelt an der Alltagserfahrung, Reinbek bei Hamburg: Rowohlt Taschenbuch Verlag GmbH.
Coomarasamy, J. (2001), Conductor held over 'terrorism' comment; Last update: 4 December 2001 [Accessed on: 02-01-12] http://news.bbc.co.uk/hi/english/world/europe/newsid_1692000/1692628.stm.
Dyer, C. (2001), Woolf admits concern at new detention law (The Guardian); Last update: 1 January 2002 [Accessed on: 02-01-12] http://www.guardian.co.uk/ukresponse/story/0,11017,626445,00.html.
EC (2001), Brussels Extraordinary European Council: Conclusions and Plan of Action (140/01); Press Release; Last update: 21-09-2001 [Accessed on: 02-01-12] http://ue.eu.int/Newsroom/LoadDoc.cfm?MAX=1&DOC=!!&BID=76&DID=67808&GRP=3778&LANG=1.
Endruweit, G. and Trommsdorff, G. (Eds) (1989), Wörterbuch der Soziologie, Stuttgart: Enke.
Foucault, M. (1994), Überwachen und Strafen - Die Geburt des Gefängnisses, Frankfurt am Main: Suhrkamp.
Medosch, A. (2001), Informationsfreiheit auf Warteschleife (Telepolis); Last update: 14.11.2001 [Accessed on: 02-01-12] http://www.heise.de/tp/deutsch/special/frei/11113/1.html.
Nogala, D. (2000), 'Der Frosch im heißen Wasser', in Schulzki-Haddouti, C. (Ed.), Vom Ende der Anonymität, Hannover: Heinz Heise GmbH & Co. KG, pp. 139-155.
RCMP (2001), New Technologies, Intelligence and Integrated Law Enforcement to Improve Safety and Security of Canadians [Accessed on: 02-01-12] http://www.rcmp-grc.gc.ca/news/nr-01-26.htm.
Roller, K. (2001), Mobilisierung in Frankreich (Telepolis); Last update: 21.10.2001 [Accessed on: 02-01-12] http://www.heise.de/tp/deutsch/inhalt/te/9880/1.html.
Rössler, B. (2001), Der Wert des Privaten, 1st Edition, Frankfurt am Main: Suhrkamp Verlag.
Schumpeter, J. (1952), Theorie der wirtschaftlichen Entwicklung - Eine Untersuchung über Unternehmergewinn, Kapital, Kredit, Zins und den Konjunkturzyklus, Berlin: Duncker & Humblot.
Statewatch (2001a), New Terrorist Laws Threaten Democratic Rights [Accessed on: 02-01-12] http://www.statewatch.org/news/2001/dec/asio.pdf.
Statewatch (2001b), UK plans for the retention of data for 12 months [Accessed on: 02-01-12] http://www.statewatch.org/news/2001/nov/17ukdata.htm.
Statewatch (2002), Surveillance of communications, EU: data retention to be "compulsory" for 12-24 months, draft Framework Decision leaked to Statewatch [Accessed on: 02-10-02] http://www.statewatch.org/news/2002/aug/05datafd1.htm.
Swoboda, P. (1984), 'Schumpeter's Entrepreneur in Modern Economic Theory', in Seidl, C. (Ed.), Lectures on Schumpeterian Economics: Schumpeter Centenary Lectures, Graz 1983, Berlin-Heidelberg-New York-Tokyo: Springer, pp. 17-30.
US Congress (2001), HR3162 [Accessed on: 02-01-12] http://thomas.loc.gov/cgi-bin/bdquery/z?d107:h.r.3162:.
Wouters, P. (2001), ZeroKnowledge to Discontinue Anonymity Service; email to mailing list (vibe.at) [Accessed on: 04-10-01].
Zaccardelli, G. (2001), Bill C-36, The Proposed Anti-Terrorism Act [Accessed on: 02-01-12] http://www.rcmp-grc.gc.ca/news/c-36_e.htm.


[11] Counter-surveillance as Political Intervention? Torin Monahan

This paper analyzes practices of counter-surveillance - particularly against closed-circuit television systems in urban areas - and theorizes their political implications. Counter-surveillance is defined as intentional, tactical uses or disruptions of surveillance technologies to challenge institutional power asymmetries. Such activities can include disabling or destroying surveillance cameras, mapping paths of least surveillance and disseminating that information over the Internet, employing video cameras to monitor sanctioned surveillance systems and their personnel, or staging public plays to draw attention to the prevalence of surveillance in society. The main argument is that current modes of activism tend to individualize surveillance problems and methods of resistance, leaving the institutions, policies, and cultural assumptions that support public surveillance relatively insulated from attack.

Keywords: Surveillance; resistance; activism; art; social movements; social control; globalization

What happens when the cameras are turned back on those monitoring us? This paper analyzes such practices of counter-surveillance - particularly against video and closed-circuit television (CCTV) systems in urban areas - and theorizes their political implications. Counter-surveillance can include disabling or destroying surveillance cameras, mapping paths of least surveillance and disseminating that information over the Internet, employing video cameras to monitor sanctioned surveillance systems and their personnel, or staging public plays to draw attention to the prevalence of surveillance in society. In some cases, marginal groups selectively appropriate technologies that they might otherwise oppose when used by those with institutional power. On one hand, these examples illustrate the under-determination of technologies and suggest further avenues for political intervention through counter-surveillance. On the other hand, because surveillance systems evolve through social conflict, counter-surveillance practices may implicate opposition groups in the further development of global systems of control. Counter-surveillance operates within and in reaction to ongoing global transformations of public spaces and resources. According to social theorists (for example, Harvey 1990; Castells 1996), a crisis in capital accumulation in the mid-1970s precipitated a shift from mass-production to flexible production regimes, catalyzing organizational decentralization, labor outsourcing, computerized automation, just-in-time production, and, increasingly, the privatization of that which has historically been considered "public."


These structural transformations aggravated conditions of social inequality, leading to the development of new mechanisms of social control to regulate bodies in this unstable terrain. Some of the most effective forms of social control are those that naturalize the exclusion of economically or culturally marginalized groups through architecture or infrastructure. Mass incarceration of over two million individuals in the United States alone is one extreme measure of such postindustrial exclusion (Kupchik and Monahan 2006). Less dramatically, but perhaps more pervasively, fortified enclaves such as gated communities, shopping malls, and business centers have multiplied exponentially over the past decade and seem to be as prevalent in "developing" countries as in "developed" countries (Davis 1990; Caldeira 2000; Low 2003; Monahan 2006a). Additionally, privatized streets, parks, and security services effectively sacrifice civic accountability and civil rights while increasing affordances for the monitoring of public life (Zukin 1995). Finally, telecommunications and other infrastructures unevenly distribute access to the goods and services necessary for modern life while facilitating data collection on and control of the public (Reiman 1995; Graham and Marvin 2001; Monahan 2005). Against this backdrop, the embedding of technological surveillance into spaces and infrastructures serves to augment not only existing social control functions but also capital accumulation imperatives, which are readily seen in the sharing of surveillance operations and data between public and private sectors (Gandy 2003; ACLU 2004; O'Harrow 2005; Monahan 2006c). Through a range of tactical interventions into the logic and institutions of global capitalism, counter-surveillance tacticians seek to disrupt these trends toward the privatization, sanitization, and elimination of that which is "public." While the ideologies and intentions of those engaging in counter-surveillance are manifold and disparate, they are unified in the mission of safeguarding - or creating - the necessary spaces for meaningful participation in determining the social, environmental, and economic conditions of life. Because of this orientation, the term counter-surveillance will be used here to indicate intentional, tactical uses or disruptions of surveillance technologies to challenge institutional power asymmetries. This article reviews several counter-surveillance practices and analyzes the power relations simultaneously revealed and produced by resistance to institutionalized surveillance. Importantly, the emphasis here is upon the framing of surveillance problems and responses by activists, or on points of symbolic conflict rather than physical confrontation. Thus, it is assumed that while counter-surveillance practitioners may have immediate practical goals, such as circumventing or destroying video cameras, they are foremost engaged in acts of symbolic resistance with the intention of raising public awareness about modern surveillance regimes. 1


The body of this paper will analyze two types of counter-surveillance efforts (interventions into the technical and the social faces of public surveillance) and then theorize the efficacy and implications of counter-surveillance more generally. The data are drawn primarily from websites, video productions, and publications, but several interviews were conducted with activists in the United States to corroborate the critical readings offered here. The main argument is that activists tend to individualize both surveillance problems and methods of resistance, leaving the institutions, policies, and cultural assumptions that support public surveillance relatively insulated from attack. Furthermore, while the oppositional framing presented by activists (i.e. counter-surveillance versus surveillance) may challenge the status quo and raise public awareness, it also introduces the danger of unintentionally reinforcing the systems of social control that activists seek to undermine.

Technical Interventions

Surveillance circumvention and destruction are two activist interventions that concentrate on the technical side of modern surveillance. Of course the technical and the social dimensions of all technologies are thoroughly intertwined, as science and technology studies scholars have well demonstrated (for example, Bijker, Hughes, and Pinch 1987; Bijker and Law 1992), so the point in separating them out here is to draw attention to the specific sites of intervention as defined by counter-surveillance tacticians.

Institute for Applied Autonomy

The first example is offered by the Institute for Applied Autonomy (IAA), which is a collective of technicians/artists/activists engaged in projects of productive disruption and collective empowerment (Schienke and IAA 2002). According to their website:

[The IAA] was founded in 1998 as a technological research and development organization concerned with individual and collective self-determination. Our mission is to study the forces and structures which effect self-determination; to create cultural artifacts which address these forces; and to develop technologies which serve social and human needs. (IAA 2003)

Some of these projects include automated graffiti-writing robots, a propaganda-distributing robot called "Little Brother," and a web-based application called "iSee" that allows users to map paths of least surveillance in urban areas.

1. See Marx (2003) for a typology of acts of resistance to dominant uses of surveillance (or "tacks in the shoe"), which exploit the ironic vulnerabilities of ubiquitous surveillance projects.


The surveillance-mapping iSee application offers a provocative entry point into counter-surveillance territory (see Figures 1 and 2). 2 The opening flash display on the website depicts a blue robot-like icon of a person who has a cannonball bomb with a lit fuse for a head (Figure 1). Next, the lens of the viewing area pulls back, revealing that the person is squarely placed underneath a large microscope with a red video surveillance camera as its lens. The red cameras then multiply, triangulating on the person, who begins a passage, represented by yellow dash marks, along city streets. The graphic rotates and pulls back one last time to reveal that the name of the application ("iSee") has been traced by the route taken by the robot-like figure. In the upper left corner, in a mock allusion to numbering conventions for software development, the website overtly references the tense political climate of surveillance in places like New York City: "iSEE v. 911: 'Now more than ever.'" Once past the opening scene, the application presents the user with a street map of Manhattan with a dramatic black background and abundant red boxes indicating areas under video surveillance (Figure 2).

Figure 1 iSee application introduction.

2. www.appliedautonomy.com/isee.html


Figure 2 iSee application.

The map is engaged by clicking first on a starting point and second on a destination point. After a few seconds of calculation, a yellow route is indicated for a person to travel the path of least surveillance to his or her destination. As with other online mapping programs, the user can zoom in or out, scroll up, down, or sideways, or "reset" to begin the mapping process anew. Finally, the bottom right corner displays the travel distance and the number of cameras to which one will be exposed along the route specified.

This website offers rich symbolic referents that extend well beyond the utility of a route-generating application. The figure of a person as robot communicates both the dehumanizing threat of individual conformity (or self-regulation) in the face of ubiquitous surveillance and the construction of individuals as social machinery, data points, or risk potentialities from the point of view of those doing the monitoring. That this iconographic robot-person has a cannonball bomb with a lit fuse for a head represents the explosive volatility of the situation: viewing people as threats to be monitored and controlled, rather than as citizens with civil rights, may destroy civil society and/or may lead to violent opposition. Finally, the placement of this solitary figure underneath its own scrutinizing microscope(s) stresses the atomization of individuals as suspect bodies strangely decontextualized and divorced from political, social, or economic realities. The tacit critique here is that, once atomized as such, surveillance regimes view individuals from a universalistic perspective and are therefore unable to perceive particularistic conditions, such as racism or economic inequality, which inscribe all social relations. If such particulars fall outside the sterilizing camera frame, then they cease to exist as mitigating circumstances or - perhaps more importantly - as social problems worthy of attention and correction.


Such iSee applications are now available for the cities of New York in the USA, Amsterdam in The Netherlands, and Ljubljana in Slovenia. The aim of these websites is neither to directly interfere with the surveillance apparatus in these cities, nor to allow individuals to effectively circumvent monitoring, although that is the immediate and practical outcome. Instead, the goal is to raise public awareness and foster public debate over the prevalence of surveillance cameras and their effects on public life. Because technological infrastructures become invisible when they are functional (Bowker and Star 1999), and the political effects of technologies, more generally, are off the radar screen of most people (Winner 1986), the intervention of iSee renders visible the larger pattern of surveillance proliferation and calls into question its purpose, agenda, and effects. The iSee intervention jolts viewers and users into awareness; it invites inquiry into surveillance devices distributed throughout our lives; it opens up a space for discussion about what kinds of surveillance are acceptable and what kinds are not.
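The core of such a route-generating application is a weighted shortest-path search. The following is a minimal illustrative sketch, not the IAA's documented implementation: it assumes Python with the networkx package, models intersections as graph nodes, and penalizes street segments by the number of cameras observing them; all node names, distances, and camera counts are invented for the example.

```python
# Compute a "path of least surveillance" over a toy street grid:
# each segment's cost is dominated by camera exposure, with distance
# as a tie-breaker, so a longer but unobserved route wins.
import networkx as nx

G = nx.Graph()
# (intersection_a, intersection_b, metres, cameras observing the segment)
segments = [
    ("A", "B", 100, 3),
    ("A", "C", 150, 0),
    ("C", "B", 120, 1),
    ("B", "D", 80, 0),
    ("C", "D", 200, 0),
]
for u, v, metres, cams in segments:
    G.add_edge(u, v, metres=metres, cams=cams, cost=cams * 1000 + metres)

route = nx.shortest_path(G, "A", "D", weight="cost")
hops = list(zip(route, route[1:]))
cameras = sum(G[u][v]["cams"] for u, v in hops)
distance = sum(G[u][v]["metres"] for u, v in hops)
print(route, f"{distance} m, passing {cameras} camera(s)")
# ['A', 'C', 'D'] 350 m, passing 0 camera(s)
```

Reporting both totals mirrors what the interface itself displays: the travel distance and the camera exposure of the chosen route.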

®™ark

If the Institute for Applied Autonomy's iSee application offers an intervention for circumventing video surveillance in public spaces, the group ®™ark advocates a more radical and direct approach: destroying the cameras. ®™ark is well known in activist and culture-jamming circles for their high-profile projects. One of their more famous ventures was "The Barbie Liberation Organization," which swapped voice boxes between Barbie and GI Joe dolls so that Barbie would say things like "Vengeance is Mine!" and GI Joe would declare things like "Math is hard!" (Dery 1994; Greenberg 1994; RTMark 2000). More recently, ®™ark's "The Yes Men" gained international attention by pretending to be spokespersons for Dow Chemical (which is now the parent company of Union Carbide) and promising on BBC World Television to provide reparations for gas victims in Bhopal, India (Democracy Now! 2004; DowEthics 2004; The Yes Men 2004). ®™ark's guide to surveillance camera destruction is an engaging and deliberately messy website (vis-à-vis their other pages) that celebrates low-tech, decentralized, populist, and "fun" approaches to these activities (RTMark 2000). 3 The "Guide to Closed Circuit Television (CCTV) destruction" is a nuts-and-bolts document that supports this humble orientation through its stark design and formulaic style. After the document's title, a provocative black-and-white picture of a CCTV tower, and other prolegomena (e.g. a contact email for suggestions), the page presents an itemized and hyperlinked table of contents for quick reference. The main sections include "WHY DESTROY CCTV CAMERAS," "TYPES OF CCTV CAMERA," "METHODS OF ATTACK," and "TRAINING." Each

3. www.rtmark.com/cctv/


section remains structurally true to the "guide" genre by parsimoniously explaining exactly what one needs to do and what one needs to know, without much other information to distract from the counter-surveillance task(s) at hand. The section on methods of attack is the most provocative. The methods described are placing plastic bags filled with glue around cameras, affixing stickers or tape over camera lenses, shooting cameras with children's high-powered water-gun toys filled with paint, temporarily disabling lenses with laser pointers, cutting CCTV cables with axes or garden tools, and dropping concrete blocks on cameras from rooftops. Rather than simply describing methods for disabling video surveillance cameras, however, the instructions reveal a pattern of values and an engaging subtext about camera destruction as an embodied social practice. Thus, methods that draw public attention to the surveillance systems or that reveal cameras as inoperable (such as shooting them with paint guns) are preferred to those that do not heighten public awareness (such as disabling cameras with a laser pointer). Along these lines, regarding the method of bagging cameras, the site says: "To Bag a camera theres a high chance that you can reach it with ease. If this is the case dont hesitate to smash the glass, lens and any other components. Dont bag it afterwards, people need to see the units smashed" (spelling errors and abbreviations are accurate to the website and serve to reinforce the low-tech aesthetic). Other values communicated by the guide are those of fun, efficiency, and permanence. The paint gun method is celebrated as being "Fast, fun and easy," and therefore "Highly recommended" - and, similarly, cutting camera cables emits "Satisfying sparks." The paint gun method is also touted for its relative efficiency ("one hour action can easily take out 10 cameras") compared with the laser pointer, which has questionable efficacy and is therefore not recommended. Finally, permanently destroying equipment is valued more highly than temporarily disabling it: cutting cables "Requires complete costly rewiring" and block drops on cameras will "totally" destroy them "in a shower of sparks." With the paint gun method, by contrast, the camera is "easily cleaned," so the intervention is "only effective for short time only." These counter-surveillance activities are intended to be social and to raise social awareness. A section on "working together" highlights the importance of trusting those you work with and getting to know their strengths and weaknesses. And clearly disabled cameras, much like the website itself, are intended to alert people to the prevalence of unregulated surveillance. (One might even say that camera destruction by activists is not a goal of the website at all; that it instead seeks to provoke the public to insist that controls be placed on surveillance proliferation.) The subtext of the site is one of overcoming both conformity and compliant adaptation to the surveillance society. This message can be read in passages on physical training ("Dont go to the gym - you need to be deconditioned not conditioned") or on learning one's territory ("Don't use paths or streets [only cross them at right angles]"). In this document, embodied social practice and acts of reflexive subversion serve as responses to surveillance technologies that are seen as socially sterilizing.


***

On the surface, the Institute for Applied Autonomy's iSee project and ®™ark's guide to surveillance camera destruction may seem like radically different responses to public surveillance. iSee is a high-tech application facilitating circumvention of video surveillance through its generation of paths of least surveillance. The guide to camera destruction, on the other hand, encourages resistance against public surveillance through low-tech, neo-Luddite attacks. Both forms of counter-surveillance, however, focus their attention and critique on the technologies of surveillance, which act as material representations of large-scale monitoring regimes. Neither of them directly targets the public and private institutions that are mobilizing surveillance or the individuals within these institutions. 4 The few glimpses these groups offer of their social or political adversaries reveal them as individual police officers or private security guards who are emboldened by the technologies. Both groups note the valence of video surveillance to amplify existing conditions of discrimination or abuse while obscuring the actors behind the scenes. The IAA's information page for iSee mobilizes social science research on surveillance, complete with academic citations, to thoroughly document trends toward increased abuse of marginalized groups with video surveillance, most notably of minorities, women, youth, outsiders (such as the homeless), and activists. Part of the reason that the technologies lend themselves to these uses, the site explains, is that policies for surveillance oversight, access, or retention are either purposely opaque or nonexistent; this is especially true in the United States because so much public surveillance in that country is conducted by private companies, so the equipment and footage are privately owned. Of course, it should be pointed out that, regardless of regulation or oversight, the technologies themselves insulate the operators from immediate, if not all, scrutiny, thereby encouraging widespread voyeurism of women (Norris and Armstrong 1997; Koskela 2000; Rosen 2001) and profiling of racial and other minorities (Lyon 2001). The ®™ark site echoes these sentiments with quotes from well-known surveillance studies scholars (i.e. Clive Norris, Gary Armstrong, and Jason Ditton), but otherwise answers the question of "Why destroy CCTV cameras" with the simple response of "Trust your instincts." Because in the eyes of these groups surveillance technologies catalyze abuse by individuals, the answer is to draw attention to the cameras through provocative websites or, more overtly, to circumvent and/or destroy them. The interventions of these activist groups concentrate on the technical side of modern surveillance, and, while they are explicitly critical of social or

4. In contrast, ®™ark's "The Yes Men" clearly do agitate for change on an institutional level (The Yes Men 2003). Their website explains this mission: "Identity theft: Small-time criminals impersonate honest people in order to steal their money. Targets are ordinary folks whose ID numbers fell into the wrong hands. Identity correction: Honest people impersonate big-time criminals in order to publicly humiliate them. Targets are leaders and big corporations who put profits ahead of everything else" (The Yes Men 2005).


institutional structures, the tendency is to individualize the problems by individualizing the abusive actors-who are identified either as police or security guards. Still, it must be noted that attention to police officers or other agents of surveillance is marginal in their presentations. As the next section will show, counter-surveillance can also be targeted at the institutional agents enmeshed within corporate or law-enforcement systems and/or located behind the cameras.

Social Interventions

While some counter-surveillance activities, such as those described above, direct criticism at the technologies themselves, other modes of intervention seek to engage with specific agents of surveillance (e.g. camera operators, police, security personnel, etc.). This section will investigate two of these social interventions: Steve Mann's "Shooting Back" project and performances by the Surveillance Camera Players. As with the IAA and ®™ark, these interventions represent two ends of the spectrum from high-tech to low-tech (see Table 1), and, as I will argue, they have similar difficulty in moving critiques of surveillance beyond the level of the individual to their larger institutional and political origins (Mann 2004).

Table 1 Counter-surveillance interventions

                        Arena of intervention
Mode of intervention    Technical                          Social
High-tech               Institute for Applied Autonomy     Shooting Back
Low-tech                ®™ark                              Surveillance Camera Players

Shooting Back

Drawing explicitly upon military metaphors, Steve Mann's Shooting Back project utilizes wearable, high-tech surveillance devices to take video footage "shots" of security personnel and other workers in privately owned stores and shops. Mann (or his collaborators) is equipped with two sets of recording technologies for this project: a covert wearable camera, either implanted in sunglasses or a baseball cap, and a handheld video recorder that is kept concealed in a bag until needed as a "prop." For the intervention, Mann first walks into a store that has a fairly obvious surveillance system, with tinted glass domes in the ceiling, for example. He then asks clerks, managers, and/or guards what the glass domes are for and receives a variety of responses along the lines of "I don't know" or "They're light fixtures" or "They're for security, but you don't need to worry about them if you're not doing anything wrong." All of these interactions are recorded with Mann's hidden, wearable camera.


Next, Mann removes the handheld video recorder from his bag and points it in the faces of his interlocutors. As might be expected, they promptly shy away from the recorder, tell him that pictures or other recordings are disallowed in the store, and ask him to leave. Mann responds by parroting their earlier words about not needing to worry about recordings if one is not doing anything wrong and asking them why they are so uncomfortable. Then he leaves. The footage from these interactions is placed on websites for public viewing, and Mann has also created a documentary film from this material. 5 This counter-surveillance intervention is explicitly conceived of as an art project that appropriates surveillance technologies to challenge their dominant meanings and uses. Mann mobilizes a tactic he calls "reflectionism," or reflecting experiences of being under surveillance back on the surveillers, with the goal of destabilizing store employees to make them realize that they are merely "totalitarianist officials" involved in acts of blind obedience and conformity (Mann 2002). Mann writes:

It is my hope that the department store attendant/representative sees himself/herself in the bureaucratic 'mirror' that I have created ... [and that this helps them] to realize or admit for a brief instant that they are puppets and to confront the reality of what their blind obedience leads to. (Mann 2002, 541)

Beyond this (somewhat dubious) educational goal, the Shooting Back project further aspires to explode the rhetoric behind systematic public surveillance in places of commerce. For example, the project raises the following question: if surveillance is intended for public safety, then would not more cameras increase the potential for such safety? The answer is an obvious "no," because the primary (intended) function of cameras in stores is theft prevention, and they are as often trained on employees as on customers (Staples 2000). Shooting Back is a provocative project because it calls attention to the embodied experiences of watching and being watched, of recording and being recorded. Usual uses of video surveillance, in contradistinction, tend to erase all sense of embodied action and interaction through their ambiguity (e.g. you do not know who is watching or when), through their integration into infrastructure (e.g. they become the taken-for-granted backdrop to social life), and through their mediation of experience (e.g. camera operators may feel a disconnect from those they are watching, and vice versa). Shooting Back disrupts the illusion of detached, objective, impersonal, disembodied monitoring - a camera in one's face personalizes the experience of being recorded in a very direct and uncomfortable way. One can speculate that the project is especially destabilizing and annoying for employees, because for them store surveillance systems and monitoring practices are institutional projections that they are relatively powerless to alter.

5. wearcam.org/shootingback.html


Mann's rather unforgiving denunciation of individuals working in stores, however, reveals certain assumptions about the problems of modern surveillance. First, by criticizing employees as being "puppets" who blindly accept their companies' explanations for surveillance and comply with company policies, Mann implies that all individuals are rational actors with equal social and economic footing. Thus, if low-income employees elect not to fight the system like he does, then they must be either ignorant or weak-willed, or both. Second, by calling store clerks and security guards representatives of totalitarian surveillance regimes, Mann conflates individuals with the institutions of which they are a part, effectively sidestepping the important but more difficult problem of changing institutional relations, structures, or logics. Both of these assumptions lead to the conclusion that one can contend with the problem of rampant surveillance by intervening on the level of the individual and by educating people about their complicity with the systems. Unfortunately, the facts that people have very real dependencies upon their jobs and that vast asymmetrical power differentials separate workers from the systems they work within (and perhaps from the activists as well) become unimportant issues once the critique of surveillance is abstracted and individualized in this way.

Surveillance Camera Players

The Surveillance Camera Players (SCP) are a New York-based, ad-hoc acting troupe that stages performances in front of surveillance cameras in public places (SCP 2005b). 6 Founded in 1996 with a performance of Alfred Jarry's Ubu Roi in front of a subway station, they have since performed numerous play adaptations of famous (and not-so-famous) works of cautionary fiction or troubling nonfiction, ranging from George Orwell's Nineteen Eighty-Four to Wilhelm Reich's The Mass Psychology of Fascism (Burns 2001). Because most surveillance cameras are not sound-equipped, the troupe narrates their performances with large, white placard signs, which they hold up for remotely located camera operators to read. A performance of Nineteen Eighty-Four, for instance, uses placards describing scene locations (e.g. "ROOM 101") or key lines from the book (e.g. "WE ARE THE DEAD") (Marcus 2000). When possible, fellow troupe members document the plays with video cameras and distribute information brochures to curious spectators. The players are routinely confronted by security guards or NYC police and asked to disperse, often before the conclusion of their performances. Up close, it appears as if the SCP are directing their messages at camera operators, police, or security guards. Their determination to notice and respond to video surveillance, rather than let it fade uninterrogated into the urban landscape, places them in confrontation with institutional representatives. By speaking to cameras (and their representatives), the actors come to be perceived as threats to the political and economic systems that support and indeed demand public surveillance, so institutional agents move in to contend with the perceived threat.

6. www.notbored.org/the-scp.html


threats to the political and economic systems that support and indeed demand public surveillance, so institutional agents move in to contend with the perceived threat. As with Steve Mann's concentration on individuals, SCP performances force interactions with others, and, because of this, they draw attention to the always present embodiment of surveillance technologies and the social relations they engender. If one takes a step back, however, the SCP are really performing for the public: they enroll the unwitting police and security personnel into their play so that the public can witness the spectacle and perhaps the absurdity of modern surveillant relations. The troupe acknowledges this staging explicitly: "The SCP no longer consider their primary audience to be the police officers and security guards who monitor the surveillance cameras installed in public places. Today, the SCP concentrate on the people who happen to walk by and see one of their performances" (SCP 2005a). In a mode true to their "situationist" theoretical orientation, the SCP affirm that the revolutionary potential of art thoroughly infuses everyday life because everyday life is a complex artistic performance. In this vein, the SCP seek to repoliticize the everyday by inviting the public to participate in their performances, by inviting all of us to recognize that we are already enmeshed in political performances and that we are required to act, and act well. The primary adversary for the SCP is the state. They are concerned about the erosion of public space and personal privacy brought about by the state's support of police surveillance and its permissive non-regulation of private surveillance. They write:

The SCP is firmly convinced that the use of video surveillance in public places for the purposes of law enforcement is unconstitutional, and that each image captured by police surveillance cameras is an unreasonable search. We also believe that it is irresponsible of the government to allow unlicensed private companies to install as many surveillance cameras as they please, and to install them wherever they please. (SCP 2001)

6. www.notbored.org/the-scp.html

The implication is that the state is not living up to its responsibility to safeguard civil liberties through improved regulation of public surveillance. Thus, SCP performances confront individual agents of public sector and private sector security, but their primary audience is the general public, whom they hope to cast as transformative actors who can collectively agitate for social change, especially on the level of public policy.

***

Both Steve Mann's Shooting Back project and the SCP performances intervene on an explicitly social level by challenging institutional agents of surveillance. Mann draws upon relatively sophisticated technical apparatuses to place store representatives in uncomfortable positions. By doing so, he aims to reflect


back to them the hypocritical logics and empty rhetoric that they impose upon others and to raise their awareness about their complicity with the surveillance society. The SCP, on the other hand, employ decidedly low-tech counter-surveillance props (e.g. signs and costumes) to address police and security guards with the aim of creating a public spectacle, and to raise public awareness about the everyday surveillance spectacle of which we are all already a part. These two interventions share in common their focus on individual representatives of institutionalized surveillance. By engaging with store employees or speaking to those behind the cameras, Mann and the SCP seek to reveal and challenge the larger structures and rationalities that those individuals represent. A key difference is that the SCP overtly enroll members of the public in activist performances, whereas Mann's project only invites public involvement through the technical mediation of websites. Because of this difference, the SCP seem more successful at moving beyond their initial site of intervention (the individual) to critique institutions for their dominance over the public (a relationship betrayed by the ironic juxtaposition of police removing SCP performers from public streets while private companies remain free to monitor the public at will). While each of the four counter-surveillance interventions covered so far seeks to raise public awareness and to mobilize for social change, none of them is completely successful at moving its critique from the individual to the institutional plane. The SCP come closest to doing this, but so far their plays remain too isolated and discrete to effect long-term change. This deficiency may be in part because activists construct surveillance problems in individualized and abstracted terms in order to make them somewhat tractable and receptive to intervention. The challenge lies in ratcheting up the unit of analysis to the institutional level so that lasting change can be effected. The desired outcomes might take the form of better regulation and oversight of surveillance and/or meaningful democratic participation in the process of setting surveillance policies, for instance. In the long run, as the next section will argue, the oppositional framing of surveillance versus counter-surveillance may be counterproductive for achieving these goals.

Counter-surveillance and Global Systems of Control

When viewed from a distance, surveillance and counter-surveillance appear to be engaged in a complicated dance, with the larger, cumbersome partner pushing and pulling while the smaller, defter dancer negotiates herself or himself, and others, out of harm's way. The oafish leader is, of course, the state and corporate apparatus surveilling the public, and the partner is the collective of activist adversaries circumventing or destabilizing surveillance systems. Drawing upon Michel Foucault's insights about the disciplinary potential of modern bureaucratic regimes, one could read this as a disciplinary or panoptic relationship (Foucault 1977). But Foucault was also insistent upon the productive


capacity of power to generate and sustain social relations apart from any property of control that might be possessed by individuals. As Gilles Deleuze wonderfully explicates: "Power has no essence; it is simply operational. It is not an attribute but a relation: the power-relation is the set of possible relations between forces, which passes through the dominated forces no less than through the dominating ..." (Deleuze 1988, 27). Therefore, the metaphor of the panopticon (or all-seeing prison) is not a static or transcendent statement of disciplinary power, but is instead a contingent and situated articulation of modernity in a fluid field of production regimes (Foucault 1980; Deleuze 1992). In explicit response to Foucault's work, Michel de Certeau's book The Practice of Everyday Life provides a point of departure for thinking about the agency of individuals and groups within disciplinary power structures. For de Certeau (1984), the practices of the dominant dancer clearly would be strategic ones of building control structures to regulate the activities of those in the field of power, whereas the practices of the defter dancer would be much more tactical, poaching off the existing structures to create new meanings and possibilities. The two dancers may be in opposition, but that does not change the fact that they are engaged in a reciprocal relationship and collective activity, but importantly without comparable degrees of power. It is this tense connection that is worth probing, even if there is never an embrace or a union, because after all the exchanges of strategic structuring and tactical appropriation the dance has moved somewhere across the floor and created a pattern, or a logic, or a world that was not there before.7 Examples of this problematic, if not dialectical, relationship between surveillance and counter-surveillance practitioners abound. After the beating of Rodney King in Los Angeles was captured on videotape in 1991, this did not necessarily catalyze correctives to police brutality, nor did it motivate greater police engagement with urban communities. Instead, police have seemingly used this event to distance themselves further from communities and to maintain antagonistic relationships with them (Klein 1997; Monahan 2002), while learning from the blow-up that they must exert greater control over the conditions where brutality occurs. This enhanced and learned control can be seen in the torture case of Haitian worker Abner Louima by the New York City Police in 1997. Louima was beaten in a vehicle on the way to the 70th Precinct station house and was then sodomized with the stick from a toilet plunger in the police restrooms (Mazelis 1997; Jeffries 2002). Regardless of the fact that the story did finally emerge, the police officers obviously exercised extreme caution in regulating the places of abuse (i.e. a police vehicle and a police restroom), and one can speculate that this level of control was a response to their fear of being under surveillance, and thus being held accountable, for their actions.

7. Cameron (2004) likens this type of movement to "spy vs. spy" behavior, noting that "Choosing to address the problems of surveillance through technological fixes opens up some strategic options and shuts down others," perhaps deepening our "subjection" (Cameron 2004, 143).


Another example of the dance of surveillance and counter-surveillance can be witnessed in the confrontations occurring at globalization protests throughout the world. Activists have been quite savvy in videotaping and photographing police and security forces as a technique not only for deterring abuse, but also for documenting and disseminating any instances of excessive force. According to accounts by World Trade Organization protesters, the police, in turn, now zero in on individuals with video recorders and arrest them (or confiscate their equipment) as a first line of defense in what has become a war over the control of media representations (Fernandez 2005). Similarly, vibrant Independent Media Centers are now routinely set up at protest locations, allowing activists to produce and edit video, audio, photographic, and textual news stories and then disseminate them over the Internet, serving as an outlet for alternative interpretations of the issues under protest (Breyman 2003). As was witnessed in the beating of independent media personnel and the destruction of an Indymedia center by police during the 2001 G8 protests in Genoa, Italy (Independent Media Center Network 2001; Juris 2005), those with institutional interests and power are learning to infiltrate "subversive" counter-surveillance collectives and vitiate their potential for destabilizing the dominant system. A final telling example of the learning potential of institutions was the subsequent 2002 G8 meeting held in Kananaskis, a remote and difficult-to-access mountain resort in Alberta, Canada. Rather than contend with widespread public protests and a potential repeat of the police violence in Genoa (marked by the close-range shooting and death of a protester), the mountain meeting exerted the most extreme control over the limited avenues available for public participation: both reporters and members of the public were excluded, and a "no-fly zone" was enforced around the resort. It could be that grassroots publicizing of protests (through Indymedia, for example) is ultimately more effective than individualized counter-surveillance because it is a collective activity geared toward institutional change. While the removal of the 2002 G8 meetings to a publicly inaccessible location was a response to previous experiences with protesters and their publicity machines, this choice of location served a symbolic function of revealing the exclusionary elitism of these organizations, thereby calling their legitimacy into question. So, whereas mainstream news outlets seldom lend any sympathetic ink or air time to anti-globalization protests, many of them did comment on the overt mechanisms of public exclusion displayed by the 2002 G8 meeting (CNN.com 2002; Rowland 2002; Sanger 2002). Michael Hardt and Antonio Negri (2000) would describe these ongoing exchanges between dominant and subordinate groups as a mutual and perhaps unwitting advancement of "Empire," the larger system of global capitalism and its colonization of lifeworlds. They note, for instance, how humanitarian efforts by western countries first establish discursive universal orders, such as "human rights," as justification for intervention, and then how these universals are capitalized upon by military and economic institutions as rationales for imperialistic invasions. Similarly, activist struggles appear to teach the system


of global capitalism, or those manning its operations, how to increase strategic efficiency by controlling spaces available for political opposition. From this perspective, the flexible ideologies of the 1960s counterculture movements may have disturbed the capitalist system, but in doing so also described a new territory (the self) and a new mode of operation for the growth of capitalism:

Capital did not need to invent a new paradigm (even if it were capable of doing so) because the truly creative moment had already taken place. Capital's problem was rather to dominate a new composition that had already been produced autonomously and defined within a new relationship to nature and labor, a relationship of autonomous production. (Hardt and Negri 2000, 276)

The post-Fordist colonizations of public spaces and resources today are outgrowths of an earlier colonization of "flexibility" as a viable and successful challenge to the rigidities of technocratic bureaucracies. I would build upon these observations to say that the conflicts between surveillance and counter-surveillance practices today represent a larger struggle over the control of spaces and bodies. It is doubtful that police or security forces are intentionally manipulating spaces and bodies with surveillance and other strategies because they explicitly wish to neutralize democratic opportunities; in fact, they most likely believe that their actions of social control are preserving democracy by safeguarding the status quo (Monahan 2006b). Be that as it may, such activities advance neoliberal agendas by eliminating spaces for political action and debate, spaces where effective alternatives to economic globalization could emerge and gain legitimacy if they were not disciplined by police and corporate actions. Therefore, it should not be seen as a coincidence that the demise of public spaces is occurring at the same time that spatial and temporal boundaries are being erased to facilitate the expansion of global capital. The two go hand in hand. Whereas one can readily critique Hardt and Negri for their attribution of agency to capitalism or to the amorphous force of "Empire," their systemic viewpoint is worth preserving in what has become a contemporary landscape of social fragmentation, polarization, and privatization. Dominant and subordinate groups serve as asymmetrical refractions of each other in emerging global regimes. Surveillance and counter-surveillance are two sets of overlapping practices selectively mobilized by many parties in this conflict, but the overall effect is unknown.

Conclusions

Are counter-surveillance activities political interventions? Yes, they are clearly political. The central question remains, however, as to which counter-surveillance configurations provide productive critiques and interventions. Because counter-surveillance movements, in my definition of them, seek to correct unequal distributions of power, they do destabilize status quo politics on a


case-by-case basis: on the ground, at specific, temporally bounded sites of contestation. If our vantage point is once removed, however, then individualized counter-surveillance efforts appear to provide the necessary provocations for those with institutional power to diagnose and correct inefficiencies in their mechanisms of control. Even if this second conclusion is persuasive, however, it should not imply that activists and counter-surveillance practitioners should dispense with their interventionist projects, but instead that they should diligently avoid reproducing the exclusionary logics and reactionary stances of those whom they critique. For instance, high-tech interventions may attract public attention because of their innovative use of technologies, but they can defy replication by others without comparable technical capabilities or resources. Furthermore, focusing on individual agents of surveillance (such as store clerks, security guards, camera operators, or police) artificially reduces the complexity of the problem: many of these individuals are underpaid yet completely dependent upon their jobs, so they might be easy targets, but not necessarily the best ones. The strength of social movements lies in their inclusiveness and in their participatory structures (Breyman 2001; Juris 2004). So while these attributes might signify areas of vulnerability for activists, they remain the magnets that draw people into movements and mobilize them behind causes; they are the qualities that need to be nourished for less individualistic and more effective activism to take root.

Acknowledgements

The author would like to thank the Institute for Applied Autonomy for its support and Michael Musheno and two anonymous reviewers for generous comments on an earlier draft of this article.

Arizona State University, School of Justice & Social Inquiry, USA

References

ACLU. 2004. The surveillance industrial complex. New York [accessed 20 October 2006]. Available from http://www.aclu.org/FilesPDFs/surveillance_report.pdf; INTERNET.
Bijker, W. E., and J. Law. 1992. Shaping technology/building society: Studies in sociotechnical change. Cambridge, Mass.: MIT Press.
Bijker, W. E., T. Hughes, and T. Pinch. 1987. The social construction of technological systems: New directions in the sociology and history of technology. Cambridge, Mass.: The MIT Press.
Bowker, G. C., and S. L. Star. 1999. Sorting things out: Classification and its consequences. Cambridge, Mass.: MIT Press.
Breyman, S. 2001. Why movements matter: The West German peace movement and U.S. arms control policy. Albany, N.Y.: State University of New York Press.


———. 2003. Moyens de communication, mobilisation rapide et actions préventives contre la guerre. EcoRev: Revue Critique d'Écologie Politique 12 (Printemps):37-44.
Burns, A. 2001. Surveillance camera players. Disinformation, 17 January [accessed 20 October 2006]. Available from http://www.disinfo.com/archive/pages/dossier/id323/pg1/; INTERNET.
Caldeira, T. P. R. 2000. City of walls: Crime, segregation, and citizenship in São Paulo. Berkeley, Calif.: University of California Press.
Cameron, H. 2004. CCTV and (In)dividuation. Surveillance & Society 2(2/3):136-144 [accessed 20 October 2006]. Available from http://www.surveillance-and-society.org/articles2(2)/individuation.pdf; INTERNET.
Castells, M. 1996. The rise of the network society. Cambridge, Mass.: Blackwell Publishers.
CNN.com. 2002. World leaders prepare for G8 summit [accessed 20 October 2006]. Available from http://archives.cnn.com/2002/WORLD/americas/06/25/g8.summit/; INTERNET.
Davis, M. 1990. City of quartz: Excavating the future in Los Angeles. New York: Vintage Books.
de Certeau, M. 1984. The practice of everyday life. Translated by S. Rendall. Berkeley, Calif.: University of California Press.
Deleuze, G. 1988. Foucault. Translated by S. Hand. Minneapolis, Minn.: University of Minnesota Press.
———. 1992. Postscript on the societies of control. October 59:3-7 [accessed 20 October 2006]. Available from http://www.nadir.org/nadir/archiv/netzkritik/societyofcontrol.html; INTERNET.
Democracy Now! 2004. Yes Men hoax on BBC reminds world of Dow Chemical's refusal to take responsibility for Bhopal disaster, 6 December [accessed 20 October 2006]. Available from http://www.democracynow.org/article.pl?sid=04/12/06/1453248; INTERNET.
Dery, M. 1994. Hacking Barbie's voice box: "Vengeance is mine!" New Media, May [accessed 20 October 2006]. Available from http://www.levity.com/markdery/barbie.html; INTERNET.
DowEthics. 2004. Dow "help" announcement is elaborate hoax. The Dow Company [accessed 18 February 2005]. Available from http://www.dowethics.com/r/about/corp/bbc.htm; INTERNET.
Fernandez, L. 2005. Policing protest spaces: Social control in the anti-globalization movement. Doctoral diss., School of Justice and Social Inquiry, Arizona State University, Tempe, Ariz.
Foucault, M. 1977. Discipline & punish: The birth of the prison. New York: Vintage Books, Random House.
———. 1980. Power/knowledge: Selected interviews and other writings, 1972-1977. Brighton, England: Harvester Press.
Gandy, O. H. 2003. Data mining and surveillance in the post-9/11 environment. In The intensification of surveillance: Crime, terrorism and warfare in the information age, edited by K. Ball and F. Webster. Sterling, Va.: Pluto Press, 26-41.
Graham, S., and S. Marvin. 2001. Splintering urbanism: Networked infrastructures, technological mobilities and the urban condition. New York: Routledge.
Greenberg, B. 1994. The BLO, Barbie Liberation Organization, strikes. Associated Press [accessed 20 October 2006]. Available from http://www.etext.org/Zines/UnitCircle/uc3/page10.html; INTERNET.
Hardt, M., and A. Negri. 2000. Empire. Cambridge, Mass.: Harvard University Press.
Harvey, D. 1990. The condition of postmodernity: An enquiry into the origins of cultural change. Cambridge, Mass.: Blackwell.


Independent Media Center Network. 2001. IMC statement on Genoa police raid [accessed 1 October 2003]. Available from http://italy.indymedia.org/news/2001/07/7092.php; INTERNET.
Institute for Applied Autonomy. 2003. IAA website [accessed 20 October 2006]. Available from http://www.appliedautonomy.com/; INTERNET.
Jeffries, J. L. 2002. Police use of excessive force against black males: Aberrations or everyday occurrences. Journal of Mundane Behavior 3(3) [accessed 20 October 2006]. Available from http://www.mundanebehavior.org/issues/v3n3/jeffries.htm; INTERNET.
Juris, J. S. 2004. Digital age activism: Anti-corporate globalization and the cultural politics of transnational networking. Unpublished PhD dissertation, University of California, Berkeley.
———. 2005. The new digital media and activist networking within anti-corporate globalization movements. The Annals of the American Academy of Political and Social Science 597(1):189-208.
Klein, N. M. 1997. The history of forgetting. New York: Verso.
Koskela, H. 2000. "The gaze without eyes": Video-surveillance and the changing nature of urban space. Progress in Human Geography 24(2):243-265.
Kupchik, A., and T. Monahan. 2006. The new American school: Preparation for post-industrial discipline. British Journal of Sociology of Education 27(5):617-631.
Low, S. M. 2003. Behind the gates: Life, security and the pursuit of happiness in fortress America. New York: Routledge.
Lyon, D. 2001. Surveillance society: Monitoring everyday life. Buckingham, England: Open University.
Mann, S. 2002. "Reflectionism" and "diffusionism". In CTRL [space]: Rhetorics of surveillance from Bentham to Big Brother, edited by T. Y. Levin, U. Frohne and P. Weibel. Cambridge, Mass.: MIT Press.
———. 2004. Shooting Back [accessed 20 October 2006]. Available from http://wearcam.org/shootingback.html; INTERNET.
Marcus, G. 2000. Real life rock top 10. Salon.com, 3 April [accessed 20 October 2006]. Available from http://archive.salon.com/media/col/marc/2000/04/03/marcus17/index1.html; INTERNET.
Marx, G. T. 2003. A tack in the shoe: Neutralizing and resisting the new surveillance. Journal of Social Issues 59(2):369-390.
Mazelis, F. 1997. What the torture of Abner Louima shows: Capitalism and police brutality. The International Workers Bulletin [accessed 20 October 2006]. Available from http://www.wsws.org/public_html/iwb9-22/louima.htm; INTERNET.
Monahan, T. 2002. Los Angeles studies: The emergence of a specialty field. City & Society XIV(2):155-184.
———. 2005. Globalization, technological change, and public education. New York: Routledge.
———. 2006a. Electronic fortification in Phoenix: Surveillance technologies and social regulation in residential communities. Urban Affairs Review 42(2):169-192.
———. 2006b. Securing the homeland: Torture, preparedness, and the right to let die. Social Justice 33(1):95-105.
———, ed. 2006c. Surveillance and security: Technological politics and power in everyday life. New York: Routledge.
Norris, C., and G. Armstrong. 1997. The unforgiving eye: CCTV surveillance in public space. Report for the Centre for Criminology and Criminal Justice, Hull University.
O'Harrow, R. 2005. No place to hide. New York: Free Press.
Reiman, J. H. 1995. Driving to the panopticon: Philosophical exploration of the risks to privacy posed by the highway technology of the future. Santa Clara Computer and High Technology Law Journal 11(1):27-44.


Rosen, J. 2001. A cautionary tale for a new age of surveillance. New York Times Magazine, 7 October [accessed 20 October 2006]. Available from http://www.schizophonia.com/archives/cctv.htm; INTERNET.
Rowland, R. 2002. Security at G-8; watching on three fronts. CBC News Online, 24 June [accessed 20 October 2006]. Available from http://www.cbc.ca/news/features/g8/security.html; INTERNET.
RTMark. 2000. The Barbie Liberation Organization [accessed 18 February 2005]. Available from http://www.rtmark.com/blo.html; INTERNET.
———. 2001. Guide to closed circuit television (CCTV) destruction [accessed 20 October 2006]. Available from http://www.rtmark.com/cctv/; INTERNET.
Sanger, D. E. 2002. In Canada, world's most exclusive summer camp. New York Times, 27 June:15.
Schienke, E. W., and Institute for Applied Autonomy. 2002. On the outside looking out: An interview with the Institute for Applied Autonomy. Surveillance & Society 1(1):102-119 [accessed 20 October 2006]. Available from http://www.surveillance-and-society.org/articles1/iaa.pdf; INTERNET.
Staples, W. G. 2000. Everyday surveillance: Vigilance and visibility in postmodern life. Lanham, Md.: Rowman & Littlefield Publishers.
Surveillance Camera Players. 2001. Why legal action should be taken against the City of New York for its installation of surveillance cameras in public places [accessed 17 February 2005]. Available from http://www.notbored.org/to-the-lawyers.html; INTERNET.
———. 2005a. Founding documents of the Surveillance Camera Players [accessed 20 October 2006]. Available from http://www.notbored.org/scp-founding.html; INTERNET.
———. 2005b. New York Surveillance Camera Players [accessed 20 October 2006]. Available from http://www.notbored.org/the-scp.html; INTERNET.
The Yes Men. 2003. The Yes Men, film directed by D. Ollman, S. Price and C. Smith. Yes Men Films LLC, MGM.
———. 2004. The Yes Men hijinks: Dow [accessed 20 February 2005]. Available from http://theyesmen.org/hijinks/dow/; INTERNET.
———. 2005. The Yes Men [accessed 20 February 2005]. Available from http://theyesmen.org/; INTERNET.
Winner, L. 1986. The whale and the reactor: A search for limits in an age of high technology. Chicago, Ill.: University of Chicago Press.
Zukin, S. 1995. The cultures of cities. Cambridge, Mass.: Blackwell.


[12]
Resistance against Cyber-Surveillance within Social Movements and How Surveillance Adapts

Oliver Leistert
Research Fellow, Center for Media and Communication Studies, Central European University Budapest. [email protected]

Abstract

Activists around the world have developed practices and are taking distinct measures to resist cyber-surveillance. These range from using code words and taking out mobile phone batteries during meetings to the use of privacy enhancing technologies. This article discusses such measures by providing interviews with activists from a variety of countries, as well as by analyzing documents from German law enforcement agencies in a recent case against activists. These documents reveal that the meta-data produced via mobile telephony is at least as important for law enforcement as the content of the calls. Furthermore, if there is not enough meta-data, law enforcement will produce it to get to know the whereabouts of activists. This article thus argues that a mutual relationship between resistance and surveillance unfolds as one side reacts to the practices of the other: as soon as activists advance in the protection of the contents of their telecommunication, the surveilling parties concentrate on meta-data to explore the whereabouts of their targets. To counter this threat, only the discontinuation of mobile phone use has been articulated.

Introduction

Actually, the problem here is mainly the killing of activists and journalists. The son of a press freedom icon, the leader of the alternative press during the Marcos years, his son was abducted two or three years ago and not yet released. It is suspected that he died (Eder / Manila).

Ederic Eder, founding member of Txt-Power, a civil society group set up after the Second People Power revolt in Manila in 2000 that advocates free SMS in the Philippines, is pointing out the most drastic dangers of advocating press freedom. But even if it is not murder, from torture to soft repression to career endings, there are many risks involved in being an activist for a common cause. Perhaps all these risks can be limited if appropriate measures are taken with regard to how activists and advocates communicate. A fundamental issue for the needs and demands of activists remains the technology involved and its vulnerability to different kinds of breaches, which may stem from the lawful or unlawful interception of telecommunication. The outcome of such interceptions is generally used as a base for further profiling, possibly leading to repression at some stage. This article provides a detailed look at the practices on both sides: the surveilled and the surveilling parties. While my aim is not to generalize on the basis of what is presented here, the general outline argues in favor of the same problem globally: as activists develop mobile media practices that try to mitigate surveillance, such as by using code words or even encryption, the surveilling parties adapt, as they have access to the telecommunication infrastructures used by activists and thus can access communication and the meta-data thereof in total secrecy. By showing this asymmetrical power relation in detail via


interviews and a case study based on court documents, it becomes clear that as protest practice integrates mobile media into its domain of agency, the very same practice empowers the surveilling parties.

In the first part of this article, I recount a collection of statements given by activists who are engaged in fights and campaigns for social justice, focusing on how activists protect themselves against digital surveillance or cyber-surveillance. The second part, by contrast, examines evidence provided to the courts in Germany in a recent § 129a case, which has since been dropped. These documents provide valuable insight into the practices of state cyber-surveillance and as such give an indication of the successes and failures of the measures taken by activists for protection. What is more, it may be possible to draw out alternative measures based on this analysis.

The case study of the § 129a case shows how much it is the meta-data of telecommunication that compromises the telecommunications amongst activists. Of course, content is of interest for the surveilling parties, too. But as this case study shows, law enforcement agencies (LEAs) even produce meta-data on their own, by sending silent SMS, to obtain a more finely granulated perspective on the whereabouts of those surveilled. Therefore, whatever defensive practices the activists are developing with their mobile phones, whether it is using code words or using the phone only very rarely, the simple fact of having it switched on is enough to be followed. But while switching off the phone is the only way to remain under the radar, this practice then, as the case study shows, triggers much more suspicion, leading to longer periods under surveillance.

By counterposing interview statements of activists in Part I with the documented practice of German LEAs in Part II, I want to show that the technical infrastructure neatly serves as a surveillance infrastructure and that activists have no direct means to protect themselves from this threat but to switch off their devices. Thus, it would be appropriate to speak of mobile telephones as a hybrid, quasi dual-use technology that empowers and represses, possibly at the same time.

The relation of Parts I and II is mutually inclusive: as activists choose specific means to counter the surveillance scenario posed by mobile media while still maintaining its empowering aspects, LEAs also develop specific practices to make mobile media use in activism productive for them. This is a dynamic within specific power relations and power struggles. Activists interviewed in Part I responded to surveillance by masking the content of their communication. But as Part II demonstrates, surveillance has iterated since and responds in a new way by targeting meta-data, or data about the communication. This can be understood as a dialectical relation or a power dynamic, as Fernandez and Huey (2009) suggest. They recommend that we "examine instances of resistance first, since they are likely going to be not only a response to surveillance practices but also present the new starting ground for the next set of surveillance mechanisms" (200). This article takes up just that challenge, and in some detail. Also, it must be said that the question of the trustworthiness of telecommunication in general is nothing new, and the history of efforts to secure communication is enormous and versatile (Kahn 1997).
Writing when email, the web and mobile telephony were still in a nascent state, two well-known scholars of cryptography, Landau and Diffie (1998), identified the potential impact on privacy as "profound":

Telecommunications are intrinsically interceptable, and this interceptability has by and large been enhanced by digital technology. Communications designed to be sorted and switched by digital computers can be sorted and recorded by digital computers. Common-channel signaling, broadcast networks, and communication satellites facilitate interception on a grand scale previously unknown. Laws will not change these facts (226).



Today, this early warning sounds rather soft. The means and tools of cyber-surveillance have developed tremendously since the late 1990s. In a dramatic interplay between technological development, the mass-scale dissemination of digital devices, a rigorous shift in policy after 9/11 on a global scale, and the ongoing pressure for survival caused by a deregulated global economy, the issues stated by Landau and Diffie affect contemporary users of digital communication. Still, as I show, the "dual use" aspect of mobile telecommunication needs to be taken into much more serious consideration in future debates about the surveillance of mobile phones.

Activism and Digital Communication

At any given moment we can be the subjects of surveillance. Police have the interest and the resources to practice surveillance. And there has recently been the creation of the cybernetic police on the federal level. There is also a similar entity in Mexico City. It is also well known that there are private technical teams, hired as mercenaries, to do these jobs. (Enrique / Mexico City)

For my current research I interviewed 50 activists from very different regions of the world about how they use mobile media¹ and about their thoughts on surveillance. Although from geographically and culturally distinct regions, all of them share the opinion that capitalism is systematically unjust and destructive, and thus they all engage in efforts to overcome capitalism and build a more just society based on solidarity and less mediated by financial means. Their activities span from human rights support to free public transportation actions; from providing non-commercial communication services for activists to running autonomous, non-state-funded social centers; from providing free meals through Food Not Bombs to documenting police brutality; from solidarity work for prisoners to supporting small self-organized local unions of fishermen. They all share a critique of the concept of the vanguard and reject vertical and hierarchical organizations. The exceptions are the interviewees from Pakistan, as these activists have all been engaged in what became known as the "Lawyers' Movement" (Ahmed 2010; Malik 2008) or "Anti-Emergency Movement" (Bolognani 2010) and fought for an independent judiciary and the rule of law in Pakistan. Politically, this movement has been very heterogeneous and includes participants from different backgrounds. In my interviews I was specifically interested in the general use of simpler mobile technologies, such as

SMS, rather than in more advanced technologies like smart phones and the extended capabilities they provide. This has allowed me to turn my attention to a variety of places, including São Paulo, Mexico City, Oaxaca, Tokyo, Manila and Bangalore. Although methodological problems occur and need to be reflected upon when one wants to compare the use of mobile media by activists and social movements in these places, the benefit is a patchwork of reported experiences that in general show the same difficulties and unsolved issues for the safety of activists in all areas. Of course, political regimes, jurisdiction, and law enforcement agencies differ widely. Nonetheless, the technologies involved are essentially the same all around the globe. Additionally, the massive roll-out of mobile phones in the Global South has changed and increased the activists' agency via mobile media as well. But still, I agree in many ways with what Christian Kreutz, a consultant for Information and Communication Technologies for Development, emphasizes:

If one takes a look at the examples and different approaches of mobile activism, many potential developments can be identified. All these trends will rely not so much on technology, but much more on the activist's ideas for how to use mobile phones as a means of activism and on a critical mass of people participating (Kreutz 2010: 18).

¹ Mobile media is an umbrella term that includes devices from mobile phones to laptops and the way they are used, although most of the time mobile phones are the devices used as mobile media. Radio is not part of this research.



This article does not focus on the actual impact mobile media has on activists, which is manifold both in terms of changes in agency and in the transformation of patterns of exclusion and decision-making. However, it is possible to neglect the specificities of locations to some extent. I share the assumption that

when a pattern of conduct (for example, the substantial enhancement of individual and collective autonomy by wireless communication capability) repeats itself in several studies in several contexts, we consider it plausible that the observation properly reflects the new realm of social practice (Castells et al. 2007: 3).

As SMS and basic mobile telephony are of special interest here, and as the infrastructure for mobile communications in all the regions I visited (except Japan) is compatible with or genuinely built on the Global System for Mobile Communications (GSM) standard, my findings can to some degree be generalized.

Taking Care about Content: Code Words and Written Words

Nearly all interviewees expressed strong concerns about the cyber-surveillance of their mobile and online communications. While very few stated that, because their activities were legal, they did not have to fear cyber-surveillance, all of them still understood cyber-surveillance as a means to silence, censor and repress. It is noteworthy that regardless of the political regime under which they were active or the human rights situation they were in, all activists decided to adopt specific measures to protect their activities and to safeguard the well-being of themselves and their colleagues. "We adjust. We change phones, although it is very expensive. For example, when there is a rally tomorrow, we say there is a festival tomorrow" (Mina, Minerva, Joan, Julie / Manila). The use of code words can be seen as a ubiquitous practice in activists' mobile communications. This is not very surprising, as this cultural technique is as old as the struggles themselves. Still, it expresses a deep concern about being monitored and eavesdropped on regularly. As such, it signifies a mistrust that strongly contradicts legislation, which in general provides at least some privacy in telecommunications. Mistrust towards the effectiveness of legislation thus expresses an even deeper concern, as it understands "the state" as a non-trustworthy entity, independent of what policy and laws prescribe. "Things that are very secret we don't talk about on the list. We then use some codes, like a call to a party, a lunch" (Legume / São Paulo). The list referred to is a simple mailing list, albeit one run by an activist tech collective. Many activists I interviewed differentiate between spoken and written communications, well knowing that the latter are more easily and more cheaply intercepted, as digital content can be processed easily by computing, i.e. by searching for key words. "To protect the secrecy, people are encouraged to use the telephone. The email is written down, so it is very easy to be surveilled, but voice phone is only caught by wiretapping, which is rare" (Yasuda / Tokyo). No differentiation is made between online and mobile media. "We don't write anything about politics in SMS" (Non Collective / Manila).
But the precautions taken can be much more substantial, leading to deliberate offline situations:

Regarding the mobile phone, if we have an action plan, and if we know the issue is sensitive or the movement is in a sensitive moment, we would not explicitly speak about it, and at the meeting, we take out all the batteries of the mobile phones until the meeting is over (Freddie / Hong Kong).

Mistrust of the devices one carries around all the time is clearly demonstrated by the fact that a good half of those interviewed, regardless of their region of activities, do not just turn off their device, but remove the battery. This is certainly a reasonable thing to do, although it is unclear whether mobile phones that are not specifically prepared can be activated and used as microphones remotely. Nonetheless, the safety gained is twofold: first, one goal of cyber-surveillance is to produce fear and suppress freedom of speech.


Therefore, any measure taken to feel safer is of high significance for political agency. Second, while one's own phone might not be the actual problem, those of others might be.² These drastic measures resonate with Braman's (2006) observation that in general the expectation of privacy has decreased, which she understands as an increasing asymmetry between human beings and their agency to cope with advanced surveillance technologies:

This loss of an actual expectation of privacy affects identity like other invasions of privacy do, but also provides a species-level challenge. As biological organisms, we still feel that if we pull the blinds and whisper, we will be private, though these actions are now irrelevant to the actuality; our senses and what we need to sense no longer operate at the same scale or level of granularity. Concerns about the impact of this trend include the likelihood of "anticipatory conformity" and decreased loyalty to a surveillance-driven government, as well as a chilling of association and free speech (Braman 2006: 130).

Although such technology can easily be rendered harmless by way of complete disconnection, the danger of offline surveillance remains. This is something an activist from Mexico, who prefers to remain anonymous,³ unambiguously states:

There was a time, when at a social center they had a sign on the wall that said "turn off your phone and take the battery out." They established this rule that everyone that would go to a meeting had to turn off the phone and take the battery out. The reason being that if you turned off the phone and left the battery in it, it could be used [so] that people would be able to listen to your conversation through your telephone. It was one of those things that I saw, where I thought: you should be more concerned about what you talk about in a local bar than taking the battery out of your cell phone (Anonymous / Mexico City).

As a general practice, taking out batteries seems to be a very strong indicator of how ambivalently activists see mobile phones. A critical use of mobiles and the knowledge of when not to use them is instrumental for professional organizers, such as Saldanha from Bangalore, who is very active in rural areas in India:

I try not to use my mobile for critical contents; I tend to use for that a landline. I prefer to meet people. Once we organized 5000 people: we got a call from the communities saying there is a public hearing and we need your help to mobilize, and they gave us two days time. We arrived one day before the public hearing and we met with the key leaders. So we thought that is simple, we just phone them, but they said: no, don't do that, turn off your phone, I want you to go house to house, mobilize them. It worked, next day there were 5000 people. If we had done it through SMS, there would have been counter-mobilization. So tactically, it was useful not to use it. It is so much easier for the police to tap you than to go and stand and watch you (Saldanha / Bangalore).

Thus, undeniably, a strategic use of mobile phones is key for political agency, which does not only choose amongst means of organizing, but may abandon mobile media altogether. In Oaxaca, Mexico, during the uprising of the teachers' union (APPO), which faced severe and deadly repression by federal police, the sheer availability of mobile telecommunications is already perceived dubiously:

² It is worth mentioning that the iPhone's warranty is voided when users do this.
³ A lot of the interviewees either use pseudonyms or wanted to remain totally anonymous for the sake of their personal security, a demand that I fully comply with due to my research ethics.


The mobile phone network was absolutely working during the 2006/2007 protests. Without interruptions. That is really strange, because Oaxaca was a strategic point of counterinsurgency and all this shit since 2006. In one minute they can shut down all the mobile phones, but they didn't. I don't know why. Maybe capitalism is bigger than we think. Here in Mexico, the boss of the cell phone networks is one of the richest in Mexico. Maybe in situations like this, surveillance and network are the same. This all is part of the contradiction (Blax / Oaxaca, Mx).

The asymmetric nature of surveillance also gives rise to interpretations of uncommon sounds and bursts as symptoms of surveillance. Unfamiliar noises and cracks during calls, for no apparent reason, thus become a signifier of cyber-surveillance, although there is no hard proof of it.

I am certain many of our phones are wiretapped, because weird things happen with the phone. Mainly because of things that have leaked and that could not have leaked otherwise. So, I am sure some phones are wiretapped. I am not sure what kind it is. During phone calls there are weird noises. Also when calling older activists (M / São Paulo).

The reality of eavesdropping on activists' communication lines is highlighted when, without public notice of a meeting, authorities are present at meeting points, or when activists are visited at home after some crucial phone calls they made. The connection of cracks during calls with subsequent real occurrences can still be wrong, though, as contemporary digital signals can be copied without any loss of quality, a different situation to analog telephony. Beyond a strong embodied practice of what one should or should not say on the phone or write down in digital communications, and the distrust in general towards these digital devices, there are also firmer measures taken to safeguard communication amongst some activists, such as privacy enhancing technologies.

Privacy Enhancing Means

Securing telecommunications is a hard task. Although the application of common tools like PGP and GnuPG⁴ is becoming more frequent amongst activists, very important issues remain unsolved. "We use GnuPG for email. We use TLS for the website" (Yasuda / Tokyo). An even more tech-savvy activist states: "Encryption, in email TLS/SSL, in Jabber OTR and SSL, and VPN to access data. But I do not encrypt my emails by default, because 90 percent don't use email encryption" (Iokese / Madrid). As large parts of the Global South lack access to the Internet for the average user and only few people have the financial means to use 3G phones, SMS remains the default solution for common telecommunications. In countries like India or the Philippines, SMS is reasonably cheap and affordable for large segments of the population. But enhancing the privacy of SMS remains largely unsolved. "If we can encrypt the messages we are sending, this would be very useful" (Mina, Minerva, Joan, Julie / Manila). Thus the mobile phone's use arguably remains limited for activists. "The mobile phone is used only regarding communication about having arrived somewhere, nothing else. In terms of Internet, we use PGP and for voice over IP we use Skype" (Francisco / Mexico City).

⁴ GnuPG, an open source software that allows users to encrypt and sign their data and communication, features a versatile key management system as well as access modules for all kinds of public key directories. PGP is its non-open-source counterpart, from which email encryption for the masses started in the 1990s.
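To give a concrete sense of what such privacy enhancing tools involve in practice, the following is a minimal sketch of encrypting a message body with GnuPG through the third-party python-gnupg wrapper. It is an illustration under stated assumptions, not a practice reported by the interviewees: the recipient address is a hypothetical placeholder, and a corresponding public key must already be present and trusted in the local keyring.

    # Minimal sketch: encrypting a message body with GnuPG via the
    # python-gnupg wrapper (pip install python-gnupg).
    # "activist@example.org" is a hypothetical placeholder; a trusted
    # public key for it must already be in the local keyring.
    import gnupg

    gpg = gnupg.GPG()  # uses gpg's default keyring (~/.gnupg)

    plaintext = "Meeting moved to the usual place, 19:00."
    encrypted = gpg.encrypt(plaintext, "activist@example.org")

    if encrypted.ok:
        # ASCII-armored ciphertext, safe to paste into an email body
        print(str(encrypted))
    else:
        # e.g. no trusted public key for the recipient was found
        print("Encryption failed:", encrypted.status)

Note that, consistent with the limits discussed below, such tools protect only the content of a message: the fact that it was sent, when, by whom and to whom remains visible to the provider as meta-data.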


While activists are taking a variety of heterogeneous measures to make their telecommunication safer, the actual interest of the surveilling parties has shifted from the content of telecommunications to data that, over a longer period of time, reveals something different to the interceptors: meta-data. Recent initiatives by the European Union, for example, to retain the meta-data of all telecommunications within the European Union for at least six months demonstrate the significance of such data to LEAs.⁵ There is no satisfying way, at least technologically, to protect oneself from such surveillance schemes, mainly because this data is a necessary condition for most of the communication technologies involved to function. Thus Landau (2010) comments:

It does not help to tell people to be secure. In order for their communications to be secure, security must be built into their communications systems. It must be ubiquitous, from the phone to the central office and from the transmission of a cell phone to its base station to the communications infrastructure itself (99).

All that is needed to identify patterns of communications and to reconstruct social relations, places and times, frequent whereabouts, and much more, is computational power and databases; very limited personnel is needed to collect and process meta-data. Not only does this meta-data facilitate analysis of the past, it also enables predictions about the future whereabouts of activists. Only a few activists expressed any concerns about meta-data surveillance at all; they were largely unaware or had at best an imprecise idea of its implications. The power of collected meta-data is not something one can easily see at work. It is exercised elsewhere, and symptoms of surveillance, be they imagined or not, like phones not working properly, do not arise from such measures. It is the infrastructure of telecommunications itself which delivers such data, and the telecommunications providers are the voluntary or involuntary helping hands. The collection and analysis of a specific group's communication meta-data is done unobtrusively. Often it is collected without specific reason and never gets used for further investigations. A short illustrative sketch of such analysis follows; the case from Germany discussed in the next section then demonstrates that LEAs even produce such meta-data themselves to track down suspects' locations and movements.
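How much can be read out of meta-data alone can be illustrated in a few lines of code. The records below are invented toy data, not drawn from any case material; they merely stand in for the kind of log entries (caller, callee, time, cell identifier) that a provider retains for every call or SMS regardless of its content.

    # Illustrative sketch with invented records: each tuple is
    # (caller, callee, hour_of_day, cell_id), the kind of meta-data a
    # provider logs for every call or SMS, independent of its content.
    from collections import Counter, defaultdict

    records = [
        ("A", "B", 9,  "cell-17"),
        ("A", "C", 9,  "cell-17"),
        ("A", "B", 21, "cell-42"),
        ("B", "C", 22, "cell-42"),
        ("A", "B", 21, "cell-42"),
    ]

    contacts = Counter()                 # who talks to whom, and how often
    whereabouts = defaultdict(Counter)   # where each phone usually is, by hour

    for caller, callee, hour, cell in records:
        contacts[(caller, callee)] += 1
        whereabouts[caller][(hour, cell)] += 1

    # The strongest ties in the social net of the monitored phones
    print(contacts.most_common(3))

    # A's most frequent (hour, cell) pairs: a crude movement profile
    # that also supports predictions about future whereabouts
    print(whereabouts["A"].most_common(2))

Nothing in this sketch listens to a single call; scaled up to months of retained records, the same counting and joining operations reconstruct social relations, routines, and likely future whereabouts.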

Section 129a and the Case of "MG" in Germany: It's the Infrastructure, Stupid!

So, the thing ... around mobile phones that most shocked me was when it was revealed that the US carriers were surveilling citizens without warrants, AT&T and these things. That was a big knock. Not that I was that surprised, but that it was just so common. That really led me to thinking that surveillance is more of a day-to-day problem (Freitas / NYC).

To illustrate the contemporary possibilities for authorities to investigate and surveil with the help of telecommunication providers, the case of the German MG (Militante Gruppe: militant group) is insightful. The material shown here had been delivered to the courts for preliminary proceedings by the investigating authorities themselves. This makes it highly valuable in a specific sense: usually it is only possible to interpret symptoms of surveillance or be faced with its consequences, whereas here a documented account of such measures is available.⁶ What conclusions the authorities drew from this material is not of primary interest. Of much greater interest is what kind of surveillance had been used, how data had been obtained, what data was specifically produced for monitoring, and so forth. For the subject of cyber-surveillance, only technical surveillance aimed at the telecommunication of the accused is represented here. Left aside are all other kinds of surveillance, like personal surveillance or video camera surveillance, which were conducted throughout all these years as well. What is more, details of the eavesdropping operations on phone calls are left out. It suffices to say that all calls to and from the accused's mobile and fixed-line

Surveillance & Society 9(4)

447

200

Security and Privacy Leister!: Resistance against Cyber-Survei/lance within Social Movements

Downloaded by [University of California, San Diego] at 07:06 21 April 2017

phones had been intercepted, as well as those of relatives and friends. The number of people affected by the surveillance operations adds up to more than 200, even though there were only three suspects. The primary purpose of presenting this material is to demonstrate the weakness of telecommunication infrastructure with regard to privacy. This example is illustrative, as no legal obstacles prevented the surveilling parties from their work in any way, due to the application of the anti-terror law §129a. These surveillance operations are presented here on the level of documented evidence and demonstrate, in a nutshell, the technical possibilities of telecommunication surveillance unleashed.

5 See my discussion of this EU directive in Leistert 2008. A recent study by the Fraunhofer Institute came to the conclusion that data retention does not significantly help law enforcement. See http://vds.brauchts.net/MPI_VDS_Studie.pdf.
6 The material documented here is provided by sources that prefer to remain anonymous. These are parts of scanned paper pages produced by LEAs. The court reference numbers are GBA 2 BJs 58/06-2 and ST 45 / ST 14 - 140011/06.

Section 129 of the German Criminal Code

Some context is needed beforehand: in Germany, law enforcement's legal means for infiltrating, surveilling and detaining political opponents after World War II have been steadily extended since the 1970s. The most prominent is §129a of the criminal code, dealing with terrorist organizations. Its older variant, §129, deals with criminal organizations and actually predates the Federal Republic of Germany. It was extensively abused during the Nazi era for the persecution of imagined or real opponents. Only a few years later, in the 1950s, hundreds of investigations against alleged communists and activists opposing the rearmament of West Germany were performed applying this controversial paragraph. Subsequently, §129a was introduced in the 1970s, covering terrorist organizations, and after 9/11 §129b was introduced, dealing with foreign terrorist organizations.7 These laws basically strip suspects of every last bit of their (privacy) rights and have been used effectively by LEAs, most frequently to update their knowledge on leftist activists. Hardly any of the numerous §129a investigations made it to court. More than 95% are silently shut down, often after years of very intense surveillance.8 The introduction of this paragraph in the 1970s was, amongst other means, meant to deal with the Red Army Fraction (RAF) and the Revolutionary Cells (RZ).9 Although these actors now belong to the past, the paragraph has continued to see numerous applications to investigate leftist or social justice groups.10 The application of §129(a,b) allows far-reaching surveillance not only of those under suspicion, but also of those who have been in contact with those under suspicion, even if only once. As the suspects are commonly engaged in a diversity of political fields, the application of such an investigation produces a complex reproduction of the social net of many politically active people, regardless of whether they are themselves suspects defined under the investigation or not. Sharing a flat, belonging to the family or working at the same company is enough to become a target of extensive surveillance once §129(a,b) is at work.

The MG Investigations

The MG investigations, into the alleged terrorist group known by these initials, started in 2001 (with some pre-proceedings by the German secret service since 1998).

7 An official English translation of the German criminal code incl. §129 can be found at http://www.gesetze-im-internet.de/englisch_stgb/englisch_stgb.html. But the surveillance means allowed to be deployed are defined in the code of criminal procedure (Strafprozessordnung, StPO). Especially StPO §100a (surveillance of telephone and post), §100c and §163 (long ongoing observations), §110a, §110c (systematic deploying of undercover agents and spies), and again §100c (surveillance of acoustics and images inside private homes) give almost unrestricted powers to LEAs. An English translation of the StPO can be found at http://www.gesetze-im-internet.de/englisch_stpo/englisch_stpo.html.
8 The surveillance conducted under these paragraphs does not always seek to remain unnoticed. The suspects thus react to the investigations and provide the LEAs further insights into their social net. Additionally, suspects reportedly have lost jobs and suffered psychologically from investigations. This is highly problematic, as it all happens even before pre-trial confinement.
9 Both were using force to achieve political goals; the big difference is that the RAF, whose founding generation is known as the Baader-Meinhof Group, went underground, whereas the RZ personnel had their normal day jobs while pursuing RZ activities at night. Their genealogy can be traced back to the events of 1968, in part even to the protest against the rearmament of West Germany in 1955, which was a big debate at the time.
10 Right-wing groups are very rarely targets of §129(a,b), although violence originating from right-wing groups in Germany, especially against humans, including murder, has continued since Germany's reunification.

Amongst the many different allegations were attacks on German military equipment. The material shown here is from the so-called "MG 1" investigation, which started in 2001 and ended in 2008. Others are still pending, as there have been numerous different investigations.11 The surveilled had been accused of forming a terrorist group. None of the accused in this case have been sentenced, and the case never became a regular court case.12 Additionally, on March 11, 2010, the Federal Supreme Court (BGH) ruled that the entire set of procedures used by the LEAs in this case (MG 1) had been unlawful and that the surveillance conducted was not appropriate, as there never was a reasonable enough suspicion.13 The data collected nonetheless remains in the police archives and has recently been transferred to the Berlin criminal state police. In the following passages, some details about the surveillance operations are explained. Translations are by the author.

Retained Meta-Data of (Mobile) Telephony

Figure 1 shows a typical Auskunftsersuchen (request for information), which reports call data from 1.10.2006 to 31.3.2007 (only 3.10. to 19.10.2006 is shown here, but the astonishingly long duration of six months is mentioned in the upper left area) for one MSISDN.14 Besides the number dialed, other information is printed: MCC15 262 refers to Germany, MNC16 01 refers to T-Mobile, the MSC-ID17 is responsible for the end-to-end connection, e.g. allowing hand-over requirements during the call. The Cell-ID references the actual cell the phone was logged into, and on the far right, the geographical position of this Cell-ID is printed. Generally, all telephone communication meta-data to and from the mobile phones of the suspects had been retained and provided to the authorities. These meta-data include, amongst other rather purely technical parameters, the phone numbers, the TMSI18 (if applicable), the duration of the call, the type of call (it differentiates between service call types), and the geo-coordinates of the cell where the client was connected. This means that during any communication over the phone, its geo-coordinates reveal the location of the person that used it. From a technical perspective, the same meta-data is produced by both successful and failed connections. Thus, unsuccessful communication also provides value for the surveillance operations.
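A rough sketch of how a single retained record of this kind ties a phone, and thus its user, to a place (Python; the record layout and all values are invented, loosely following the fields named above):

```python
from dataclasses import dataclass

@dataclass
class RetainedCallRecord:
    """One retained meta-data row, loosely following the fields described
    above (MSISDN, dialled number, MCC/MNC, Cell-ID, cell coordinates).
    The layout is illustrative, not any provider's actual format."""
    msisdn: str        # the monitored subscriber
    dialled: str       # number dialled (or calling, for incoming)
    mcc: int           # mobile country code, e.g. 262 = Germany
    mnc: int           # mobile network code, e.g. 01 = T-Mobile
    cell_id: str       # cell the phone was logged into
    lat: float         # geographical position of that cell
    lon: float
    duration_s: int    # 0 for failed connection attempts; still retained

rec = RetainedCallRecord("+49170...", "+4930...", 262, 1, "4711", 52.52, 13.40, 0)
# Even a failed call (duration 0) places the phone near a known cell site:
print(f"{rec.msisdn} was near ({rec.lat}, {rec.lon}) via cell {rec.cell_id}")
```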

11 Details about the cases, their timelines and each investigation are published at http://einstellung.so36.net.
12 Anne Roth, partner and mother of the children of one suspect in these investigations, Andrej Holm, has been blogging extensively about her life under total surveillance. Here, stunning details are published, partly in English: http://annalist.noblogs.org.
13 The ruling is online (German only): http://juris.bundesgerichtshof.de/cgi-bin/rechtsprechung/document.py?Gericht=bgh.


Part III Privacy, Data Protection and Security


[15] Data Protection Pursuant to the Right to Privacy in Human Rights Treaties

LEE A. BYGRAVE1

Abstract

This paper examines the extent to which the basic principles of data protection laws may be read into provisions in human rights treaties proclaiming a right to privacy. Two such provisions are analysed in detail: Art 17 of the International Covenant on Civil and Political Rights and Art 8 of the European Convention for the Protection of Human Rights and Fundamental Freedoms. Case law developed pursuant to both provisions indicates that each has the potential to embrace all of the core principles typically found in data protection laws. However, this case law currently falls short of data protection laws in terms of both ambit and prescriptory guidance.

1 Introduction

Catalogues of fundamental human rights and freedoms as set out in certain multilateral treaties provide much of the formal normative basis for law and policy on data protection. This is expressly recognized in many data protection laws themselves. For example, the main object of the Council of Europe's (CoE) Convention on data protection2 is 'to secure ... for every individual ... respect for his fundamental rights and freedoms, and in particular

1 BA (Hons), LLB (Hons) (Australian National University); Barrister of the Supreme Court of New South Wales; Research Fellow at the Norwegian Research Centre for Computers and Law, University of Oslo. Thanks go to Erik Boe for helpful comments on an earlier draft of this paper.
2 Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (ETS No 108), adopted 28.1.1981, in force 1.10.1985.



[...] supra n. 20. 36 Marckx, supra n. 31, para. 31; Airey, supra n. 33, para. 31. 37 Gaskin v United Kingdom (1989) Series A, No 160, discussed infra section 4.4. 38 X and Y v Netherlands (1985) Series A, No 91, para. 23.


organs."" This is in contrast to the General Comment by the Human Rights Committee on Art 17 of the ICCPR which, as shown above, clearly establishes that Art 17 necessitates protection of persons from interferences by private bodies' data-processing practices. It is extremely doubtful that the Courl or Commission would nol inLerpreLArl 8 as providing some measure of protection against the data-processing activities of private bodies, particularly given that these activities are regulated by most European data protection laws, including the CoE Convention on data protection. There is, in other words, firm evidence of common ground amongst CoE member states for applying data protection rules to private data controllers. There is also evidence from as far back as 1970 of some consensus amongst member states for holding that Art 8 ought to be read as affording protection from at least some private actors' data-processing activities. 41' Another matter, though, is that even if the Court were to hear a case of alleged abuse by private bodies, it would be likely to act extremely cautiously when determining the extent to which the state concerned has not fulfilled its positive obligation(s) to protect persons from such abuse. This is because the Court would run the risk of prompting state intervention in the private sphere which could in turn undermine the very interests that Art 8 or olher Convenlion provisions (eg, Arl 10) are inlended Lo safeguard. Accordingly, iL is probable Lhal Lhe Courl would accord slales a broad

39 In this respect, special note should be made of Winer v United Kingdom (1986) Appl 10871/84, 48 DR 154, in which the Commission refused to find that a private organization's publication of a book containing both true and false statements about the applicant's sexual activities amounted to a breach of Art 8. The Commission's refusal was based partly on the fact that English law provided the applicant with a remedy in defamation as far as the publication of the false statements was concerned [...] 'which there is no remedy at all on the facts. Nor does it address the situation where the interference with privacy takes the form of an intrusion (eg by electronic eavesdropping or photography) in search of information, in which case a limitation on freedom of expression or any other Convention right would not be directly involved': Harris et al, supra n. 27, 326. Moreover, the Winer decision has little bearing on how the Court or Commission might assess a range of other situations involving priv[...]
40 [...]sembly on 23.1.1970 ('The right to privacy afforded by Article 8 ... should not only protect an individual against interference by public authorities, but also against interference by private persons including mass media'). Note too that there seems to be broad support amongst academic commentators on the ECHR for construing Art 8 so that it covers data processing by private bodies: see, eg, van Dijk & van Hoof, Theory and Practice of the European Convention on Human Rights (Deventer/Boston: Kluwer Law and Taxation Publishers, 1990, 2nd ed), 372; Falck, Personvern som menneskerett. Den europeiske menneskerettighetskonvensjon artikkel 8 som skranke for innsamling, behandling og bruk av personopplysninger, Det juridiske fakultets skriftserie nr 56 (Bergen: University of Bergen, 1995), 24-25; Feldman, 'The Developing Scope of Article 8 of the European Convention on Human Rights' (1997) EHRLR, 265, 272; Clapham, Human Rights in the Private Sphere (Oxford: Clarendon Press, 1993), 214, 286. The latter work should be singled out for particular mention on account of its excellent, sustained argument that the ECHR generally ought to apply so as to protect victims of abuse from private bodies.


'margin of appreciation' in such a case, at least as a point of departure for its deliberations.41 Up until the present day, the bulk of case law concerning data protection pursuant to Art 8 has centred upon state authorities' processing of personal data. I present this case law below, first in the light of Art 8(1), then in the light of the exemptions in Art 8(2). This is followed by a presentation of case law on the rights of persons to gain access to, and rectify, data kept on them by public authorities.

4.2 Interference with respect to Article 8(1)

The case law makes clear that the surreptitious interception by state agencies of a person's communications constitutes an interference with the person's right under Art 8(1), and thus falls to be justified under Art 8(2). The leading cases here are Klass and Others v Germany42 and Malone v United Kingdom,43 each of which was occasioned by state agencies' secret tapping of persons' telephone calls. In both cases, the Court held these activities as contravening Art 8(1), though in Klass the Court went on to find the activities justified under Art 8(2). The more recent case of Halford v United Kingdom confirms that telephone calls need not be made in a domestic setting in order to qualify for protection under Art 8; also telephone calls made from business premises may be covered.44 The mere existence of laws and practices allowing state agencies to carry out secret surveillance of citizens may be sufficient to interfere with citizens' rights under Art 8(1).45 This considerably eases the burden on an applicant of showing that he/she has been the victim of an interference occasioned by surreptitious surveillance measures.46 A similar line has been taken in relation to scrutiny by prison authorities of prisoners' correspondence: the mere fact that prison rules allow for the opening and perusal of prisoners' correspondence may mean that a prisoner can claim to be a victim of interference with his/her right pursuant to Art 8(1).47 In the absence of laws and practices permitting surveillance, victim status will only be recognized

41 See further Clapham, supra n. 40, 220ff. 42 (1978) Series A, No 28. 43 (1984) Series A, No 82. 44 Supra n. 35, para. 44. 45 Klass, supra n. 42, paras. 34 & 41; Malone, supra n. 43, paras. 64 & 80. 46 According to Art 25 of the ECHR, a private party may only bring an action before the Strasbourg organs on the basis that he/she/it is a victim of a breach of the Convention. The conditions under which a person may claim victim status without having to prove victimization 'are to be determined in each case according to the Convention right or rights alleged to have been infringed, the secret character of the measures objected to, and the connection between the applicant and those measures': Klass, supra n. 42, para. 34. See also paras. 64 & 86 of the Malone judgement. 47 See Campbell v United Kingdom (1992) Series A, No 233, para. 33.


[...] forskning, Göteborgs Universitet, 1997), 179-197. 52 The Commission opinion in the case is also ambiguous on the issue, though it seems to treat both the secrecy element and the information content as necessary constituents of the interference(s) occasioned by the processing of the information. 53 See also Feldman, supra n. 40, 271 ('overt surveillance might interfere with the right to respect for private life in some circumstances ...').


need not be surreptitious in order to amount to an interference with respect to Art 8(1).54 As for the refusal of opportunity to challenge the information (ie, the refusal to lift the veil of secrecy vis-à-vis Leander), it is sometimes claimed this amounted to a separate (third) interference.55 The validity of this claim is, at the very least, questionable. The syntax of the above-cited statement would suggest that the Court viewed the refusal as simply aggravating the interferences occasioned by the storage and release of the information (note especially the wording 'which were coupled' as opposed to just 'coupled'). So too would subsequent case law.56 However, a later passage in the judgement could be read as indicating otherwise. At para 66, the Court stated:

The fact that the information released to the military authorities was not communicated to Mr. Leander cannot by itself warrant the conclusion that the interference was not 'necessary in a democratic society ...', as it is the very absence of such communication which, at least partly, ensures the efficacy of the personnel control procedure ....

Nevertheless, the apparent link between the term 'interference' and the 'fact' of non-communication of the information to Leander is far from tight, and the passages preceding this statement seem to link 'interference' exclusively to the storage and release of the information. Indeed, the Court's discussion of the denial of opportunity to challenge the information appears to arise only in relation to assessing whether or not such denial robbed the interference incurred by the information's storage and release of justification under Art 8(2).57 The Commission has held that 'a security check per se' will not amount to an interference with the right to respect for private life pursuant to Art 8(1); an interference will only occur 'when security checks are based on information about a person's private affairs'.58 Lustgarten and Leigh interpret the Commission's statement here as 'reject[ing] the assertion that compi[...]

54 See, eg, Silver and Others v United Kingdom (1983) Series A, No 61. For an overview of case law on prisoners' correspondence, see Jacobs & White, The European Convention on Human Rights (Oxford: Clarendon Press, 1996, 2nd ed), 197-204. 55 See, eg, Eggen, Vernet om ytringsfriheten etter art. 10 i Den europeiske menneskerettighetskonvensjonen (Oslo: Universitetsforlaget, 1994), 66 (n. 100) & 72; Schweizer, 'Europäisches Datenschutzrecht - was zu tun bleibt' (1989) Datenschutz und Datensicherung, 542, 545. 56 See especially Gaskin v United Kingdom (1989) Series A, No 160, para. 41, in which the Court describes the interferences in Leander as arising only from 'compiling, storing, using and disclosing private information about the applicant'; no mention is made of the refusal of opportunity to gain access to the information. However, the Gaskin judgement opens up the possibility for arguing that denial of access to the information violated a positive obligation on the Swedish state to ensure 'respect' for Leander's Art 8(1) right: see the discussion of the judgement (infra section 4.4) in connection with informational access and rectification rights. 57 See paras. 58ff. 58 Hilton, supra n. 48, 117.



demands for access to information that was only of personal character and only of importance for the data subjects. As several writers have argued, the Court may well reach a different decision if the information in question is of a matter of general public concern. Hence, if the case of Gaskin had involved a demand for access to information revealing alleged abuses of power by the social welfare authorities which affected more persons than just Gaskin himself, the Court might have allowed for a right of access pursuant to Art 10.140 The strength of this supposition is not weakened by the recent decision of the Court in Guerra.141 Here, the Court held that the state authorities' failure to take, of their own accord, steps to inform citizens about serious nearby environmental hazards did not amount to a violation of the citizens' right to information under Art 10.142 However, the Court did not thereby rule that the state authorities could have no duty, pursuant to Art 10(1), to give out such information upon request.143 As for a putative right for data subjects to rectify data registered on them, one of the very few cases in point is Chave née Jullien, in which the applicant sought erasure of the record of her illegal confinement in a psychiatric ward. Her action failed, with the Commission finding the continued storage of the disputed record to be in accordance with the law and necessary in a democratic society for the protection of health.144 Nevertheless, the Commission arguably recognized that Art 8(1) embodies, in the circumstances of the case, a prima facie claim for rectification/erasure of data. Other cases in point concern the refusal by state authorities to rectify official records, particularly birth certificates, so as to reflect accurately the changed sexual identities of transsexuals.145 In these cases, the Commission has consistently found such refusal to violate a transsexual's right to respect for private life under Art 8(1). In doing so, it has recognized Art 8(1) as protecting the interest of transsexuals in being able to determine for themselves their sexual identity, both in relation to themselves and to others. And it has recognized that this interest (described by Harris et al as one of 'self-identification'146) is significantly affected by the way in which transsexuals are represented in personal data registers.147

140 Eggen, supra n. 55, 67. See also Weber, 'Environmental Information and the European Convention on Human Rights' (1991) 12 HRLJ, 180; Loukaides, Essays on the Developing Law of Human Rights (Dordrecht/Boston/London: Martinus Nijhoff Publishers, 1995), 22. 141 Supra n. 131. 142 The Commission, by a narrow majority vote, took the opposite view: see Guerra and Others v Italy (1996) Appl 14967/89, reported at [1996] VII HRCD 878. 143 Indeed, seven judges indicated that they view Art 10 as embodying such a duty. 144 Chave née Jullien, supra n. 82, 156. 145 See, eg, Van Oosterwijck v Belgium (1979) B 36; Rees v United Kingdom (1986) Series A, No 106; Cossey v United Kingdom (1990) Series A, No 184; and B v France (1992) Series A, No 232-C. Note that the Court did not consider the merits of the Van Oosterwijck case as the applicant was found not to have exhausted all remedies available to him pursuant to domestic law. 146 Harris et al, supra n. 27, 307. 147 See especially Van Oosterwijck, supra n. 146, paras. 46 & 52.


The Court has been more conservative. In Rees and Cossey, the refusal by British authorities to alter the applicants' respective birth certificates to reflect their changed sexual identities was found by the Court not to breach Art 8. The Court held that to accede to the applicants' requests would require the United Kingdom to undertake extensive modification of its existing system of birth registration, creating problematic consequences for the rest of the population.148 The Court emphasized that states parties enjoy a wide margin of appreciation with regard to recognising the legal status of transsexuals, given the lack of 'common ground between the Contracting States in this area'.149 It noted also that transsexuals in the United Kingdom are free to change their forenames and surnames at will, and that their sexuality as registered in their birth certificates does not have to appear on many of the official documents with which they are issued.150 In B v France, however, the Court found for the applicant, distinguishing the case from those of Rees and Cossey on the grounds that the position of transsexuals in France was more difficult than in the UK and that the administrative changes necessary to accede to the applicant's request were not as major.151 In all three cases, the Court stressed that the issue in dispute concerned the extent of a state party's positive obligations flowing from the notion of 'respect' in Art 8(1), rather than the extent to which there had been 'interference' with Art 8(1) rights. For the Court, the 'mere refusal' of a state party to rectify the official records in question could not amount to interference.152 Accordingly, the Court did not find it strictly necessary to apply Art 8(2). As noted above in relation to the Gaskin judgment, the Court commented, nevertheless, that the aims listed in Art 8(2) could be of some relevance in striking a fair balance between the interests of the community and those of the individual.153 The majority decisions of the Court in Rees, Cossey and B v France seem simply to have characterized transsexuals' relevant interests in terms of avoidance of harm suffered when transsexuals are forced to disclose to others their transsexuality. The majority decisions have refrained from explicitly recognising transsexuals' interest in 'self-identification'. This interest is intimately connected with a more general interest in freely developing one's personality. When the latter interest has received con-

148 Rees, supra n. 146, paras. 42-44; Cossey, supra n. 146, para. 38. 149 Rees, supra n. 146, para. 37; Cossey, supra n. 146, para. 40. 150 Rees, supra n. 146, para. 40. 151 B v France, supra n. 146, paras. 49-63. 152 Rees, supra n. 146, para. 35; Cossey, supra n. 146, para. 36. The majority judgment of the Court in B v France appears also to have accepted this approach. Cf para. 3.4 of Judge Martens' dissenting judgment in Cossey ('it is at least questionable whether the Court rightly held ... that in the Rees case only the existence and the scope of the positive [...]

The application of such techniques to existing records, rather than to new transactions, is referred to here as file analysis: "The files are most useful where they enable the system quickly and unerringly to single out the minority of their clients who warrant some measure of social control" (Rule, quoted in [67]). File analysis can be effective in searching out what Marx and Reichman refer to as "low-visibility offenses" [35]. In a recent instance in the United Kingdom, government investigators applied file-analysis techniques to detect and prosecute multiple applications for shares in "privatized" government enterprises such as Telecom and British Petroleum. Screening, front-end verification, front-end audit, and file analysis may all be undertaken with varying degrees of sophistication. Transaction data may be compared against a formal standard or other norm, for example, highlighting those tax returns that include deductions above a certain value or show more than, say, eight dependents. The norms against which the data are compared may be either legal or other a priori norms that have been set down in advance by some authority, possibly for good reasons, possibly quite arbitrarily. Alternatively, they may be a posteriori norms that were inferred from analysis of the collection of records. Alternatively, transaction data may be compared against permanent data, for example, highlighting tax returns where the spouse's name does not match that on file. Or transaction data may be compared against other transaction data, for example, highlighting people whose successive tax returns show varying numbers of dependents. The previous examples are each based on a single factor. Judgments of any complexity must be based on multiple factors, rather than just one. Profiling, as it is commonly known, may be done on the basis of either a priori arbitrary or pragmatic norms, or on a posteriori norms based on empirical evidence. Rule noted in 1974 that the IRS used an a posteriori technique for predicting the "audit potential" of different returns. It did this by inferring unknown characteristics from known characteristics by applying discriminant analysis to a small random sample of returns [55, p. 282]. Marx and Reichman's description of this technique is "correlating a number of distinct data items in order to assess how close a person comes to a predetermined characterization or model of infraction" [35, p. 429]. These authors further distinguish "singular profiling" from "aggregative profiling," which involves analyzing transaction trails over a period of time. Sophisticated profiling techniques are claimed to hold great promise because they can detect hidden cases amid large populations. Benefits could be readily foreseen from profiles of young people with proclivities toward particular artistic and sporting skills; propensity for diseases, disorders, delinquency, or drug addiction; or suicidal or homicidal tendencies. A recent O.T.A. report noted that most U.S. federal agencies have applied the technique to develop a wide variety of profiles including drug dealers, taxpayers who underreport their income, likely violent offenders, arsonists, rapists, child molesters, and sexually exploited children [43, pp. 87-95].
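The distinction between a priori and a posteriori norms can be made concrete with a small sketch (Python; the rules, thresholds, field names and sample returns are invented for illustration, not drawn from any agency's actual criteria):

```python
# A priori screening: transactions compared against norms set down in advance.
A_PRIORI_RULES = [
    ("deductions above fixed value", lambda r: r["deductions"] > 10_000),
    ("more than eight dependents",   lambda r: r["dependents"] > 8),
]

def screen(record, rules):
    """Return the names of all rules a record trips."""
    return [name for name, test in rules if test(record)]

# A posteriori norm: inferred from the collection of records itself, e.g.
# flagging deductions far above the mean of all returns on file.
def a_posteriori_rule(records, factor=2.0):
    mean = sum(r["deductions"] for r in records) / len(records)
    return ("deductions above %.0f (derived)" % (factor * mean),
            lambda r: r["deductions"] > factor * mean)

returns = [
    {"id": 1, "deductions": 3_000,  "dependents": 2},
    {"id": 2, "deductions": 14_000, "dependents": 9},
    {"id": 3, "deductions": 4_000,  "dependents": 1},
]
rules = A_PRIORI_RULES + [a_posteriori_rule(returns)]
for r in returns:
    print(r["id"], screen(r, rules))
```

The a priori rules encode someone's prior judgment; the a posteriori rule is whatever the data happen to imply, which is precisely why, as the text notes, such norms can be set for good reasons or quite arbitrarily.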

Facilitative Mechanisms

Mass-dataveillance techniques may be successfully applied within a single personal-data system, but their power can be enhanced if they are applied to data from several. These systems might all be operated by the organization concerned or by a number of distinct organizations. In such cases a preliminary step may be undertaken:

Computer matching is the expropriation of data maintained by two or more personal-data systems, in order to merge previously separate data about large numbers of individuals.

Matching has become technically and economically feasible only during the last decade, as a result of developments in IT. The first large program reported was Project Match, undertaken by the U.S. Department of Health, Education and Welfare (HEW), now known as Health and Human Services (HHS). By 1982 it was estimated that about 500 programs were carried out routinely in U.S. state and federal agencies [69], and O.T.A. estimated a tripling in use between 1980 and 1984 [43, p. 37]. Moreover, a succession of federal laws, culminating in the 1984 Budget Deficit Reduction Act, imposed matching on state administrations as a condition of receiving federal social-welfare funding. (For references descriptive of and supportive of matching, see [25], [39]-[41], [65], and [66]. Cautionary and critical comments are to be found in [22], [23], [26], [35], [43], [50], and [61].) Matching makes more data available about each person and also enables comparison between apparently similar data items as they are known to different organizations. Rather than relating to a single specified person for a specific reason, matching achieves indiscriminate data cross-referencing about a large number of people for no better reason than a generalized suspicion: "Computer matches are inherently mass or class investigations, as they are conducted on a category of people rather than on specific individuals ... in practice, welfare recipients and Federal employees are most often the targets" [43, p. 40].

Matching may be based on some common identifier that occurs in both files, in which case the error rate (measured by the proportion of undetected matches and spurious matches) will tend to be fairly low. There are few opportunities for such matching, however, and instead it is usually necessary to correlate several items of information. Intuitively, name, birth date, and sex seem appropriate, but it appears that greater success has been achieved by using some component of address as a primary matching criterion.

Dangers of personal dataveillance
• Wrong identification
• Low data quality
• Acontextual use of data
• Low-quality decisions
• Lack of subject knowledge of data flows
• Lack of subject consent to data flows
• Blacklisting
• Denial of redemption

Dangers of mass dataveillance
(1) To the individual
• Arbitrariness
• Acontextual data merger
• Complexity and incomprehensibility of data
• Witch hunts
• Ex ante discrimination and guilt prediction
• Selective advertising
• Inversion of the onus of proof
• Covert operations
• Unknown accusations and accusers
• Denial of due process

(2) To society
• Prevailing climate of suspicion
• Adversarial relationships
• Focus of law enforcement on easily detectable and provable offenses
• Inequitable application of the law
• Decreased respect for the law
• Reduction in the meaningfulness of individual actions
• Reduction in self-reliance and self-determination
• Stultification of originality
• Increased tendency to opt out of the official level of society
• Weakening of society's moral fiber and cohesion
• Destabilization of the strategic balance of power
• Repressive potential for a totalitarian government

FIGURE 3. Real and Potential Dangers of Dataveillance

Individuals may be judged to be interesting because of
• the existence of a match where none should exist,
• the failure to find a match where one was expected,
• inequality between apparently common data items (e.g., different numbers of dependents), or
• logical inconsistency among the data on the two files (e.g., the drawing of social-welfare benefits during a period of employment).
Curiously, the current U.S. government matching guidelines [40] define matching only in terms of the first of these criteria.
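A toy sketch of the matching step and of several of the "hit" conditions just listed (Python; the two files, the composite key and all records are fabricated, and real matching criteria, address components included, are far messier):

```python
# Two hypothetical personal-data systems, keyed here on a crude composite
# of surname, birth date and a component of the address (which, as noted
# above, has proved a more reliable criterion than name alone).
tax = {
    ("meier", "1950-02-01", "hauptstr"): {"dependents": 2, "employed": True},
    ("kranz", "1961-07-11", "ringweg"):  {"dependents": 0, "employed": True},
}
welfare = {
    ("meier", "1950-02-01", "hauptstr"): {"dependents": 4, "benefits": True},
    ("weiss", "1948-03-30", "seegasse"): {"dependents": 1, "benefits": True},
}

for key in tax.keys() | welfare.keys():
    t, w = tax.get(key), welfare.get(key)
    if t and w:
        if t["dependents"] != w["dependents"]:
            print(key, "hit: unequal common data item (dependents)")
        if t["employed"] and w["benefits"]:
            print(key, "hit: logical inconsistency (benefits while employed)")
    else:
        print(key, "no match found in the other file")
```

Note that the "benefits while employed" flag illustrates exactly the misinterpretation risk discussed below: many allowances are lawfully independent of employment income.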

An additional facilitative mechanism for both personal and mass dataveillance is referred to here as data concentration. The conventional approach is to merge existing organizations in search of economies of scale in administration. If the capabilities of large-scale data-processing equipment were to continue to increase, the merger of social-welfare and internal-revenue agencies could be anticipated, enabling welfare to be administered as "reverse taxation." Organizational merger is an old-fashioned "centralized" solution. The modern, dispersed approach to data concentration is to establish systems that can function as the hub of a data-interchange network. For example, the U.S. government has developed new systems to facilitate routine front-end verification. These initiatives, in the name of waste reduction, involve both federal government sources (including IRS and criminal records) and private-sector credit bureaus, and their use not only extends across many federal government agencies, but is also imposed on state welfare administration agencies [14; 42, pp. 68-74]. The Australian government's proposal for a national identification scheme involved just such a coordinating database [9].

DATAVEILLANCE'S BENEFITS AND DANGERS

In this section the advantages dataveillance techniques offer are briefly discussed. Greater space is then devoted to the threats dataveillance represents. Figure 3 summarizes these dangers.

Benefits

Significant benefits can result from dataveillance. The physical security of people and property may be protected, and financial benefits may accrue from the detection and prevention of various forms of error, abuse, and fraud. Benefits can be foreseen both in government activity (e.g., tax and social welfare) and in the private sector (e.g., finance and insurance). (For the limited literature on the benefits of matching, see [16], [25], [43, pp. 50-52], [65], and [66]. Literature on the benefits of other dataveillance techniques is very difficult to find.) Some proponents claim that the deterrent effect of public knowledge that such techniques are applied is significant, perhaps even more significant than direct gains from their actual use. There may be symbolic or moral value in dataveillance, irrespective of its technical effectiveness.


Few people would contest the morality of an organization applying the more basic techniques, for example, record integration and screening. Some would go so far as to regard organizations that did not apply modern IT in such ways as failing to fulfill their responsibilities to taxpayers and shareholders. Nevertheless, dataveillance is, by its very nature, intrusive and threatening. It therefore seems reasonable that organizations should have to justify its use, rather than merely assuming its appropriateness.

Dangers of Personal Dataveillance

Because so few contemporary identification schemes use a physiological identifier, they are, at best, of moderate integrity. Rather than individuals themselves, what is monitored is data that purport to relate to them. As a result there is a significant likelihood of wrong identification. The vast majority of data-systems operators are quite casual about the quality of most of their data; for example, the O.T.A. reported that few federal government agencies have conducted audits of data quality [43, p. 26]. For many organizations it is cost-effective to ensure high levels of accuracy only of particular items (such as invoice amounts), with broad internal controls designed to ensure a reasonable chance of detecting errors in less vital data. Some errors are intentional on the part of the data subject, but many are accidental, and some are a result of design deficiencies such as inadequate coding schemes. Similar problems arise with other elements of data quality such as the timeliness and completeness of data. Even in systems where a high level of integrity is important, empirical studies have raised serious doubts [30; 43, pp. 52-53]. Data quality is generally not high, and while externally imposed controls remain very limited, it seems likely that the low standards will persist. People and matters relating to them are complicated, and organizations generally have difficulty dealing with atypical, idiosyncratic cases or extenuating circumstances [35, p. 436]. A full understanding of the circumstances generally requires additional data that would have seemed too trivial and/or expensive to collect, but also depends on common sense, and abstract ideas like received wisdom, public opinion, and morality [54]. When the data are used in their original context, data quality may be sufficient to support effective and fair decision making, but when data are used outside their original context, the probability of misinterpreting them increases greatly. This is the reason why information privacy principles place such importance on relating data to the purpose for which they are collected or used [44], and why sociologists express concern about the "acontextual" nature of many administrative decision processes [35].
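Two of the data-quality elements mentioned above, completeness and timeliness, lend themselves to a simple audit sketch (Python; the records, field names, thresholds and dates are invented for illustration):

```python
from datetime import date

# Toy audit of two data-quality elements: completeness and timeliness.
records = [
    {"name": "A. Smith", "address": "12 High St", "updated": date(2016, 5, 1)},
    {"name": "B. Jones", "address": None,         "updated": date(2009, 1, 9)},
]

def quality(recs, stale_after_years=5, today=date(2017, 4, 21)):
    # completeness: no field missing; timeliness: updated recently enough
    complete = sum(all(v is not None for v in r.values()) for r in recs)
    fresh = sum((today - r["updated"]).days < stale_after_years * 365
                for r in recs)
    return complete / len(recs), fresh / len(recs)

completeness, timeliness = quality(records)
print(f"complete: {completeness:.0%}, fresh: {timeliness:.0%}")
```

Even this crude audit is more than, per the O.T.A. finding quoted above, most agencies routinely perform.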


Much front-end verification is undertaken without the subject's knowledge. Even where an organization publicizes that it seeks data from third parties, the implications of the notice are often unclear to the data subject. International conventions stipulate that data should not be used for purposes other than the original purpose of collection, except with the authority of law or the consent of the data subject (e.g., [44]). Where consent is sought, the wording is often such that the person has no appreciation of the import of the consent that is being given, or the bargaining position is so disproportionately weighted in favor of the organization that the data subject has no real option but to comply. Effective subject knowledge and consent mechanisms are necessary, both as a means of improving data quality, and to avoid unnecessary distrust between individuals and organizations. Front-end audit and cross-system enforcement give rise to additional concerns. Their moral justification is not obvious, and they create the danger of individuals being effectively blacklisted across a variety of organizations. Credit-bureau operations are extending in some countries into insurance, employment, and tenancy. Acute unfairness can arise, for example, when organizations blacklist a person over a matter that is still in dispute. It is particularly problematic where the person is unaware that the (possibly erroneous, incomplete, or out-of-date) data have been disseminated. Finally, even where individuals have brought the problems upon themselves, blacklisting tends to deny them the ability to redeem themselves for past misdemeanors.

Dangers of Mass Dataveillance to the Individual

Mass dataveillance embodies far greater threats. In respect of each individual, mass surveillance is clearly an arbitrary action, because no prior suspicion existed. The analogy may be drawn with the powers of police officers to interfere with the individual's quiet enjoyment. If a police officer has grounds for suspecting that a person has committed, or even is likely to commit, an offense, then the police officer generally has the power to intercept and perhaps detain that person. Otherwise, with rare and, in a democratic state, well-justified exceptions, such as national security emergencies and, in many jurisdictions, random breath testing, even a police officer does not have the power to arbitrarily interfere with a person. With mass dataveillance, the fundamental problems of wrong identification; unclear, inconsistent, and context-dependent meaning of data; and low data quality are more intense than with personal dataveillance. Data arising from computer matching are especially problematic. Where there is no common identifier, the proportion of spurious matches (type (1) errors) and undetected matches (type (2) errors) can be very high. The causes include low quality of the data upon which computer matching depends (variants, misspellings, and other inaccuracies, and incompleteness), inappropriate matching criteria, widely different (or subtly but significantly different) meanings of apparently equivalent data items, or records with differing dates of applicability. Marx and Reichman report a New York State program
in which half of the matches were spurious due to timing problems alone [35, p. 435]. In addition, the meaning of the record as a whole must be properly understood. Although it might seem improper for a person to be both in employment and in receipt of a social-welfare benefit, many pensions and allowances are, in law, either independent of, or only partially dependent on, income from other sources. Data on the error rates of matching programs are difficult to find: They are mostly conducted away from the glare of public, or indeed any other kind of, supervision. In an incident in Australia in 1986, the federal agency responsible for the Medicare scheme calmly, and without apparent legal authority, expropriated and merged data from several federal government agencies, relating to all inhabitants of the small island state of Tasmania. The agency reported the 70 percent hit rate across the databases as a good result, confirming its belief that a national identification scheme could be based on such a procedure. They ignored the implication that across the national population the records of nearly five million persons would remain unmatched, and failed to apply any tests to establish what proportion of the 70 percent were spurious matches and what proportion of the 30 percent nonmatches were failures of the algorithm used. Australians embrace a popular mythology that everyone in Tasmania is related to everyone else. For this reason alone, the agency might have been expected to recognize the need for such testing. The complexities of each system (particularly a country's major data systems such as taxation and social welfare) are such that few specialists are able to comprehend any one of them fully. It is arguably beyond the bounds of human capability to appreciate the incompatibilities between data from different systems and to deal with the merged data with appropriate care. Computer matching, therefore, should never be undertaken without the greatest caution and skepticism.

Profiling makes a judgment "about a particular individual based on the past behavior of other individuals who appear statistically similar" [43, p. 88]. Statistical techniques such as multivariate correlation and discriminant analysis have limited domains of applicability that are often poorly understood or ignored. Even if the statistical procedures are properly applied, a profile needs to be justified by systemic reasoning. In the hands of the inadequately trained, insufficiently professional, or excessively enthusiastic or pressured, profiling has all the hallmarks of a modern witch-hunting tool. Profiling is not restricted to retrospective investigation. It purports to offer the possibility of detecting undesirable classes of people before they commit an offense. O.T.A. documents a "pre-delinquency" profile developed for the U.S. Law Enforcement Assistance Administration [43, p. 90]. Even if the technique is perceived to be successful, its use seems to run counter to some fundamental tenets of contemporary society. It is unclear on what moral and, indeed, legal grounds profiling may be used to reach administrative determinations about individuals or discriminate between individuals. Such vague constraints may not be sufficient to stultify an attractive growth industry. With computer displays and printouts lending their (largely spurious) authority to such accusations, how will the tolerance needed in complex social environments be maintained? Not only in government, but also in the private sector, dangers arise from both the effectiveness and ineffectiveness of profiling. The combination of consumer profiles with cheap desktop publishing is dramatically altering the cost-effectiveness of customized "mail shots." Applied to cable television, the technique will enable the operator to selectively transmit "commercials" to those subscribers who seem most likely to be susceptible to the client's product (or perhaps just the advertisement). Whereas Vance Packard could only prophesy the development of such technology [47], the components can now be identified, and the economics described. Conventional justice is expensive and slow. Some procedures are now being structured, particularly in such areas as taxation, such that a government agency makes a determination, and individuals who disagree must contest the decision [61]. This inversion of the onus of proof exacerbates the problems of misinterpretation resulting from data merger, and uncertainty arising from correlative profiling. It is further compounded by the imbalance of power between organization and individual. Marx and Reichman provide an example in which individuals were confronted by a complex of difficulties: A remote examination authority statistically analyzed answer sheets, and threatened students who had sat in the same room and given similar (incorrect) answers with cancellation of their results unless they provided additional information to prove they did not cheat [35, p. 432]. Some dataveillance is undertaken with dubious legal authority or in the absence of either authority or prohibition. To avoid being subjected to public abuse and perhaps being denied the right to undertake the activity, it is natural for organizations to prefer to undertake some operations covertly. There are also cases where the benefits of surveillance may be lost if it is not undertaken surreptitiously (e.g., because of the likelihood of the person temporarily suspending, rather than stopping, undesirable activities; or of "skips" on consumer credit transactions). To protect the mechanism or the source, an individual may not be told that dataveillance has been undertaken, the source of the accusation, the information on which the accusation is based, or even what the accusation is. Such situations are repugnant to the concept of due process long embodied in British law and in legal systems derived from it. Dataveillance tends to compromise the individual's capacity to defend him- or herself or to prosecute his or her innocence. In its most extreme form, one Kafka could not anticipate, the accuser could be a poorly understood computer program or a profile embodied in one.
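The two error types can be stated precisely in a few lines (Python; the proposed matches and the ground-truth links are invented, and, as the text stresses, such ground truth is normally unavailable, which is precisely the problem):

```python
# Hypothetical match output vs. (normally unavailable) ground truth.
# Each pair links a record in system A to a record in system B.
proposed = {("a1", "b1"), ("a2", "b9"), ("a3", "b3")}   # what the match found
true_links = {("a1", "b1"), ("a3", "b3"), ("a4", "b4")} # who really is who

spurious = proposed - true_links     # type (1): matched, but not the same person
undetected = true_links - proposed   # type (2): same person, never matched

print(f"spurious matches:   {len(spurious)}/{len(proposed)}")
print(f"undetected matches: {len(undetected)}/{len(true_links)}")
```

Without labeled ground truth, neither quantity can be measured, which is exactly the test the Medicare agency in the Tasmanian incident never performed.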


Social Dangers of Mass Dataveillance

At the social level, additional problems arise. With personal dataveillance, investigation and monitoring normally take place after reasonable grounds for suspicion have arisen. Mass surveillance dispenses with that constraint because the investigation is routinely performed and the suspicion arises from it. The organization therefore commences with a presumption of guilt on the part of at least some of the data subjects, although at the beginning of the exercise it is unknown which ones. The result is a prevailing climate of suspicion. The organizational functionary who communicates with the data subject often only partially understands the rationale underlying the decision, prefers not to admit that lack of understanding, and is often more concerned with case resolution than with public relations. Hence, there is an increased tendency for organizations and data subjects to develop adversarial relationships. Moreover, since organizations generally have the information, the size and the longevity, the bargaining positions are usually unequal. Some of the "atypical, idiosyncratic, and extenuating cases" that are uncovered by mass dataveillance are precisely the deviants who are being sought. But others are just genuinely different, and such people tend to have difficulties convincing bureaucrats that their idiosyncrasies should be tolerated. Dataveillance encourages investigators to focus on minor offenses that can be dealt with efficiently, rather than more important crimes that are more difficult to solve. Law enforcers risk gaining a reputation for placing higher priority on pursuing amateur and occasional violators (particularly those whose records are readily accessible, like government employees and welfare recipients), rather than systematic, repetitive, and skilled professional criminals. The less equitably the law is perceived to be enforced, the greater the threat to the rule of law. An administrative apparatus that has data available to it from a wide variety of sources tends to make decisions on the person's behalf. Hence, a further, more abstract, yet scarcely less real impact of dataveillance is reduction in the meaningfulness of individual actions, and hence in self-reliance and self-responsibility. Although this may be efficient and even fair, it involves a change in mankind's image of itself, and risks sullen acceptance by the masses and stultification of the independent spirit needed to meet the challenges of the future. Some people already opt out of official society, preferring bureaucratic anonymity even at the costs of foregoing monetary and other benefits, and, consequently, attracting harassment by officialdom. There may already be a tendency toward two-tiered societies, in which the official documentary level of government facts and statistics bears only an approximate relationship to the real world of economic and social activity. If uncontrolled dataveillance were to cause the citizens of advanced Western nations to lose confidence in the fairness with which their societies are governed, it would be likely to exacerbate that trend.


An increase in the proportion of economic activity outside mainstream society would prompt, and be used to justify, a further increase in the use of mass surveillance. Assuming that world politics continues to be polarized into an East-West confrontation, it would be very easy to justify tighter social controls, since any sign of serious weakening in the moral fiber and integrity of the West would be destabilizing. Since "mastery of both mass communications and mass surveillance is necessary for an elite to maintain control" [57, p. 176], IT will be a major weapon whereby ruling groups seek to exercise control over the population.

Finally, it is necessary to mention (but not overdramatize) the risk of dataveillance tools supporting repressive actions by a future invader, or by the "dirty-tricks department" of some democratically elected government gone, as Hitler's did, somewhat off the rails: "Orwell foresaw, and made unforgettable, a world in which ruthless political interests mobilized intrusive technologies for totalitarian ends. What he did not consider was the possibility that the development of the intrusive technologies would occur on its own, without the spur of totalitarian intent. This, in fact, is what is now happening" [57, p. 179]. In general, mass dataveillance tends to subvert individualism and the meaningfulness of human decisions and actions, and asserts the primacy of the state.

SAFEGUARDS

Intrinsic Controls over Dataveillance

Some natural controls exist that tend to limit the amount of dataveillance undertaken. The most apparent of these is its expense. There have been claims of dramatic success for matching schemes, but these have generally been made by the agencies that conducted them, and independent audits are hard to come by. The U.S. government's original (1979) guidelines on matching required that cost/benefit analyses be undertaken before a program was commenced [39]. However, there are many difficulties in undertaking a cost/benefit analysis of such a program. Many benefits are vague and unquantifiable, and many expenses are hidden or already "sunk." As a result, the requirement was rescinded in 1982 and has not been reimposed [40]. Moreover, there is seldom any other legal or even professional requirement that a cost/benefit analysis be performed [61, p. 540]. In 1985 the O.T.A. concluded that few U.S. government programs are subjected to prior cost/benefit assessment [43, pp. 50-52].

Although reliable audits are difficult to find, anecdotal evidence throws doubt on the efficacy of matching. In the original Project Match, HEW ran its welfare files against its own payroll files. The 33,000 raw hits that were revealed required a year's investigation before they could be narrowed to 638 cases, but only 55 of these were ever prosecuted. Of a sample of 15 cases investigated by the National Council for Civil Liberties after HEW released the names of the people involved, 5 were dismissed, 4 pleaded guilty to misdemeanors


(theft under $50), and only 6 were convicted of felonies. No prison sentences resulted, and the fines totaled under $2,000 [14, 49]. A 1983 match between Massachusetts welfare and bank files found 6,500 hits in five million records, resulting in 420 terminations of benefits, but also much confusion and recrimination [50]. Recent U.S. government reports have also raised doubts about the economic worth of many matching programs. A more positive report on several local government systems is to be found in [16]. There is very little evidence concerning the economics of other dataveillance techniques. Effective cost/benefit assessment appears to be very rare (e.g., [43, pp. 80-81]). Unless credible cost/benefit analyses are undertaken, at least retrospectively and preferably in advance, the potential economic safeguard against excessive use of dataveillance cannot be realized.

Economic controls, even if they were effective, may not be sufficient to protect individual freedoms. In the early years of personal-data systems, the dominant school of thought, associated with Westin, was that business and government economics would ensure that IT did not result in excessive privacy invasion [74-76]. This view has been seriously undermined by Rule's work, which has demonstrated that, rather than supporting individual freedoms, administrative efficiency conflicts with them. Organizations have perceived their interests to dictate the collection, maintenance, and dissemination of ever more data, ever more finely grained. This is in direct contradiction to the interests of individuals in protecting personal data [55-59]. Meanwhile, the onward march of IT continues to decrease the costs of dataveillance.

Another natural control is that surveillance activities can incur the active displeasure of the data subject or the general public. Given the imbalance of power between organizations and individuals, it is unrealistic to expect this factor to have any relevance outside occasional matters that attract media attention. Another, probably more significant, control is that an organization's activities may incur the displeasure of some other organization, perhaps a watchdog agency, consumer group, or competitor. In any case, these natural controls cannot be effective where the surveillance activities are undertaken in a covert manner. Intelligence agencies in particular are subject to few and generally ineffective controls. Also, many controls, such as the power to authorize telephone interception, may not be subject to superordinate control. Intrinsic controls over dataveillance are insufficient to ensure that the desirable balance is found.
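The Project Match figures quoted earlier in this subsection already permit the kind of retrospective arithmetic the text calls for. A minimal sketch, using only the numbers quoted in this article; the percentage "yield" framing is an illustration, not Clarke's:

    # Retrospective yield of the original Project Match, from the figures above.
    raw_hits = 33_000       # raw hits from matching welfare files against payroll files
    investigated = 638      # cases left after a year's investigation
    prosecuted = 55         # cases ever prosecuted
    felonies = 6            # felony convictions (no prison sentences, fines under $2,000)

    for label, n in [("cases surviving investigation", investigated),
                     ("prosecutions", prosecuted),
                     ("felony convictions", felonies)]:
        print(f"{label}: {n} ({n / raw_hits:.2%} of raw hits)")
    # Under 2% of raw hits survived investigation, and fewer than 0.02% led to
    # a felony conviction, which is why credible audits matter.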

Extrinsic Controls over Dataveillance

The establishment of extrinsic controls over dataveillance cannot even be embarked upon until comprehensive information privacy laws are in place. Proper protection against privacy-invasive data handling was stillborn in the United States in the early 1970s by the limited official response associated with Westin [74-76], the Privacy Act of 1974, and the PPSC report [48]. Westin found no problems with extensive surveillance systems as such, only with the procedures involved, and the PPSC's aim was to make surveillance as publicly acceptable as possible, consistent with its expansion and efficiency [59, pp. 75, 110]. The U.S. Privacy Act was very easily subverted. Publication of uses in the Federal Register has proved to be an exercise in bureaucracy rather than control. The "routine use" loophole in the act was used to legitimize virtually any use within each agency (by declaring the



efficient operation of the agency to be a routine use) and then virtually any dissemination to any other federal agency (by declaring as a routine use the efficient operation of the federal government) (see [35, p. 449; 43, pp. 16-21; 48]). Rule's thesis [55-59], that privacy legislation arose out of a concern to ensure that the efficiency of business and government was not hindered, has been confirmed by developments in international organizations and on both sides of the Atlantic. The OECD's 1980 Guidelines for the Protection of Privacy were quite explicitly motivated by the economic need for freedom of transborder data flows [44]. In the United States, the President's Council for Integrity and Efficiency (PCIE) and the Office of Management and Budget (OMB) have worked not to limit matching, but to legitimize it [25]. In the United Kingdom, the Data Protection Act of 1984 was enacted explicitly to ensure that U.K. companies were not disadvantaged with respect to their European competitors.

There have been almost no personal-data systems, or even uses of systems, that have been banned outright. Shattuck [61, p. 540] reported that, during the first five years, the OMB's cavalier interpretation of the Privacy Act had resulted in not a single matching program being disapproved. Few sets of Information Privacy Principles appear even to contemplate such an extreme action as disallowing some applications of IT because of their excessively privacy-invasive nature. Exceptions include those of the New South Wales Privacy Committee [38], which are not legally enforceable, and, with qualifications, Sweden. This contrasts starkly with the conclusions of observers: "At some point ... the repressive potential of even the most humane systems must


make them unacceptable" [59, p. 120]; and "We need to recognize that the potential for harm from certain surveillance systems may be so great that the risks outweigh their benefits" [33, p. 48].

Some countries, such as Australia, have no information privacy legislation, and only incidental protections exist, such as breach of confidence, telephonic interception, trespass, and official secrecy [18]. In jurisdictions where information privacy safeguards do exist, they are piecemeal, restricted in scope, and difficult to enforce. In particular, many countries restrict the protections to government data or computer-based systems, or make no provision for such conventional safeguards as detailed codes of practice, oversight by an adequately resourced and legally competent authority, or the right to sue for damages. Moreover, technological developments have rendered some information privacy protections ineffective. For example, the O.T.A. concluded that "the Privacy Act ... offers little protection to individuals who are subjects of computer matching" [43, p. 38] (see also [17] and [63]).

Avenues of Change

Only once the principles of fair information practices have been engrained into our institutions and our ways of thought will it be possible to address the more complex, subtle, and pervasive threats inherent in contemporary IT. In some countries the courts have absolved themselves of responsibility to change the law for policy reasons, unequivocally asserting not just Parliament's primacy in, but its exclusive responsibility for, law reform. In the United States, although the Bill of Rights does not mention a right to privacy, the courts have progressively established such a right based on elements of several of the amendments. The likely present view of the U.S. Supreme Court, however, might be indicated by this quotation: "I think it quite likely that self-discipline on the part of the executive branch will provide an answer to virtually all of the legitimate complaints against excesses of information-gathering" (Rehnquist, 1971, then a spokesperson for the Justice Department, now Chief Justice, quoted in [59, p. 147]). Moreover, courts throughout the world have difficulty with cases involving recent developments in technology [10, 72]. Accordingly, they prefer to await statutory guidance from parliaments, with their generally better-financed and less-fettered access to technological know-how.

However, parliaments also tend toward inaction on difficult technological matters, particularly when they are proclaimed to be the salvation of the domestic economy or are tangled up with moral issues, such as "dole cheating" and "welfare fraud." Consumer-protection laws in many countries have still to be adapted to cater for the now well-developed EFTS. Although the early literature on EFTS omits mention of its social impact, testimony was given before U.S. Senate subcommittees at least as early as 1975 on the repressive potentials of computerized payment systems


[59, p. 115] (see also [24] and [55]). The call for protection was still necessary in 1984 in the United States [77] and in 1986 in Australia [3]. Parliaments in some countries, such as Australia, look less like sober lawmaking institutions than gladiatorial arenas. There are serious difficulties in convincing such legislatures to constrain the development of new "wonder technologies." The conclusion is inescapable that the populations of at least some of the advanced Western nations are severely threatened by unbridled, IT-driven dataveillance.

POLICY PROPOSALS

New and Improved Safeguards

Since its brief period in the sun in the early 1970s, privacy has become unfashionable among lawmakers, and the momentum that the fair information practices/data protection/information privacy movement once had has been lost. The PPSC concluded that "the real danger is the gradual erosion of individual liberties through the automation, integration and interconnection of many small, separate record-keeping systems ..."

REFERENCES

16. ... (Winter 1985), 13-20.
17. Greenleaf, G.W., and Clarke, R.A. Database retrieval technology and subject access principles. Aust. Comput. J. 16, 1 (Feb. 1984), 27-32.
18. Greenleaf, G.W., and Clarke, R.A. Aspects of the Australian Law Reform Commission's information privacy proposals. J. Law and Inf. Sci. 2, 1 (Aug. 1986), 83-110.
19. Gross, M.L. The Brain Watchers. Signet, 1963.
20. Hoffman, L.J., Ed. Computers and Privacy in the Next Decade. Academic Press, New York, 1980.
21. Huxley, A. Brave New World. Penguin Books, New York, 1975 (originally published in 1932).
22. Kircher, J. A history of computer matching in federal government programs. Computerworld (Dec. 14, 1981).
23. Kling, R. Automated welfare client-tracking and service integration: The political economy of computing. Commun. ACM 21, 6 (June 1978), 484-493.

24. Kling, R. Value conflicts and social choice in electronic funds transfer system developments. Commun. ACM 21, 8 (Aug. 1978), 642-657.
25. Kusserow, R.P. The government needs computer matching to root out waste and fraud. Commun. ACM 27, 6 (June 1984), 542-545.
26. Langan, K.J. Computer matching programs: A threat to privacy? Columbia J. Law Soc. Probl. 15, 2 (1979).
27. Laudon, K.C. Computers and Bureaucratic Reform. Wiley, New York, 1974.
28. Laudon, K.C. Complexity in large federal databanks. Soc./Trans. (May 1979).
29. Laudon, K.C. Problems of accountability in federal databanks. In Proceedings of the American Association for the Advancement of Science (May). American Association for the Advancement of Science, 1979.
30. Laudon, K.C. Data quality and due process in large interorganizational record systems. Commun. ACM 29, 1 (Jan. 1986), 4-11.
31. Laudon, K.C. Dossier Society: Value Choices in the Design of National Information Systems. Columbia University Press, New York, 1986.
32. Long, E.V. The Intruders. Praeger, New York, 1967.
33. Marx, G.T. The new surveillance. Technol. Rev. (May-June 1985).
34. Marx, G.T. I'll be watching you: Reflections on the new surveillance. Dissent (Winter 1985).
35. Marx, G.T., and Reichman, N. Routinising the discovery of secrets. Am. Behav. Sci. 27, 4 (Mar.-Apr. 1984), 423-452.
36. Miller, A.R. The Assault on Privacy. Mentor, 1972.
37. Neier, A. Dossier. Stein and Day, 1974.
38. New South Wales Privacy Committee. Guidelines for the Operation of Personal Data Systems. NSWPC, Sydney, Australia, 1977.
39. Office of Management and Budget. Guidelines to Agencies on Conducting Automated Matching Programs. OMB, Mar. 1979.
40. Office of Management and Budget. Computer Matching Guidelines. OMB, May 1982.
41. Office of Management and Budget, President's Council for Integrity and Efficiency. Model Control System for Conducting Computer Matching Projects Involving Individual Privacy Data. OMB/PCIE, 1983.
42. Office of Technology Assessment. Federal government information technology: Electronic surveillance and civil liberties. OTA-CIT-293, U.S. Congress, Washington, D.C., Oct. 1985.
43. Office of Technology Assessment. Federal government information technology: Electronic record systems and individual privacy. OTA-CIT-296, U.S. Congress, Washington, D.C., June 1986.
44. Organisation for Economic Cooperation and Development. Guidelines for the Protection of Privacy and Transborder Flows of Personal Data. OECD, Paris, France, 1980.
45. Orwell, G. 1984. Penguin Books, New York, 1972 (originally published in 1948).
46. Oxford Dictionary. Vol. X, 1933, p. 248.
47. Packard, V. The Naked Society. McKay, New York, 1964.


48. Privacy Protection Study Commission. Personal Privacy in an Information Society. U.S. Government Printing Office, Washington, D.C., July 1977.
49. Raines, J.C. Attack on Privacy. Judson Press, 1974.
50. Reichman, N., and Marx, G.T. Generating organisational disputes: The impact of computerization. In Proceedings of the Law and Society Association Conference (San Diego, Calif., June 6-8). Law and Society Association, 1985.
51. Rodota, S. Privacy and data surveillance: Growing public concern. Inf. Stud. 10, OECD, Paris, France, 1976.
52. Rosenberg, J.M. The Death of Privacy. Random House, 1969.
53. Rosenberg, R.S. Computers and the Information Society. Wiley, New York, 1986.
54. Roszak, T. The Cult of Information. Pantheon, 1986.
55. Rule, J.B. Private Lives and Public Surveillance: Social Control in the Computer Age. Schocken Books, 1974.
56. Rule, J.B. Value Choices in E.F.T.S. Office of Telecommunications Policy, Washington, D.C., 1975.
57. Rule, J.B. 1984: The ingredients of totalitarianism. In 1984 Revisited: Totalitarianism in Our Century. Harper and Row, New York, 1983, pp. 166-179.
58. Rule, J.B. Documentary identification and mass surveillance in the United States. Soc. Probl. 31, 222 (1983).
59. Rule, J.B., McAdam, D., Stearns, L., and Uglow, D. The Politics of Privacy. New American Library, 1980.
60. Russell, B. Authority and the Individual. George Allen and Unwin, 1949.
61. Shattuck, J. Computer matching is a serious threat to individual rights. Commun. ACM 27, 6 (June 1984), 538-541.
62. Stone, M.G. Computer Privacy. Anbar, 1968.
63. Thom, J., and Thorne, P. Privacy legislation and the right of access. Aust. Comput. J. 15, 4 (Nov. 1983), 145-150.
64. Thompson, A.A. A Big Brother in Britain Today. Michael Joseph, 1970.
65. U.S. Dept. of Health and Human Services. Computer Matching in State Administered Benefit Programs: A Manager's Guide to Decision-Making. HHS, Washington, D.C., 1983.
66. U.S. Dept. of Health and Human Services. Computer Matching in State Administered Benefit Programs. HHS, Washington, D.C., June 1984.

67. U.S. Dept. of Health, Education and Welfare, Secretary's Advisory Committee on Automated Personal Data Systems. Records, Computers and the Rights of Citizens. MIT Press, Cambridge, Mass., 1973.
68. U.S. Federal Advisory Committee on False Identification. The Criminal Use of False Identification. FACFI, Washington, D.C., 1976.
69. U.S. Senate. Oversight of Computer Matching to Detect Fraud and Mismanagement in Government Programs. U.S. Senate, Washington, D.C., 1982.
70. Warner, M., and Stone, M. The Data Bank Society: Organisations, Computers and Social Freedom. George Allen and Unwin, 1970.
71. Webster's 3rd Edition, 1976, p. 2302.
72. Weeramantry, C.G. The Slumbering Sentinels: Law and Human Rights in the Wake of Technology. Penguin Books, New York, 1983.
73. Wessell, M.R. Freedom's Edge: The Computer Threat to Society. Addison-Wesley, Reading, Mass., 1974.
74. Westin, A.F. Privacy and Freedom. Atheneum, New York, 1967.
75. Westin, A.F., Ed. Information Technology in a Democracy. Harvard University Press, Cambridge, Mass., 1971.
76. Westin, A.F., and Baker, M. Databanks in a Free Society. Quadrangle, New York, 1974.
77. Yestingsmeier, J. Electronic funds transfer systems: The continuing need for privacy legislation. Comput. Soc. 13, 4 (Winter 1984), 5-9.
78. Zamyatin, Y. We. Penguin Books, New York, 1983 (originally published in Russian, 1920).

CR Categories and Subject Descriptors: J.1 [Computer Applications]: Administrative Data Processing; K.4.1 [Computers and Society]: Public Policy Issues; K.4.2 [Computers and Society]: Social Issues; K.5.2 [Legal Aspects of Computing]: Governmental Issues

General Terms: Human Factors, Legal Aspects, Management, Security

Additional Key Words and Phrases: Data protection, data surveillance, dataveillance, front-end verification, mass surveillance, matching schemes, profiling, surveillance

Author's Present Address: Roger Clarke, Department of Commerce, Australian National University, GPO Box 4, Canberra ACT 2601, Australia.

Permission to copy without fee all or part of this material is granted provided that the copies are not made or distributed for direct commercial advantage, the ACM copyright notice and the title of the publication and its date appear, and notice is given that copying is by permission of the Association for Computing Machinery. To copy otherwise, or to republish, requires a fee and/or specific permission.


[18]
Public assessment of new surveillance-oriented security technologies: Beyond the trade-off between privacy and security

Vincenzo Pavone
CCHS-CSIC, Spain

Sara Degli Esposti

The Open University Business School, UK

Abstract As surveillance-oriented security technologies (SOSTs) are considered security enhancing but also privacy infringing, citizens are expected to trade part of their privacy for higher security. Drawing from the PRISE project, this study casts some light on how citizens actually assess SOSTs through a combined analysis of focus groups and survey data. First, the outcomes suggest that people did not assess SOSTs in abstract terms but in relation to the specific institutional and social context of implementation. Second, from this embedded viewpoint, citizens either expressed concern about government's surveillance intentions and considered SOSTs mainly as privacy infringing, or trusted political institutions and believed that SOSTs effectively enhanced their security. None of them, however, seemed to trade privacy for security because concerned citizens saw their privacy being infringed without having their security enhanced, whilst trusting citizens saw their security being increased without their privacy being affected.

Keywords lay expertise, public understanding of science, risk perception, science attitudes and perceptions, technology assessment

Corresponding author: Vincenzo Pavone, Institute of Public Policies, CSIC - Consejo Superior de Investigaciones Cientificas, Centro de Ciencias Humanas y Sociales, Calle Albasanz, 26-28, 28037 Madrid, Spain. Email: [email protected]

1. Introduction

Global threats like international terrorism and transnational organized crime have constituted a serious challenge for domestic and foreign security since the end of the Cold War. Although an effective


response to these threats remains a contested issue, after 9/11 several Western governments, including the European Union, have pursued new policies, which rely heavily on surveillance-oriented security technologies (SOSTs),1 to foster a proactive attitude towards terror and crime (Rasmussen, 2006; Grevi et al., 2009). While expected to enhance national security (EC, 2004), these technologies are subjecting ordinary citizens to an increasing amount of permanent surveillance, often causing infringements of privacy and a restriction of civil rights (Levi and Wall, 2004; Lodge, 2007a; Webb, 2007). An important debate has, therefore, emerged around the impact of SOSTs on the degree of security and privacy enjoyed by society.

From a cost-benefit approach, the relationship between privacy and security is framed as a trade-off, whereby any security improvement gained through the introduction of SOSTs curbs the amount of privacy enjoyed by citizens. This trade-off is usually taken as a starting point to allow for cross-country comparisons or to study how individual characteristics influence people's attitudes (Bowyer, 2004; Davis and Silver, 2004; Strickland and Hunt, 2005; Riley, 2007). However, the trade-off approach is far from uncontroversial. It has been argued, for instance, that it presents privacy and security as abstract categories, instead of enacted social practices emerging from the interaction between people and their social and institutional context (Dourish and Anderson, 2006). By presenting the debate on security and liberty as a zero-sum game, the trade-off model also functions as a rhetorical device that reduces public opposition to a mere problem of making the necessary sacrifice for the sake of national security (Monahan, 2006; Tsoukala, 2006). As a consequence, alternative frames, which emphasize not only the technocratic and authoritarian implications of emerging security policies based on SOSTs but also the risks of function creep, data commercialization and social discrimination, have been marginalized within the security debate (Spence, 2005; Amoore, 2006; Liberatore, 2007; Lodge, 2007b; Cote-Boucher, 2008).

The present study tries to cast some light on the relationship between security and privacy as it is framed by the lay public when assessing the introduction of new SOSTs. Combining qualitative and quantitative methods, we try to explore whether citizens adopt a trade-off approach and which factors citizens may take into consideration when performing this assessment. We use data gathered, as part of the PRISE project, in Spain and five other European countries during 2007. The article is organized into six sections: presentation of the theoretical framework; a brief methodological note explaining the research design adopted; an in-depth presentation of the Spanish focus group interviews; the analysis of the cross-country survey data collected; limitations; and conclusions.

2. Technology, security and democracy

Over the past ten years, in the face of global terrorism, nuclear proliferation, and transnational organized crime, new approaches to safeguarding national and personal security have emerged (NATO, 2001; Rasmussen, 2001; Coker, 2002; European Union, 2003). As a result of the spatial and temporal unpredictability of criminal actions and of their global repercussions, the traditional means-ends rationality applied to the deterrence of specific local threats is gradually being replaced by a so-called global risk management perspective (Spence, 2005; Williams, 2005, 2008; Heng, 2006; Rasmussen, 2006; Kessler, 2008). From this viewpoint, a safer society is often pursued through the implementation of security policies that increasingly rely on the deployment of SOSTs and interconnected data exchange systems in order to transform unknown threats into predictable events (Zureik and Hindle, 2004; Beck and Lau, 2005; Juels et al., 2005; Zureik and Salter, 2005; Amoore, 2006; Rasmussen, 2006; Tsoukala, 2006; Lodge, 2007b; Lyon, 2007; Cote-Boucher, 2008; Muller, 2008). For instance, the EU has recently stated: "The fight against terrorism and organised crime, the protection of external European borders and civil crisis management have gained importance in our daily life .... At the same time, internal and external security are increasingly inseparable. Addressing them requires the use of modern technology" (EC, 2009: 2).

However, whilst any real improvement of security is yet to be demonstrated (Webb, 2007), new SOSTs are subjecting ordinary citizens to a pervasive system of surveillance (Monahan, 2006). Moreover, the lobbying effort of the industry encourages the uptake of SOSTs as part and parcel of new security policies across the globe (Hayes, 2009), both at national borders (Spence, 2005) and within national boundaries (Lyon, 2009). Whilst in principle new security technologies may not be surveillance-oriented, many of those that are going to be introduced as part of new security policies are likely to impose massive surveillance. For this reason, their introduction is often justified in terms of a beneficial trade-off, whereby the amount of privacy lost is allegedly compensated by an increase in national and social security. For instance, presenting the 2007 rules on air passenger information exchange to fight terrorism, the Vice-President of the EU Commission, Franco Frattini, stated: "Our goal remains preserving the right balance between the fundamental right to security of citizens, the right to life and the other fundamental rights of individuals, including privacy and procedural rights" (Europa, 2007).

On the basis of this argument, many European countries are experiencing a gradual restriction of civil rights, which has provoked two main reactions, ranging from those who support SOSTs and justify the reduction of civil liberties in the name of national security, to those who believe these restrictions to be undemocratic, unjustified and useless (Tsoukala, 2006). As a consequence, a variety of participatory technology assessment processes have been organized in Europe to address what is perceived as a crisis of both cognitive and democratic legitimacy (Beck et al., 2003: 14-15). The EU and national governments seem to find themselves trapped in what Juliet Lodge defined as a proximity paradox: "At the very time when technology has made it possible to bring the EU closer to the citizen more directly than ever before, EU citizens and residents appear to be more suspicious of it than ever before" (Lodge, 2005: 535). As a result, while governments feel compelled to introduce new SOSTs, the public perception of these new technologies gradually emerges as a sort of universal measure to guide governmental actions (PRISE Report, 2008).

Inspired by the trade-off approach, these studies have paid special attention to citizens' perceptions of SOSTs in order to elaborate responsible and shared guidelines on their implementation and regulation, but they have a tendency to assume that the lay public is merely concerned about acceptable balances between privacy and security (Bowyer, 2004; Strickland and Hunt, 2005; Riley, 2007). Privacy is generally defined as the right of the individual to have one's personal information protected from the undue prying eyes of government and private organizations seeking to use such personal information for trade and profit, without the consent of the individual, except in exceptional circumstances dictated by the law (Riley, 2007: 2). However, security has been defined in different ways. For instance, it might refer to the right and duty of national governments either to enforce the laws and taboos of society (Van Loenen et al., 2007) or to protect its geopolitical and economic integrity (Amoore, 2006). In this study, we define security as the right and duty of national governments to ensure citizens' personal safety, otherwise presented as "freedom from fear" (Manners, 2006: 192) or "human security" (Liotta and Owen, 2006: 40).

According to these studies, new SOSTs not only force governments to make a clear distinction between those liberties that can be sacrificed to security needs and those that cannot be included in the trade-off (Bowyer, 2004), they also encourage citizens to trade part of their privacy in exchange for enhanced security. However, as noted by Gaskell et al. (2004) in studying public opposition to genetically modified food, people tend to approach new technologies along a trade-off model only when they consider these technologies as risky and useful at the same time. Applying this to SOSTs, only those who perceive both risks and benefits of SOSTs may approach them along the lines of a trade-off. In contrast, people who believe that SOSTs are privacy infringing but not security enhancing (concerned attitude) face a lose-lose situation, whilst in the opposite case (trusting attitude) people face a win-win situation. In neither case, however, does there seem to be a trade-off at stake (see Table 1).

Table 1. Interpretation of SOSTs as security enhancing and/or privacy infringing devices.

                                   Security enhancing: Yes    Security enhancing: No
Privacy infringing: Yes            Trade-off model            Concerned attitude
Privacy infringing: No             Trusting attitude          Uninterested

Moreover, within the trade-off approach literature, other studies have suggested that not only is the acceptance of SOSTs context-dependent (Furnell and Evangelatos, 2007) but it is also influenced by demographic, institutional and cultural factors (Davis and Silver, 2004). In fact, if trust in institutions depends on the type of technology in use, trust in technology also depends on citizens' confidence in the institutions using the technology (Knights et al., 2001; Lodge, 2007a). As a consequence, from a more socially embedded perspective, several authors have pointed out how the emphasis of the trade-off approach on an abstract balance between privacy and security purposively obscures a number of ethical, social and political implications increasingly associated with the introduction of new SOSTs. These implications, in turn, not only may be common experience to many citizens, they may also influence whether citizens consider SOSTs as a solution to a security problem or, rather, as a threat to privacy.

Several questions have been raised about the retention, ownership and exchange of the data produced through SOSTs. In the case of biometric information, for instance, it should be noticed that once in the database, not only do the data no longer belong entirely to the physical holders, but they can also be stolen, commercialized or used for political purposes. In spite of these risks, the EU has not been able to elaborate and introduce a common and effective juridical framework to protect citizens from potential abuses (Lodge, 2007a, 2007b). Given the actual flow of data among EU countries and the lack of a common juridical protection, biometric technologies constitute a serious challenge to current norms of democratic accountability (Lodge, 2007b), although recent efforts to enhance citizen participation may represent a first step towards more accountable practices (Liberatore, 2007).

The breadth and impact of these technologies is especially visible when they are employed at national borders. On the one hand, through the diffusion of SOSTs, national borders have been extended well beyond their geographical locations, constituting "diffuse" borders where the state of law remains permanently suspended. On the other hand, these technologies are endorsing a discriminating process of risk profiling, a procedure through which all monitored citizens are encoded with a risk profile, turning people into low-risk "trusted travellers" or high-risk suspicious immigrants (Cote-Boucher, 2008: 145). This procedure, usually carried out by private companies (Krahmann, 2010), ends up endorsing certain types of global transfers (businessmen, managers) whilst restricting others, like immigrants or asylum seekers (Amoore, 2006; Muller, 2008).

Taking recent critiques of the trade-off model of privacy and security and current studies on the social and political implications of SOSTs as a starting point, this study aims to explore in deeper empirical detail how SOSTs are framed by the lay public. Thus, in the remainder of the article we try to address the following research questions:
them along the lines of a trade-off. ln contrast, people who believe that SOSTs are privacy infringing but not security enhancing (concerned attitude) face a lose-lose situation, whilst in the opposite case (trusting attitude), people rather face a win-win situation. ln neither case, however, does there seem to be a trade-off at stake (see Table 1). Moreover, within the trade-off approach literature, other studies have suggested that not only is the acceptance of SOSTs context-dependent (Furnell and Evangelatos, 2007) but it is also influenced by demographic, institutional and cultural factors (Davis and Silver, 2004). ln fact, if trust in institutions depends on the type of technology in use, trust in technology also depends on citizens' confidence in the institutions using the technology (Knights et al., 2001; Lodge, 2007a). As a consequence, from a more socially embedded perspective, several authors have pointed out how the emphasis of the trade-off approach on an abstract balance between privacy and security purposively obscures a number of ethical, social and political implications increasingly associated with the introduction of new SOSTs. These implications, in turn, not only may be common experience to many citizens, they may also influence whether citizens consider SOSTs as a solution to a security problem or, rather, as a threat to privacy. Several questions have been raised about the retention, ownership and exchange of the data produced through SOSTs. In the case of biometric information, for instance, it should be noticed that once in the database, not only the data no longer belong entirely to the physical holders, but they can also be stolen, commercialized or used for political purposes. In spite of these risks, the EU has not been able to elaborate and introduce a common and effective juridical framework to protect citizens from potential abuses (Lodge, 2007a, 2007b). Given the actual flow of data among EU countries and the lack of a common juridical protection, biometric technologies constitute a serious challenge to current norms of democratic accountability (Lodge, 2007b ), although recent efforts to enhance citizen participation may represent a first step towards more accountable practices (Liberatore, 2007). The breadth and impact of these technologies is especially visible when they are employed at national borders. On the one hand, through the diffusion of SOSTs, national borders have been extended well beyond their geographical locations, constituting "diffuse" borders where the state of law remains permanently suspended. On the other hand, these technologies are endorsing a discriminating process of risk profiling, a procedure through which all monitored citizens are encoded with a risk profile, turning people into low risk "trusted travellers" or high risk suspicious immigrants (Cate-Boucher, 2008: 145). This procedure, usually carried out by private companies (Krahmann, 2010), ends up endorsing certain types of global transfers (business men, managers) whilst restricting others, like immigrants or asylum seekers (Amoore, 2006; Muller, 2008). Taking recent critiques to the trade-off model of privacy and security and current studies on the social and political implications of SOSTs as a starting point, this study aims to explore in deeper empirical detail how SOSTs are framed by the lay public. Thus, in the remainder of the article we try to address the following research questions:

Security and Privacy Public Understanding

560

Downloaded by [University of California, San Diego] at 07:06 21 April 2017

311

of Science 21 (5)

(Ql) Do people actually evaluate SOSTs as pnvacy infringing and security enhancing as assumed in the trade-off model? (Q2) Which security threats do they consider as most urgent and compelling? (Q3) Are people who are concerned for their privacy aware of the risks of function creep for political purposes, data commercialization, and social discrimination? (Q4) In what circumstances and under what conditions would people accept the introduction of new technologies for security reasons? (Q5) How do they feel about public participation in security policy decision-making?

3. Methodology The data hereby analysed and discussed have been retrieved from May to July 2007 in six European countries as part of the PRISE project. 2 The methodology employed is called "interview, meeting" 3 which consists of group interviews complemented by a questionnaire. Each meeting involves around 30 participants without any expert or professional knowledge about the technology at stake and of different ages and educational backgrounds. Before the meeting, participants receive informative material about the new technology and its potential implications. 4 The meeting begins with an expert introducing the topic, where the main advantages and disadvantages of the technology are discussed and the participants have the opportunity to ask questions. Subsequently, the participants complete a questionnaire and then divide into focus groups of6-9 people with a mediator. Discussion lasts approximately an hour and aims at revealing participants' opinion on the main issues at stake. This technique is expected to increase the reliability of results because the qualitative component allows enriching the set of feasible explanations beyond thcmy, while the questionnaire isolates individual attitudes and provides a useful benchmark against weak or spurious inferences. We must clarify that the interview meeting pursues different objectives compared to a survey. In fact, it is not representative of any distribution of preferences at national or regional level. In contrast it uses demographic characteristics to ensure heterogeneity of opinion (sec Table 2). The idea is to explore the range of possible reactions SOSTs can generate. Running an interview meeting is particularly suitable in cases where prior public knowledge is limited and the issues at stake are either technically complex or pose ethical or political dilemmas.

4. Presentation of the qualitative results: The Spanish case The relationship between privacy and security can better be analysed from a more discursive approach that considers privacy and security as social products and technology as a site where social meaning Table 2. Demographic characteristics of the participants to the study.

Country

Younger Having Male than 50 children % % %

Austria Germany Denmark Spain Hungary Norway

41.2 61.9 51.9 42.4 50.0 42.3

43.8 57.1 55.6 66.7 52.9 60.0

52.9 42.9 74.1 48.5 55.9 80.8

Living with children %

Living alone %

Undergraduate or inferior education %

Living in a metropolitan area %

Participants No.

5.9 14.3 40.7 45.5 44.1 69.2

52.9 47.6 22.2 12.1 23.5 3.8

47.1 52.4 22.2 69.7 52.9 46.2

82.4 81.0 81.5 81.8 85.3 69.2

16 21 27 33 34 25

312

Security and Privacy

Downloaded by [University of California, San Diego] at 07:06 21 April 2017

Pavone and Degli Esposti

561

is constantly negotiated and produced (Dourish and Anderson, 2006 ). Consequently, we first explored how different nanatives of SOSTs may be constructed by analysing records from the Spanish meetings. Spain represents an interesting case study as its society is known to hold a supportive attitude towards the development and application of new technologies (Eurobarometer, 2003, 2005, 2006; FECYT, 2005, 2007). Yet, Spanish citizens have a critical attitude towards their political and administrative institutions, especially with regards to data retrieval and protection (Eurobarometer, 2008a, 2008b, 2009). Besides, Spain has been the target of both ETA and Muslim fundamentalist terr01ist activities, especially in the city of Madrid, where the meeting was carried out.

Technology, security and democracy in Spain Whilst, in principle, the majority of the participants in the focus groups acknowledged that the introduction of new SOSTs may give rise to a potential trade-off between security and privacy, in practice, the participants divided into two separate groups. One group clearly stated that if we have nothing to hide there is no problem in being monitored; whilst the other group argued that if we have nothing to hide there is no reason to be monitored. These things are necessary; they help us to move on with transparency, if you have nothing to hide ... Tthink it is necessary. Tf Thave nothing to hide, why should they monitor me? Confirming Gaskell et al.'s (2004) analysis, therefore, these two groups actually considered these technologies either 1isky and useless, or useful and harmless. The first group felt that the privacy margins are increasingly smaller, and suggested that the new security measures may be used for perverse and illegitimate purposes. In their opinion, not only is the increasing implementation of new SOS Ts not justified by real dangers but it also fuels a growing fear among the citizens, which is purposively encouraged by the government for political purposes of control and manipulation. More specifically, they seemed especially concerned with the political use of the data collected, "They look at us; they control us and who is in possession of these data? If later there is a change in government, those who have been under scrutiny and are considered negatively by the government will easily find themselves on a black list ... " Otherwise, the commercial implications of using fear to promote business interests were mentioned: "In this consumer society, they offer you something that is presented as absolutely necessary, but in fact it is not necessary at all." The second group of participants supported the introduction of new security measures to contrast what they perceive as real risks, proceeding from different sources: terrorism, organized crime as well as common criminality. These participants, however, did not feel that their privacy was affected because they believed they had nothing to hide and nobody would be interested in monitoring ordinary people's lives. In their view, these new SOSTs increased security without affecting their privacy. They did not see themselves as potential suspects. In general, the ascription to either group seemed to be strongly affected by people's level of trust towards the contextual interplay between technologies and institutions. People displaying a low level of trust towards the scientific and political institutions implementing these technologies tended to join the first group, prioritizing privacy and expressing concern that these technologies could be manipulated and diverted to other purposes. In contrast, the people more trustful towards both technologies and institutions tended to join the second group, prioritizing security and showing a much lower level of concern for privacy.

Security and Privacy 562

313 Public Understanding

of Science 21 (5)

Downloaded by [University of California, San Diego] at 07:06 21 April 2017

Effectiveness and appropriateness Although crucial, privacy was not the only concern of the participants, who often questioned the effectiveness and appropriateness of SOSTs. For instance, some participants criticized the surveillance implications of these technologies: "Do we really need smart cameras? Sometimes, to have more security you just need to put more light in the streets ... I mean you feel more secure, and nobody is watching you." Other participants questioned that SOSTs will ever be able to cover all the threats and argued that criminals are capable of fooling the security systems. I believe that catching a plane does not carry the same risk of shopping in a shopping mall, that is, they can always put a bomb in a plane as well as in any other place, and you can't monitor all of them [i.e. the terrorists]. Well, 1 believe that no matter how many cameras, how much security you have, 1 believe that the terrorists are actually benevolent with us ... they can always fool all these technologies. Other participants actually argued that the real effectiveness of these technologies depends on the capability of the operators to deal with the acquired information. In this respect, "capability" implied technical/professional as well as moraVethical aspects. I don't know, how many of these CCTV cameras arc attended by security guards, which is a job like many others, I mean it docs not entail special requirements. I mean, if you spend your time monitoring people and you have to decide whether anybody is showing "strange" behavior, you really need to have some knowledge about people's attitudes. I believe that they should be careful about who is going to have access to our data, to all our data, to all our private things. In fact, more concerns were expressed about "interpretation," that is, being judged on the basis of the gathered information, than about being monitored. As suggested by Amoore (2006), the participants also raised concerns about the pervasiveness of cliches and stereotypes, generally used to match the interpretative scheme of those who are in charge of the security systems. No, I believe that sometimes there is a risk of confusion ... I know what happened to me when I went to Miami, I had some problems, especially after the 11th of September ... they stopped me all the time to ask for my documents, to ask whether I really was Spanish and put me in the cabin to check my luggage. Why was that so? Because I look like an Arab or a Mexican ... you can feel badly when these things happen ...

Not everything goes As a result, there was consensus about increasing security through the adoption of new technologies only in specific cases and places strictly belonging to the public sphere. In the private sphere, the use of invasive technologies was accepted when urgency and gravity are of the highest level and there is no alternative, such as in the case of gender violence and sexual harassment: "violence against women, in this sense we ought to put much more effort in tem1s of secmity," and paedophilia: "the type of criminals that keep committing the same crime, such as rape and paedophilia ... these people should be monitored much more intensively." Though with less emphasis, the use of SOSTs was also accepted to fight terrorism.

314

Security and Privacy

Pavone and Degli Esposti

563

Downloaded by [University of California, San Diego] at 07:06 21 April 2017

I believe these measures are appropriate for international crime and terrorism, this is clear ... I mean, the citizen should know that these measures may occasionally be annoying and there are people who cannot stand them, cannot stand being controlled in the airport, and so on, but it is for their benefit. If there were no terrorism, all of that would not be necessmy. In fact, all participants expressed concern that the ctiort in increasing security measures would be concentrated only against terrorism and only in places considered as sensitive targets. As a result, the citizens would be left unprotected and vulnerable in other places and in relation to other types of crimes: "I am personally worried about other types of crimes ... the introduction of new technology is fine, but it should not merely focus on terrorism." In this respect, they feared that a new "elite" security was emerging, which considered terrorism as the main danger, whilst ordinary people would typically give priority to other types of crimes: "There are far more victims of gender violence than terrorism, I do not know the exact figures, but I am pretty sure about this."

Participation Although positive about participatmy processes of development and implementation of the new technology, the participants assigned more importance to the transparency of information and effectiveness of general rules than to direct participation. However, the issue of public participation encouraged the emergence of two groups: one tried to identify who was supposed to participate, whilst the other tried to single out who was not supposed to pmiicipate. In general, there was agreement on the importance of involving expe1is, consumer organizations and human rights associations, but the participation of ordinary citizens and politicians was more controversial. Citizens supporting the participation of the lay public specified that their participation would ensure that their interests were respected and their concerns taken into account: ''Because, if things go wrong these are the people who are going to suffer from the consequences, both negative and positive." Sceptical participants replied that not only would it be impossible for ordinary citizens to reach a viable consensus: "because I believe that the ordinary citizen ... that we would never reach a consensus on these measures, never" but also that citizens were not properly informed or prepared to effectively participate: I believe that these should be highly qualified people, or maybe the city council, or those who will actually be responsible for their operation in the city or in the specific place where these technologies are going to be used ... because I believe that the citizen will never be [qualified]. The participation of politicians was even more controversial. Some participants argued that politicians needed to be involved to guarantee the correct implementation of these technologies, whilst other participants voiced a deep scepticism about the real value and capability of their political elite: "Maybe it is better to keep the politicians out, maybe they should not express their opinion because they may give a very personal opinion, it would be better to have others who might be able to sec our interest in a more objective way." Sharing a very negative opinion on banks and multinational corporations because of their lack of social responsibility, nearly all participants insisted that these organizations should not be involved in the participatory process: "Banks arc not monitored, telephone companies arc not monitored. Only terrorists arc monitored, but there arc other forms of terrorism, like the one operated by banks charging far more than they should, that arc not monitored." Tn the end, participants agreed on the necessity of introducing clear regulative and participative frameworks, in which also the judges are expected to play a significant role, given that they are responsible for the correct utilization of these technologies: "When there is need to violate privacy,

Security and Privacy 564

315 Public Understanding

of Science 21 (5)

this should be always authorized by a judge, who has to decide the methods as well as the appropriate time and space constraints."

Downloaded by [University of California, San Diego] at 07:06 21 April 2017

5. Security and privacy: Beyond the trade-off model When citizens face a recently installed security camera in the street, some consider the camera as a technical device that may discourage criminal acts and potentially increase their security. In contrast, others believe that the camera is a tool for political propaganda that does not increase security but instead threatens individual privacy. This situation often emerged during the focus groups, which suggests that participants tended to have polarized opinions and privacy was not the only issue at stake. In fact, people's opinion about SOSTs might also be affected by other factors, which related more to how these technologies addressed their social priorities and to the social and institutional context in which they were going to be implemented. The more citizens actually trusted public institutions, the more they saw security as a priority, which SOSTs effectively addressed. In contrast, the more they distrusted these institutions, the more they considered privacy as a priority, expressed doubts about the way in which security as an issue was constructed, and questioned whether technology could really be a solution to security threats. Inspired by these outcomes of the Spanish focus groups, we decided to explore the existence and the background of these contrasting views in the whole sample. First, we selected a set of questions from the survey, which explicitly associated a specific technology (e.g. scanning, eavesdropping, CCTV, etc.) either to privacy infringement or to security enhancement. Each question was measured through a 5-point Likert scale from "completely agree" to "completely disagree," without a forced Table 3. Exploratory factor analysis to measure Privacy and Security. Factor loadings: Rotated results SECURITY

PRIVACY

25. Storing biometric data (e.g. fingerprints or DNA samples) of all citizens in a central database is an acceptable step to fight crime 33. Scanning of persons for detection of hidden items is an acceptable tool for preventing terror 38. The possibility of locating a suspect's mobile phones is a good tool for the police in investigating and preventing terror and crime 40. The possibility of locating all cars is a good tool for the police in investigating and preventing terror and crime

0.6830

--0.1988

0.7654

--0.0591

0.8478

0.1716

0.8114

--0.0475

43. Government institutions should store all data they find necessary for security reasons for as long as they consider it necessary 46. Scanning of and combining data from different databases is a good tool for police to prevent terror SO. Eavesdropping is a good tool for police investigation 32. CCTV surveillance infringes my privacy 37. The possibility of locating all mobile phones is privacy infringing 39. The possibility of locating all cars is privacy infringing 45. Scanning of and combining data from different databases containing personal information is privacy infringing

0.6811

--0.2822

0.8468

--0.0 I 08

0.6276

--0.0649

5 I. Eavesdropping is a serious violation of privacy

0.0546 0.0315 -0.0875

0.6879 0.8454 0.8586 0.8114

-0.2240

0.7238

-0.0939

316

Security and Privacy

Downloaded by [University of California, San Diego] at 07:06 21 April 2017

Pavone and Degli Esposti

565

answer. Then, we ran exploratory factor analysis (Lawley and Maxwell, 1962) and from twelve variables we obtained just two factors able to explain 63% of the total variance. 5 As components are- by construction- mutually orthogonal, we allowed factors to correlate by applying oblique rotation to the factor space. 6 This transformation made it relatively easy to identifY each variable with a single factor without altering the results (Garson, 2009). We named the factors "SECURITY" (SOSTs as secmity enhancing) and "PRIVACY" (SOSTs as privacy infringing), respectively (see Table 3)? By computing Spearman's rank correlation coefficients (Chen and Popovich, 2002) we observed an inverse significant relationship between the two dimensions. In other words, those citizens who believed that SOSTs enhanced their security tended to underestimate SOSTs' impact on their privacy, whilst those citizens who were concerned about SOSTs' impact on their privacy tended to neglect potential benefits in terms of security. This finding deserved further attention because it seemed to contradict an implicit assumption of the trade-off model, which considers SOSTs as both privacy infringing and security enhancing. We decided, therefore, to check whether this effect was produced at a different level. During the focus groups, we noticed that people's assessment of SOSTs was strongly affected (a) by their prior opinion on the norms and functioning of the institutional context in which SOSTs arc likely to be implemented, (b) by their prior opinion on the capability and accountability of those who actively usc these technologies and the data thereby retrieved and (c) by their estimated risk of function creep and abuses. We decided, therefore, to select from the survey those questions that dealt directly with these issues (see Table 4). By applying the same procedure explained before, we obtained two other factors that we called respectively "TRUST" (which related not only to the trustworthiness of the institutional context in which SOSTs were likely to be implemented but also to the legitimacy oftechnology as a solution to security problems) and "CONCERN" (which related to the opposite case, and denoted fear of function creep and abuses while it questioned SOSTs as an effective solution to security problems). 8 Finally, we computed partial correlation for all four factors (Figure 1). We could observe a significant positive correlation between "TRUST" and "SECURITY" as well as between "CONCERN" and "PRIVACY." Furthermore, we found a negative significant correlation between "TRUST" and "PRIVACY," and a negative, and yet not significant, correlation between "CONCERN" and Table 4. Exploratory factor analysis to measure Trust and Concern. Factor loadings: Rotated results TRUST IS. The security of society is absolutely dependent on the development and use of new security technologies 17. If you have nothing to hide you don't have to worry about security technologies that infringe your privacy 18. When security technology is available, we might just as well make use of it 16. Many security technologies do not really increase security, but are only being applied to show that something is done to fight terror 20. It is uncomfortable to be under surveillance, even though you have no criminal intent 21. New security technologies are likely to be abused by governmental agencies 22. New security technologies are likely to be abused by criminals

CONCERN

0.8269

0.0401

0.8111

-0.1465

0.8242

-0.0103

-0.0573

0.6902

-0.0173

0.6840

-0.2587

0.6662

0.1213

0.6559
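To make the procedure concrete, the following minimal sketch reproduces the pipeline just described: principal-component extraction of two factors, an oblique (promax) rotation, and a Spearman rank correlation between the resulting factor scores. It is not the authors' code (the paper reports using Stata 10, see note 5); the column names q15 to q22 and the file name survey.csv are hypothetical, and the sketch relies on the third-party Python package factor_analyzer.

```python
# Hedged sketch of the analysis described above; not the authors' code.
import pandas as pd
from scipy.stats import spearmanr
from factor_analyzer import FactorAnalyzer  # third-party package

df = pd.read_csv("survey.csv")  # one row per respondent (hypothetical file)
items = df[["q15", "q16", "q17", "q18", "q20", "q21", "q22"]]

# Principal-component extraction with an oblique (promax) rotation,
# which lets the two factors correlate instead of forcing orthogonality.
fa = FactorAnalyzer(n_factors=2, rotation="promax", method="principal")
fa.fit(items)
print(fa.loadings_)            # rotated loadings, comparable to Table 4

scores = fa.transform(items)   # per-respondent factor scores
trust, concern = scores[:, 0], scores[:, 1]
rho, p = spearmanr(trust, concern)
print(f"Spearman rho = {rho:.3f} (p = {p:.4f})")
```

An inverse, significant rho here would correspond to the relationship between the trusting and concerned orientations reported in the text.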

[Figure 1. Relationships among factors measured through partial correlation. The two significant positive paths (0.668** and 0.549**) link TRUST with SECURITY and CONCERN with PRIVACY. ** p < 0.001; * p < 0.05.]

"SECURITY." Tn other words, it seemed that a more trusting attitude led citizens to consider SOSTs as an effective solution to enhance security without infringing privacy. Tn contrast, a more concerned attitude led citizens to interpret surveillance-oriented security technologies merely as privacy infringing. These findings confi1med that the participants actually divided into two groups, trusting and concerned people, with different attitudes towards privacy and security. Even if these technologies imply higher surveillance, trusting people believed not only that SOSTs were effective measures to fight terror and crime, but also that their p1ivacy was not likely to be affected. Concerned people believed that these technologies only restricted privacy (see Figure 1). For them there was no tradeoff either, because they did not see security being increased while their privacy was affected. As robustness checks, "Security" and "Privacy" were regressed against "Trust" and "Concern" respectively and other control variables without any appreciable change. Results also remained consistent when we replicated the analysis after applying an orthogonal rotation of the factor axes (Varimax).

6. Limitations

Whilst its findings pave the way to the elaboration of alternative approaches to SOSTs and the privacy-security dilemma, this study has important limitations. First of all, we restricted our focus to security technologies that, to varying degrees, involved surveillance. As a result, the term security often seemed to act as a synonym for surveillance. However, the terms security and surveillance should not be conflated: increased surveillance might not necessarily imply increased security, and security technologies that do not involve surveillance, such as front or backdoor security lights that are motion sensitive and switch on automatically when people pass by, may provoke different reactions, as occasionally emerged in the focus groups. Although concerned participants not only criticized SOSTs for their privacy implications, but also questioned their appropriateness and effectiveness in terms of security enhancement, their main concerns might have more to do with surveillance (being monitored) than with security (being protected). As a result, further studies are needed to cast light on the relationship between security and surveillance, assessing not only whether citizens react differently towards security technologies not involving surveillance but also whether their reactions against surveillance-oriented technologies are merely due to their privacy implications. Second, we restricted our analysis to the security-privacy trade-off and its main social and political critiques. Other contested issues, such as the perceived health risks associated with the use of biometric technologies (McKinlay, 2008), have been left out not because they were irrelevant, but because they shared the same risk/benefit approach adopted by the trade-off model. Third, from the Spanish case we could learn that even when citizens are accustomed to surveillance devices, and generally hold a positive attitude towards new technologies, support for new SOSTs cannot be taken for granted. This insight calls for future research to investigate the wide range of institutional, personal, technical and cultural factors that affect people's attitudes towards SOSTs across different countries. Whilst this study cannot cast light on these differences, it demonstrates at least the need of studying these technologies in context. Finally, from a statistical viewpoint, running a structural equation model could have been more adequate for the analysis of the covariance structure. Owing to the limited sample size and to the exploratory character of our study, we preferred not to use it. In this study, we used 156 cases and always observed communalities below 0.6, which made the sample size appropriate for running each factor analysis (MacCallum et al., 1999). The Kaiser-Meyer-Olkin measure of sampling adequacy provides further evidence of the appropriateness of our choice. Indeed, we observed a middling result (0.736) in the TRUST-CONCERN case and a meritorious result (0.866) in the PRIVACY-SECURITY case.9 While the research design did not allow for a generalization it did, at least, support our hypotheses, suggesting the need for further research.
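The Kaiser-Meyer-Olkin statistic cited here is easy to compute alongside the factor analysis. A minimal sketch follows, reusing the hypothetical items DataFrame from the earlier example; the verbal labels ("middling", "meritorious") are Kaiser's conventional bands for the overall statistic, not thresholds introduced by the authors.

```python
# Sampling-adequacy check; a sketch assuming `items` from the earlier example.
from factor_analyzer.factor_analyzer import calculate_kmo

kmo_per_item, kmo_overall = calculate_kmo(items)
# Kaiser's conventional reading of the overall value:
# >= 0.9 marvelous, >= 0.8 meritorious, >= 0.7 middling, < 0.5 unacceptable.
print(f"Overall KMO = {kmo_overall:.3f}")
```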

7. Conclusions

In general, participants agreed that additional security, related to some aspects of ordinary life, was necessary. Yet, some participants questioned the appropriateness of using new SOSTs to address security problems, whilst others suggested restricting the adoption of new SOSTs only to specific crimes, in specific contexts and always under specific legal and institutional guarantees. Their main concern was to avoid political abuses and a deterioration of the democratic framework of law and rights. Other participants acknowledged that SOSTs may be useful against terrorism, but expressed concern that an over-emphasis on terrorism may come at the expense of other security threats that they perceive as more imminent and familiar. Many participants insisted that the use of these technologies for commercial and political purposes was not acceptable. They were aware that fear could easily be exploited for both economic and political purposes; therefore they vividly expressed concern for potential political abuses, pointing at the difficulty of assessing when a behaviour or an attitude of citizens may be defined as suspicious. In this respect, there was general awareness that errors may spring not only from the limits of the technologies but also from the limits of the people who operate them. Concerns about the professional and moral profile of the SOSTs' operators led the participants to emphasize the importance of clear rules and reliable mechanisms of sanction in case of human errors. As someone suggested, a superficial or dishonest application of SOSTs might raise as many security concerns as the threats SOSTs were expected to address. In their view, therefore, the introduction of new SOSTs (a) should be gradual and transparent; (b) should occur always in a context of clear rules and widespread information; (c) should be focused on specific cases and places; (d) should be proportionate to the danger and the situation; and, finally, (e) should affect the private sphere of intimate life as little as possible.

From these results, we can draw two main conclusions. First, without denying the existence and relevance of technology-specific implications, the qualitative data suggest that context-specific implications did exist and were utterly relevant. Citizens' concern about the introduction of SOSTs was often due more to their mistrust towards the institutions that were supposed to employ and regulate these technologies, than to their alleged lack of knowledge about science and technology. This outcome confirmed the validity of Brian Wynne's (2006, 2008) emphasis on the importance of public participation in technology assessment, because the lay public assesses technologies not only on the basis of technical information, but also on the basis of other types of knowledge (institutional, legal, social, moral), which are not technical but are nonetheless absolutely relevant when technologies jump from laboratory to policy and social domains. This complex, contextual form of assessment, therefore, is of great importance when policy decisions have to be taken, but is often neglected by current assessment exercises, which are based merely on decontextualized, technical expertise. From a contextual approach, it is also easier to understand why citizens assessing the introduction of new SOSTs seldom approached the relationship between security and privacy according to the trade-off model. The latter assumes that citizens consider SOSTs as both privacy infringing and security enhancing. In contrast, the participants in this study tended to divide into two main groups, which considered new SOSTs either as infringing privacy without enhancing security or vice versa. In fact, a deeper line of demarcation seemed to exist, running along the dilemma between two broad political attitudes, trust and concern, of which the divide between privacy and security was only a by-product. Trusting citizens considered security the main issue, whilst concerned citizens assigned priority to their privacy. The implications are decisive because the trade-off model generally frames privacy and security as exchangeable goods that could be traded. In contrast, trust and concern are political attitudes, which can change over time, depending on a number of personal and social factors, but cannot be "exchanged" or traded as if they were economic goods. Our second conclusion, therefore, is that in the domain of public assessment of new SOSTs the adoption of an economic trade-off model between security and privacy may be misleading. In a context of rising security concerns, expanding definitions of risk and growing governmental monitoring activities, these results shed some light on the persisting gap between the governmental and the lay public perception of the security agenda, as well as on the political implications of the new public discourse on security.

Acknowledgements

We would like to thank Johan Cas, coordinator of the PRISE project, Jose Manuel Rojo Abuin for his invaluable statistical support, Giuseppe Di Palma for his comments on earlier versions of this paper, Manuel Pereira Puga for his contribution to the draft of the PRISE Report, and the two anonymous reviewers for their excellent comments and suggestions.

Notes
1. As surveillance-oriented security technologies we refer to the following set of devices: biometrics (i.e. fingerprints and facial characteristics), biometric passport, closed circuit television (CCTV), automatic face recognition, automatic number plate recognition (ANPR), passenger scanning (i.e. naked machines), locating technology (i.e. GPS, eCall), data retention, total information awareness (TIA), eavesdropping. These technologies are elsewhere defined as surveillance technologies (Petersen, 2001); yet, in our definition, we prefer to emphasize the fact that these technologies are normally employed for security purposes.
2. PRISE stands for: "Privacy enhancing shaping of security research and technology - A participatory approach to develop acceptable and accepted principles for European Security Industries and Policies." The project was financed by the EU Commission, under the PASR 2005 (Preparatory action for security research).
3. For more information about this method refer to the Danish Board of Technology's website.
4. This type of methodology is meant to overcome some of the limits described by Haggerty and Gazso (2005) regarding omnibus survey research about surveillance and privacy.
5. We use the principal component extraction method (Hotelling, 1933) and retain factors with eigenvalues greater than one. All analyses are conducted using STATA 10.
6. We use the Promax routine and set the power equal to 2.
7. The internal consistency of our instrument results in a Cronbach's alpha of 0.89, which is considered acceptable (Nunnally, 1978).
8. Reliability was ensured by a Cronbach's alpha of 0.73.


9. "Stata II help for factor postcstimation" (9 September 2009) provided by StataCorp LP, 4905 Lakeway Drive, College Station, TX 77845 USA.


References

Amoore, L. (2006) "Biometric Borders: Governing Mobilities in the War on Terror," Political Geography 25(3): 336-51.
Beck, U. and Lau, C. (2005) "Second Modernity as a Research Agenda: Theoretical and Empirical Explorations in the 'Meta-Change' of Modern Society," British Journal of Sociology 56(4): 525-57.
Beck, U., Bonss, W. and Lau, C. (2003) "The Theory of Reflexive Modernization: Problematic, Hypotheses and Research Programme," Theory, Culture & Society 20(2): 1-33.
Bowyer, K.W. (2004) "Face Recognition Technology: Security versus Privacy," IEEE Technology and Society Magazine 23(1): 9-19.
Chen, P.Y. and Popovich, P.M. (2002) Correlation: Parametric and Nonparametric Measures. Thousand Oaks, CA: SAGE Publications.
Coker, C. (2002) Globalization and Insecurity in the Twenty-First Century: NATO and the Management of Risks. Oxford: Oxford University Press.
Cote-Boucher, K. (2008) "The Diffuse Border: Intelligence-Sharing, Control and Confinement along Canada's Smart Border," Surveillance and Society 5(2): 142-65.
Davis, D.W. and Silver, B.D. (2004) "Civil Liberties vs. Security: Public Opinion in the Context of the Terrorist Attacks on America," American Journal of Political Science 48(1): 28-46.
Dourish, P. and Anderson, K. (2006) "Collective Information Practice: Exploring Privacy and Security as Social and Cultural Phenomena," Human-Computer Interaction 21(3): 319-42.
Eurobarometer (2003) Europeans and Biotechnology in 2002. Eurobarometer Series n. 58.0.
Eurobarometer (2005) Social Values, Science & Technology. Special Eurobarometer Series n. 225.
Eurobarometer (2006) Europeans and Biotechnology in 2005: Patterns and Trends. Final report on Eurobarometer n. 64.3.
Eurobarometer (2008a) Data Protection in the European Union: Citizens' Perceptions. Analytical Report. Flash Eurobarometer Series n. 225.
Eurobarometer (2008b) Data Protection in the European Union: Data Controllers' Perceptions. Analytical Report. Flash Eurobarometer Series n. 226.
Eurobarometer (2009) Confidence in the Information Society: Analytical Report. Flash Eurobarometer Series n. 250.
Europa (2007) "Fight Against Terrorism: Stepping up Europe's Capability to Protect Citizens Against the Threat of Terrorism," Press release, 6 November. URL: http://europa.eu/rapid/pressReleasesAction.do?reference=IP/07/1649
European Commission (2004) "On the Implementation of the Preparatory Action on the Enhancement of the European Industrial Potential in the Field of Security Research, towards a Programme to Advance European Security through Research and Technology," COM(2004)72 final. URL (consulted May 2010): ftp://ftp.cordis.europa.eu/pub/era/docs/communication_security_030204_en.pdf
European Commission (2009) "A European Security Research and Innovation Agenda: Commission's Initial Position on ESRIF's Key Findings and Recommendations," COM(2009)691 final. URL (consulted May 2010): http://ec.europa.eu/enterprise/policies/security/files/mami/comm_pdf_com_2009_0691_f_communication_en.pdf
European Union, Council of the (2003) "A Secure Europe in a Better World: European Security Strategy. Brussels, 12 December 2003," URL (consulted September 2009): http://www.consilium.europa.eu/uedocs/cmsUpload/78367.pdf
FECYT (Fundación Española para la Ciencia y la Tecnología) (2005) Percepción Social de la Ciencia y la Tecnología en España 2004.


FECYT (Fundación Española para la Ciencia y la Tecnología) (2007) Percepción Social de la Ciencia y la Tecnología en España 2006.
Furnell, S. and Evangelatos, K. (2007) "Public Awareness and Perceptions of Biometrics," Computer Fraud & Security 2007(1): 8-13.
Garson, G.D. (2009) "Factor Analysis," in "Statnotes: Topics in Multivariate Analysis." URL (consulted September 2009): http://faculty.chass.ncsu.edu/garson/pa765/statnote.htm
Gaskell, G., Allum, N., Wagner, W., Kronberger, N., Torgersen, H., Hampel, J. and Bardes, J. (2004) "GM Foods and the Misperception of Risk Perception," Risk Analysis 24(1): 185-94.
Grevi, G., Helly, D. and Keohane, D., eds (2009) European Security and Defence Policy: The First 10 Years (1999-2009). The European Union Institute for Security Studies. Paris: Corlet Imprimeur.
Haggerty, K.D. and Gazso, A. (2005) "The Public Politics of Opinion Research on Surveillance and Privacy," Surveillance and Society 3(2/3): 173-80.
Hayes, B. (2009) "NeoConOpticon: The EU Security-Industrial Complex," Statewatch and the Transnational Institute. URL (consulted April 2010): http://www.statewatch.org/analyses/neoconopticon-report.pdf
Heng, Y. (2006) "The 'Transformation of War' Debate through the Looking Glass of Ulrich Beck's World Risk Society," International Relations 20(1): 69-91.
Hotelling, H. (1933) "Analysis of a Complex of Statistical Variables into Principal Components," Journal of Educational Psychology 24(6): 417-41.
Juels, A., Molnar, D. and Wagner, D. (2005) "Security and Privacy Issues in E-passports," Proceedings of the First International Conference on Security and Privacy for Emerging Areas in Communications Networks, pp. 74-88. Washington, DC: IEEE Computer Society.
Kessler, O. (2008) "From Insecurity to Uncertainty: Risk and the Paradox of Security Politics," Alternatives: Global, Local, Political 33(2). URL (consulted September 2009): http://vlex.com/vid/insecurity-uncertainty-risk-paradox-56022031
Knights, D., Noble, F., Vurdubakis, T. and Willmott, H. (2001) "Chasing Shadows: Control, Virtuality and the Production of Trust," Organization Studies 22(2): 311-36.
Krahmann, E. (2010) States, Citizens and the Privatisation of Security. Cambridge: Cambridge University Press.
Lawley, D.N. and Maxwell, A.E. (1962) "Factor Analysis as a Statistical Method," Journal of the Royal Statistical Society 12(3): 209-29.
Levi, M. and Wall, D.S. (2004) "Technologies, Security, and Privacy in the Post-9/11 European Information Society," Journal of Law and Society 31(2): 194-220.
Liberatore, A. (2007) "Balancing Security and Democracy, and the Role of Expertise: Biometrics Politics in the European Union," European Journal on Criminal Policy and Research 13(1-2): 109-37.
Liotta, P. and Owen, T. (2006) "Why Human Security?," Whitehead Journal of Diplomacy and International Relations 7(1): 37-55.
Lodge, J. (2005) "e-Justice, Security and Biometrics: The EU's Proximity Paradox," European Journal of Crime, Criminal Law and Criminal Justice 13(4): 533-64.
Lodge, J. (2007a) "A Challenge for Privacy or Public Policy - Certified Identity and Uncertainties," Regio: Minorities, Politics, Society 17(1): 193-206.
Lodge, J. (2007b) "Freedom, Security and Justice: The Thin End of the Wedge for Biometrics?," Annali Istituto Superiore Sanità 43(1): 20-6.
Lyon, D. (2007) "Surveillance Security and Social Sorting: Emerging Research Priorities," International Criminal Justice Review 17(3): 161-70.
Lyon, D. (2009) Identifying Citizens: ID Cards as Surveillance. Cambridge: Polity Press.
MacCallum, R.C., Widaman, K.F., Zhang, S. and Hong, S. (1999) "Sample Size in Factor Analysis," Psychological Methods 4(1): 84-99.


McKinlay, A. (2008) "Emerging EMF Technologies: Action on Possible Health Risks," Annals of Telecommunications 63(1-2): 5-9.
Manners, I. (2006) "Normative Power Europe Reconsidered: Beyond the Crossroads," Journal of European Public Policy 13(2): 182-99.
Monahan, T., ed. (2006) Surveillance and Security: Technological Politics and Power in Everyday Life. New York: Routledge.
Muller, B.J. (2008) "Securing the Political Imagination: Popular Culture, the Security Dispositif and the Biometric State," Security Dialogue 39(2-3): 199-220.
NATO (2001) "Chapter 2: The Transformation of the Alliance," in The Strategic Concept of the Alliance, NATO Handbook, pp. 33-58. Brussels: NATO Office of Information and Press.
Nunnally, J.C. (1978) Psychometric Theory. New York: McGraw-Hill.
Petersen, J.K. (2001) Understanding Surveillance Technologies: Spy Devices, Privacy, History, & Applications. Boca Raton, FL: CRC Press.
PRISE Report (2008) "D6.2 - Criteria for Privacy Enhancing Security Technologies," URL (consulted September 2009): http://www.prise.oeaw.ac.at/docs/PRISE_D_6.2_Criteria_for_privacy_enhancing_security_technologies.pdf
Rasmussen, M.V. (2001) "Reflexive Security: NATO and International Risk Society," Millennium: Journal of International Studies 30(2): 285-309.
Rasmussen, M.V. (2006) The Risk Society at War: Terror, Technology and Strategy in the Twenty-First Century. Cambridge: Cambridge University Press.
Riley, T.B. (2007) "Security vs. Privacy: A Comparative Analysis of Canada, the United Kingdom, and the United States," Journal of Business and Public Policy 1(2): 1-21.
Spence, K. (2005) "World Risk Society and War against Terror," Political Studies 53(2): 284-302.
Strickland, L.S. and Hunt, L.E. (2005) "Technology, Security, and Individual Privacy: New Tools, New Threats, and the New Public Perceptions," Journal of the American Society for Information Science and Technology 56(3): 221-34.
Tsoukala, A. (2006) "Democracy in the Light of Security: British and French Political Discourses on Domestic Counter-terrorism Policies," Political Studies 54(3): 607-27.
Van Loenen, B., Groetelaers, D., Zevenbergen, J. and de Jong, J. (2007) "Privacy versus National Security: The Impact of Privacy Law on the Use of Location Technology for National Security Purposes," in S.I. Fabrikant and M. Wachowicz (eds) The European Information Society: Leading the Way with Geo-Information, pp. 135-52. Berlin: Springer.
Webb, M. (2007) Illusions of Security: Global Surveillance and Democracy in the Post-9/11 World. San Francisco, CA: City Lights Books.
Williams, M.J. (2005) "Revisiting Established Doctrine in an Age of Risk," RUSI Journal 150(5): 48-52.
Williams, M.J. (2008) "(In)Security Studies, Reflexive Modernization and the Risk Society," Cooperation and Conflict 43(1): 57-79.
Wynne, B. (2006) "Public Engagement as a Means of Restoring Public Trust in Science: Hitting the Notes, but Missing the Music?," Community Genetics 9(3): 211-20.
Wynne, B. (2008) "Elephants in the Rooms Where Publics Encounter 'Science'? A Response to Darrin Durant, 'Accounting for Expertise: Wynne and the Autonomy of the Lay Public'," Public Understanding of Science 17(1): 21-33.
Zureik, E. and Hindle, K. (2004) "Governance, Security and Technology: The Case of Biometrics," Studies in Political Economy 73(Spring/Summer): 113-38.
Zureik, E. and Salter, M.B. (2005) Global Surveillance and Policy: Borders, Security, Identity. Cullompton: Willan Publishing.


Author Biographies

Vincenzo Pavone is permanent research fellow in social studies of sciences at the Institute of Public Policies of the Consejo Superior de Investigaciones Científicas (CSIC) Madrid, Spain. He studied political science at the University of Catania, Italy, and at the University of Kent, UK. He completed his PhD at the European University Institute of Florence, Italy. His main research interests are the political and social aspects emerging around the complex relationship between science, politics and society, with a special focus on public and professional understanding of security and medical biotechnologies, participatory governance and the so-called bio-based economy.

Sara Degli Esposti is a PhD student at The Open University Business School. She was awarded an MPhil in Business Administration and Quantitative Methods by the Department of Business Administration of the University Carlos III of Madrid after completing her degree in Sociology at the University of Trento, Italy. Her main research interests focus around innovation processes, public perception of new technologies, and methodological issues concerning the use of mixed methods research designs.


[19] EUROPEAN PROTECTIONISM IN CLOUD COMPUTING: ADDRESSING CONCERNS OVER THE PATRIOT ACT By John T. Billings†

I. INTRODUCTION

In recent years, both individuals and companies have embraced cloud computing as the future of information technology ("IT") architecture.1 An estimated seventy-six percent of Americans use cloud computing services today,2 and its use by businesses is estimated to more than double in the next three years.3 Many companies around the world now outsource data storage and processing to "the cloud,"4 seeking to save money on IT infrastructure costs, while benefiting from greater access and flexibility offered by the cloud.5

† J.D. and Institute for Communications Law Studies Certificate Candidate, May 2013, Catholic University of America, Columbus School of Law. The author would like to thank his family for their continuous support, as well as the CommLaw Conspectus staff for their hard work throughout the writing process.
1 Jared A. Harshbarger, Cloud Computing Providers and Data Security Law: Building Trust With United States Companies, 16 J. TECH. L. & POL'Y 229, 230 (2011).
2 Andrew R. Hickey, Cloud Computing Befuddles Consumers, Despite Use: Study, CRN (Aug. 9, 2011, 1:45 PM), http://commcns.org/13JsYWF (citing results of a recent study on "consumer familiarity" with cloud computing).
3 Maggie Holland, IBM Pulse 2012: Cloud Computing Use To Double By 2015, ITPRO (Mar. 7, 2012, 9:26 PM), http://commcns.org/Xfo24V.
4 The term "the Cloud" stems from computer network diagrams that depict the Internet as a vast cloud at the top of a network chain. Cloud Computing, ELEC. PRIVACY INFO. CTR., http://commcns.org/SSI37q (last visited Nov. 10, 2012). For present purposes, "the Cloud" refers to remote servers owned and operated by providers and made accessible to users by the Internet. Robert Gellman, Privacy in the Clouds: Risks to Privacy and Confidentiality from Cloud Computing 4 (Feb. 23, 2009), available at http://commcns.org/WLrmoy.
5 Mladen A. Vouk, Cloud Computing - Issues, Research and Implementations, 16 J. COMPUTING & INFO. TECH. 235, 235 (2008), available at http://commcns.org/XfofFi. The global market for cloud computing is predicted to increase from approximately $41 billion in 2011 to $241 billion in 2020. Jennifer Valentino-DeVries, More Predictions on the Huge Growth of 'Cloud Computing,' WALL ST. J. BLOGS (Apr. 21, 2011, 11:19 AM), http://commcns.org/VNkxXN.


Coinciding with this swell of cloud computing, consumers and their governmental representatives have become increasingly sensitive to how their personal information is protected by cloud providers.6 In response, many countries have taken steps to protect consumer data through legislative action.7 In October 1995, the European Union enacted Directive 95/46/EC in an effort to harmonize data protection laws across the E.U. Member States.8 Renowned as one of the most comprehensive data protection laws enacted in any country, the E.U. Directive has served as a model for legislation in many non-European countries.9 The United States, however, has never passed a comprehensive regulation on data privacy, instead relying on a sectorial approach to privacy regulation.10 In part due to the lack of a comprehensive data privacy law, European cloud customers have become reluctant to do business with cloud service providers based in the United States. In fact, according to a recent survey, seventy percent of Europeans are concerned with the security of their online data, due in large part to a mistrust of U.S. privacy protections.11 European consumers cite the USA PATRIOT Act ("Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act," or "PATRIOT Act"),12 which was enacted after the terrorist attacks of September 11, 2001, to grant the federal government more authority to obtain information about suspected terrorists, as emblematic of the United States' loose stance on

6 Alex Palmer, Report: 90% Of Consumers Worry About Online Privacy, DIRECT MARKETING NEWS (Feb. 10, 2012), http://commcns.org/Uyu2ru (finding that "[n]inety percent of U.S. adults worry about online privacy, while 41% do not trust most companies with their personal data"); Quentin Hardy & Nicole Perlroth, Companies Raise Concerns Over Google Drive's Privacy Protections, N.Y. TIMES (Apr. 25, 2012, 3:41 PM), http://commcns.org/VawFDy.
7 See generally Oliver Brettle & Nicholas Greenacre, White & Case LLP, Countries At A Glance - Data Privacy (2007), http://commcns.org/Xil28D (summarizing the relevant law and practice of each country in the areas of collection, processing, and transfer of personal data).
8 Joel R. Reidenberg, Resolving Conflicting International Data Privacy Rules in Cyberspace, 52 STAN. L. REV. 1315, 1329 (2000). See also What are EU directives?, EUROPEAN COMM'N, http://commcns.org/XHNHG2 (last updated June 25, 2012) (explaining that "EU directives lay down certain end results that must be achieved in every Member State ... but [national authorities] are free to decide how to do so"). Because the Directive had to be implemented on a local basis, there are some inconsistences in application of the Directive. Hon & Millard, supra note 48, at 28-29. This paper will focus on generally accepted applications of the Directive.


A. General Provisions

The Directive defines personal data as "any information relating to an identified or identifiable natural person ('data subject')" that may be identifiable "directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural, or social identity."128 It distinguishes between data controllers, those who "determined the purposes and means of the processing of personal data,"129 and data processors, who "process[] personal data on behalf of the controller."130 "Processing" is all inclusive, and pertains to "any operation or set of operations which is performed upon personal data ...."131 The Directive places obligations on data "controllers" who either operate within the European Economic Area ("EEA"), who are "established" in the EEA, or who "make[] use of" equipment located in the EEA.132 The data controller owes a duty to the data subject when the personal data is collected directly from the person.133 Additionally, the controller must implement appropriate technical and organizational measures against unauthorized processing.134 Of particular importance to cloud providers, the Directive prohibits the transfer of personal data outside of the EEA to a third country unless the third country "ensures an adequate level of protection."135 Because the cloud operates on a borderless network,136 this places a substantial burden on the data controller to ensure that consumer data is not transferred outside of the EEA or, if data is transferred outside the EEA, that the third country has been deemed to have an adequate level of protection.137 Currently, only a few countries outside

128 Data Protection Directive, supra note 125, art. 2(a).
129 Id., art. 2(d).
130 Id., art. 2(e).
131 Id., art. 2(b) (listing such actions as "collection, recording, organization, storage, electronic storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction"); see also Cloud Computing and EU Data Protection Law, COMPUTERWORLDUK.COM (Sept. 28, 2011, 4:00 PM), available at http://commcns.org/U4USZk.
132 Id., art. 4.
133 Id., art. 4.
134 Id.
135 Id., art. 25(1).
136 Mell & Grance, supra note 25, at 2.
137 Paul M. Schwartz, Data Protection Law and The European Union's Directive: European Data Protection Law and Restrictions on International Data Flows, 80 IOWA L. REV.


of the European Union have been declared by the European Commission to have the requisite level of protections to satisfy the Directive.138 The United States is not one of these countries deemed to have an adequate level of protection.139 This status has not been given to the United States because of the reach of the PATRIOT Act, its lack of a comprehensive privacy regulation, and its lack of a governmental agency devoted to privacy.140

B. Safe Harbors

To help bridge the differences between the U.S. approach to privacy and that of the E.U., the European Commission adopted Decision 520/2000/EC on July 26, 2000, which recognized certain safe harbors for transferring data into the United States.141 The U.S. Department of Commerce worked with the European Commission to develop U.S.-specific safe harbors, whereby U.S. organizations that promise to adhere to seven principles of privacy protection may transfer data with European companies.142 To qualify for the safe harbor, the U.S. organization must (1) adhere to seven Safe Harbor principles that ensure U.S.-based companies provide adequate privacy protection;143 and (2) publicly announce its compliance through certification letters filed annually with the Department of Commerce or its designee.144

C. Exemptions to Data Protection Directive

While a major goal of the Directive is to provide comprehensive protection of personal data, drafters attempted to strike a "balance between the right to be let alone and the legitimate interests of a society."145

471, 483-84 (1995).
138 These countries include Andorra, Argentina, Canada, Switzerland, Faeroe Islands, Guernsey, Israel, Isle of Man, and Jersey. Hon & Millard, supra note 47, at 26, 31.
139 Charles Batchelor, Privacy: US and EU Clash on Confidentiality, FINANCIAL TIMES (May 22, 2012, 4:56 PM), http://commcns.org/VaxCMo.
140 Id.
141 Commission Decision 2000/520/EC, of 26 July 2000 pursuant to Directive 95/46/EC of the European Parliament and of the Council on the Adequacy of the Protection Provided by the Safe Harbour Privacy Principles and Related Frequently Asked Questions Issued by the US Department of Commerce, 2000 O.J. (L 215) 7, 10-12 (EC), available at http://commcns.org/Xfrr3y.
142 See Safe Harbor Privacy Principles, EXPORT.GOV (July 21, 2000), http://commcns.org/Ycplsa.
143 These seven principles include: (1) notice; (2) choice; (3) access; (4) onward transfer; (5) security; (6) data integrity; and (7) enforcement. James M. Assey, Jr. & Demetrios A. Eleftheriou, The EU-U.S. Privacy Safe Harbor: Smooth Sailing or Troubled Waters?, 9 COMMLAW CONSPECTUS 145, 151-52 (2001).
144 Id. at 151.


In this vein, the Directive's scope does not cover a number of uses of personal data, including: 1) all activity falling not within the scope of "community law", such as national security, defense, public safety, economic or financial interests of the state, and criminal proceedings;146 2) activities conducted solely for research;147 3) activities "solely for journalistic purposes or the purpose of artistic or literary expression only if they are necessary to reconcile the right to privacy with the rules governing freedom of expression;"148 and 4) transfers of data to third countries without adequate privacy laws if that transfer is done with the data subject's consent, is pursuant to a contract, is required for a legitimate public interest, or if the transferor adduces adequate safeguards by the transferee.149

Most European countries have utilized the Directive's exception for "community law" and have passed laws specifically restricting data protection for national security reasons. For example, the Netherlands' national data protection law states that "this Act does not apply to the processing of personal data ... by or on behalf of the intelligence or security services referred to in the Intelligence and Security Services Act [or] ... for the purposes of implementing police tasks."150 The U.K. has similarly made personal data "exempt from any of the provisions of ... the data protection principles ... if the exemption from that provision is required for the purpose of safeguarding national security."151 Spanish law provides that data protection "shall not apply to the collection of data when informing the data subject would affect national defense, public safety or the prosecution of criminal offences."152

The Directive affords E.U. consumers with substantial rights and protections in their personal data, but the national security exemptions to the law allow data to be unprotected in the same instances when the PATRIOT Act applies. Critics who point to the PATRIOT Act as an example of the United States falling short of E.U. data privacy protections fail to recognize that the European Privacy Directive does not apply when national security is at risk, or even in

145 Stephen A. Oxman, Exemptions to the European Union Personal Data Privacy Directive: Will They Swallow the Directive?, 24 B.C. INT'L & COMP. L. REV. 191, 192-93 (2000) (quoting Ulrich U. Wuermeling, Harmonisation of European Union Privacy Law, 14 J. MARSHALL J. COMPUTER & INFO. L. 411, 414 (1996)).
146 Council Directive 95/46, art. 3(2), 1995 O.J. (L 281) 31, 39 (EC).
147 Id. at 42.
148 Id. at 41.
149 See id. at 46.
150 Wet Bescherming Persoonsgegevens [Dutch Data Protection Act] art. 2(2)(b-c), Stb. 2000, p. 302, available at http://commcns.org/11CWY78 (unofficial translation).
151 Data Protection Act, 1998, c. 29, art. 28(1) (U.K.), available at http://commcns.org/XHNQte.
152 LEY ORGÁNICA 15/1999, de 13 de diciembre, de Protección de Datos de Carácter Personal art. 24, (B.O.E. 1999, 15), available at http://commcns.org/Sb2YQz (unofficial translation).


the prosecution of criminal offenses. In fact, when national security considerations are invoked, a consumer's data, whether American or European, is not protected from the reaches of government surveillance.

V. CONCLUSION

A critical examination of United States and European Union law indicates that simply avoiding U.S.-based cloud service providers based on concerns about the PATRIOT Act will not necessarily protect consumer data. Due to the wide jurisdictional scope of the PATRIOT Act and MLATs with European nations, merely selecting a European-based cloud provider does not guarantee that consumer data will be beyond the reaches of the PATRIOT Act. If the cloud provider is a subsidiary of a U.S.-based company, has a data center located in the United States, or has "continuous and systematic" contacts with the United States, it will be within the jurisdictional reach of the PATRIOT Act. Even if a consumer chooses an E.U.-based provider that is outside the scope of the PATRIOT Act, the United States will still potentially have access to the data pursuant to MLAT treaties with its European allies.

The potential for intrusive governmental surveillance of personal data is not exclusive to the United States. Both the United States and the European Union allow the government substantial leeway in obtaining consumer information. In particular, the PATRIOT Act authorizes access to data for national security reasons, such as "international terrorism and clandestine intelligence activities."153 Similarly, the E.U. Directive makes an exception for "community law," which includes national security, defense, public safety, economic or financial interests of the state, and criminal proceedings,154 and authorizes "[m]ember States [to] adopt legislative measures to restrict the scope of the obligation and rights ... when such a restriction constitutes a necessary measure to safeguard ... national security."155 Thus, an E.U. consumer's data is subject to government confiscation for national security reasons regardless of the location of the cloud service provider.

153 USA PATRIOT Act of 2001, Pub. L. No. 107-56, § 505, 115 Stat. 365 (2001); see also Lakatos, supra note 63.
154 See Oxman, supra note 149, at 191 (citing Ulrich U. Wuermeling, Harmonisation of European Union Privacy Law, 14 J. MARSHALL J. COMPUTER & INFO. L. 411, 414 (1996)); see also Council Directive 95/46/EC, rec. 43, 1995 O.J. (L 281) 31 (EU) (stating that "restrictions on the rights of access and information and on certain obligations of the controller may similarly be imposed by Member States in so far as they are necessary to safeguard, for example national security ...").
155 Council Directive 95/46/EC, art. 13.


[20] Internet Intermediaries, Cloud Computing and Geospatial Data: How Competition and Privacy Converge in the Mobile Environment Lisa Madelon Campbell* [email protected]

Introduction

Changes in the way we access and use the internet are bringing into sharp focus the convergence between privacy and competition issues. Once thought to be a social good that would result from improvement in general consumer welfare flowing from an innovative and competitive marketplace, privacy is emerging as a fundamental element of the online world in which we live. Two trends are forging links between competition and privacy law issues: the economics of behavioural advertising and the monetisation of internet services; and the rise of internet intermediaries and their use of information about consumers that flows to them as part of the services they render. This convergence intensifies in the context of cloud computing and mobile commerce. This article will explore the effects of these developments on innovation and competition and novel legal issues that may arise.

The rise of internet intermediaries

By connecting billions of data processing, mobile and other devices through the internet and other forms of communication, we are in the process of creating a worldwide computer. The memory of this machine is the composite of all of the hard drives and storage devices wired to it. Whole archives of data pass through this worldwide computer every second and its software

is created continuously by our online behaviour. We program it with each site we access, the places we visit within that site and our interactions online. We are now inextricably intertwined with this worldwide machine: just as we have come to depend upon it in virtually every aspect of our lives, it also requires our input for programming and development.1 Partly in response to governmental exertion of control over the internet in the face of political upheaval, a United Nations report recently declared that curtailing individuals' access to the internet, save in exceptional circumstances, constitutes a violation of their civil and political rights: 'There should be as little restriction as possible to the flow of information on the internet'.2 While it is true that the internet has democratised the power flowing from information to a greater extent than ever before,3 complete faith in that development fails to take into account the filtering role that internet intermediaries play, as well as the effects of surveillance on the internet.4 Electronic risk profiles may result in discriminatory effects that are far more damaging than traditional racial profiling. Because they are based on a variety of personal attributes, only some of which receive legal protection, it may also be more difficult to challenge these categorisations before the courts.5 Internet intermediaries have been described as the group of organisations worldwide that give access to the internet and host, transmit and organise content as well as provide internet-based services.6 The term


'internet intermediaries' generally includes internet service providers, search engines, internet payment systems and online networking fora.7 Internet intermediaries have continuous access to a rich array of information about consumers, which they obtain through the services they provide.8 Intermediaries have information about everything that consumers do on the internet, not only the searches they conduct.9 They may use this information to refine their offerings or to build another service, or they may sell or trade the information. Many internet intermediaries offer free services that are ad-supported. For example, search engines such as Google in the United States and Baidu in China use auction-based advertising programs through which advertisers pay to deliver content aimed at search queries.10 As search engines partner with advertising companies and carriers, regulators have looked closely at the effects of their combined market power.11 With some exceptions, internet intermediaries generally have not been held responsible for content.12 Internet intermediaries play a pivotal role in the internet's global infrastructure since they help ensure continuous investment to allow for new applications and a growing base of users. Significantly, internet intermediaries also establish trust in the internet, an aspect of which involves the protection of individual privacy.13

Previously, a search page was the primary manner in which people accessed the internet and retrieved and contributed information. Increasingly however, social networks and highly personalised homepages are the starting point for individuals to participate online. As social networks replace search pages as the principal portal through which people access the internet, several search engines are expanding their business lines into the social network sphere, either by acquiring existing services or by creating new ones.14 For many online companies, margins rise quickly with scale, making bigger doubly better. This effect is even more pronounced with social networks: users gather where most of their friends are and advertisers follow them there. Facebook has close to 700 million users worldwide, far more than its one-time competitor Myspace.15 In China, social networking sites have grown in popularity in recent years, gaining most of their revenue from online advertising. There are now about 100 social networking sites operating, sheltered from major foreign competition in the world's second-largest economy.16 The Chinese social networking site Renren, which filed an initial public offering and is now being publicly traded, says it has hundreds of millions of users and is adding two million users a month from China's 1.3 billion population.17


Although it replicates the United States' Facebook model in many respects, there are important differences: whether a social network page posting is objectionable is determined by government authorities. Renren is also required to monitor advertisements on its websites, some of which are subject to special government review before they are posted. Like Facebook, however, Renren encourages users to participate showing their real identities and to share information about themselves as they socialise online with family and friends. The network's operation is supported by advertisements, online games and third party applications.18

The evolution of privacy online and social networks

There has been international consensus for some time, at least from a policy perspective, about the basic tenets of adequate privacy protection. Governments in Europe, North America and elsewhere have adopted public and private sector legislation that enshrines these fundamental principles either in whole or in part.19 The main components are: there should only be a limited collection of personal data, for legitimate purposes; individuals should have a choice about whether to disclose their personal data; individuals should have access to their personal data; personal data should be kept secure; and there should be accountability and remedies for enforcing privacy protection. If these principles operate effectively, consumers are made aware before the fact that personal information may be sought from them; they …


About the author
Lisa Campbell is Deputy Commissioner of the Fair Business Practices Branch at the Competition Bureau of Canada. Ms Campbell has worked in both the private and public sectors and has been involved in transnational matters, including investigations into the data handling practices of global corporations, and the regulatory implications of emerging online business models. As a litigator in private practice, Ms Campbell worked in the areas of criminal, competition and …

Notes
* The views expressed in this article are for the purposes of contributing to Competition Law International and any mistakes and all omissions are the author's. Many thanks to my colleague Travis Todhunter for his thoughtful comments in the preparation of this article. This article does not in any way reflect the views of the Competition Bureau of Canada.
1 Kevin Kelly, 'The Planetary Computer', Wired, July 2008 (San Francisco: Condé Nast Media Group, 2008), p 54.
2 Frank La Rue, UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, 3 June 2011. Available online at: www.un.org/apps/news/story.asp?NewsID=38608&Cr=press+freedom&Cr1=; also at: http://wired.com/threatlevel/2011/06/Internet-a-human-right/.
3 …, the power of the … (… Paperbacks, 2007), pp ….
4 Ibid, p ….
5 Jeffrey Rosen, The Naked Crowd: Reclaiming Security and Freedom in an Anxious Age (New York: Random House, 2004), p ….
6 'The Economic and Social Role of Internet Intermediaries', Organization for Economic Development and Cooperation, April 2010, prepared by Karine Perset, OECD's Directorate for Science, Technology and Industry. Available online at: www.oecd.org/dataoecd/49/4/44949023.pdf.
7 Ibid, p 9.
8 Randall C Picker, 'Competition and Privacy in Web 2.0 and the Cloud', 103:1 Northwestern University Law Review (2008).
9 Frank Pasquale, 'Beyond innovation and competition: the need for qualified transparency in internet intermediaries', 2010 Northwestern University Law Review, Vol 104, No 1, p 105, at p 113; available online at: http://ssrn.com/abstract=1686043.
10 See note 6 above, at p 12.
11 See note 9 above, at p 128.
12 One exception is the Italian conviction of three Google representatives over a video on the company's website which showed a child being bullied. The company indicated it will appeal the convictions. Available online at: www.pcworld.com/article/190190/google_to_appeal_italian_convictions_of_three_execs.html/.
13 See note 6 above, at p 8.
14 There were recent reports that Facebook may partner with Baidu, China's principal search engine, to create a social network in that country: http://blogs.forbes.com/ericsavitz/2011/04/11/facebook-baidu-reportedly-to-set-up-china-social-network-site/. Yandex, the largest search engine in Russia, acquired Moikrug, a social network in that country: http://eng.cnews.ru/news/top/indexEn.shtml?2007/03/27/242398.
15 … that the number of Facebook users in North America, where the market has arguably been saturated for some time, is dropping. Available online at: http://articles.cnn.com/2011-06-20/tech/people.shunning.facebook_1_facebook-user-active-users.
16 'China's Internet', Time Magazine, 17 February 2011; available online at: www.time.com/time/magazine/article/0,9171,2048171,00.html.
17 Available online at: http://tech.fortune.cnn.com/2011/04/20/how-renrens-ipo-is-setting-the-table-for-facebook.
18 Ibid. Weibo, a microblogging site like Twitter, is also very popular in China.


19 OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, adopted 23 September 1980, available online at: …