The Oxford Handbook of Digital Technology and Society (ISBN-13 9780190932596; ISBN-10 0190932597)

English. 799 pages. 2020.


Table of contents:
Cover
The Oxford Handbook of DIGITAL TECHNOLOGY AND SOCIETY
Copyright
Preface
Introduction
Challenges
Interdisciplinary Views of the Digital Society
Volume of Literature and Digital Tools
Constant Change
Chapters in the Book
Potential Audiences
Conclusion
Acknowledgments
Table of Contents
List of Figures
List of Tables
About the Contributors
Editors
Authors
Section 1: OVERVIEW
Chapter 1: Introduction to the Oxford Handbook of Digital Technology and Society: Terms, Domains, and Themes
Introduction
Terms and Growth of These Developments
Main Digital Technology and Society Issues and Contexts in Recent Books
Method
A. Theory and Conceptualization
B. Digital Technology
C. Issues
D. Contexts
E. Effects
Summary
Related Work
Purpose and Origins of This Book
Purpose and Domains
Origins
Conclusions and Recommendations from the ESRC Project
Communication, community, and identity
Citizens, politics, and governance
Understanding the platform economy
Data and digital literacies for engaged and included citizens
Everyday digital health and well-being
Digital inequalities
Beyond the ESRC Project
References in Main Text
References of Books for Issues and Contexts Analysis
Chapter 2: ESRC Review: Methodology
Introduction
Participants
Project Team
Stakeholder Engagement
Initial Outline for the Scoping Areas
Domains and Goals
Use of Theory
Use of Methods
Approaches for the Review
Delphi Process
Stakeholder Engagement: Workshops
Systematic Literature Reviews
Conclusion
References
Section 2: HEALTH, AGE, AND HOME
Chapter 3: ESRC Review: Health and Well-Being
Introduction
Initial Comments
Literature Analysis
Topics
Theory, Method, and Approach
Delphi Review
Future Research and Scoping Questions
Research Challenges
Conclusion
Note
References
Chapter 4: Computer-Mediated Communication and Mental Health: A Computational Scoping Review of an Interdisciplinary Field
Introduction
Computer-Mediated Communication and Mental Health
Defining Key Constructs
The State of the Research Field
The Present Study: Foci, Hypotheses, and Research Questions
Method
Scoping Review Methodology
Sample
Analytical Approach
Results
Core Topics
Changes over Time
Publication Behavior in the Field
Mental Health Concepts
Discussion
Summary and Contribution
Limitations
Future Research Agenda
References
Chapter 5: Digital Inclusion and Women’s Health and Well-Being in Rural Communities
Introduction
What Is Digital Inclusion?
What Is Information Literacy?
Women’s Health and Well-Being in Rural Communities
Rationale for Review
Methods
Description of the Reviewed Literature
Theory and Methods
Terminology
Digital Inclusion
Information Literacy
Rurality
Approaches to Digital Inclusion Initiatives
Differentiation of Digital Inclusion Initiatives: Levels and Approaches
Examples of Digital Inclusion Initiatives Intended for Women
The Use of Mobile Technology in Digital Inclusion
Digital Inclusion Frameworks, Measurements, and Evaluations
Digital Inclusion Training
Digital Inclusion, Information Literacy, Health, and Well-Being
Discussion
Vague and Inconsistent Terminology
Relations between Information Literacy and Digital Inclusion
Differences between Developing and Developed Country Contexts
Complexity of and Theoretical Approaches toward Initiatives
Conclusion
References Not in Review Database
Chapter 6: Digital Technology for Older People: A Review of Recent Research
Introduction
Scope of the Review
Uses of Mainstream Technologies by and for Older People
Topic 1: Older People’s Interaction with Mainstream Digital Technologies
Topic 2: Older People’s Lived Experience of Digital Technologies
Topic 3: Older People’s Use of Digital Technology for Communication and Social Interaction
Topic 4: Using Digital Technologies to Assist Older People: Monitoring Older People’s Welfare
Reflections on the Research on Uses of Digital Technology for Older People
Particular Subtopics within Topics
Two Themes across the Four Topics
Limitations and Future Research
Conclusion
Acknowledgments
References
Chapter 7: A Digital Nexus: Sustainable HCI and Domestic Resource Consumption
Introduction: Digital Systems and Natural Resources
Where Digital Development Meets Environmental Crisis
A Nexus of Relationships
Chapter Overview
The Development of Sustainable HCI
Investigating Physical Resource Use
Investigating Rational Choice and Behavior Change
Investigating Attitudes, Values, and Lifestyles
Investigating Practices and Networks
Revisiting Sustainable HCI in a WEF Context
Conclusion: Resource Sustainability, Resilience, and Security
Notes
References
Section 3: COMMUNICATION AND RELATIONSHIPS
Chapter 8: ESRC Review: Communication and Relationships
Introduction
Initial Comments
Literature Analysis
Topics
Theory, Method, and Approach
Delphi Review
Scoping Questions
Topics
Challenges
Conclusion
Note
References
Chapter 9: Media Mastery by College Students: A Typology and Review
Introduction
The Concept of Media Mastery
Definition
Related Concepts
Development of the Concept
Materials and Coding
Scope of the Literature
The Media Mastery Typology
The Coding Process
Description of the Sample
Review: Co-occurrences of Media Mastery Components with Social and Individual Aspects
Access
Boundaries
Constraints
Managing Content
Obstacles
Use Awareness
Conclusion
Note
References from Introductory Material
Chapter 10: Boundary Management and Communication Technologies
Introduction
Terminology
Work-Home Boundaries
Boundary Theory
Work-Home Conflict
Work-Home Enrichment
How Aspects of Communication Technologies Affect Work-Home Boundaries
Work Place and Space Shift
Using Multi-device Ecologies around Work-Home Spaces
Using Multi-platform Ecologies for Work and Personal Roles
Expectations of Work and Personal Availability
Awareness of Work and Personal Availability
Managing Boundaries in the Digital Age
Top-down Boundary Strategies
Bottom-up Boundary Strategies
Conclusion
References
Section 4: ORGANIZATIONAL CONTEXTS
Chapter 11: ESRC Review: Economy and Organizations
Introduction
Initial Comments
Literature Analysis
Topics
Theory, Method, and Approach
Delphi Review
Scoping Questions
Topics
Conclusion
Note
References
Chapter 12: The Changing Nature of Knowledge and Service Work in the Age of Intelligent Machines
Introduction
What Are Intelligent Machines (Artificial Intelligence and Robotics)?
Artificial Intelligence
Robots
Literature Review Methods
Changing Human Relations with Intelligent Machines
Human-Robot Interaction
Human-Robot Hybrid Teams
Adoption and Acceptance of Intelligent Machines
Trust in AI and Robots
Safety and Risks during Human-Machine Relations
Responsibility and Accountability for Intelligent Machines
Agenda for Future Research
Cross-cutting Requirements
Research Priorities for the Three Themes
Conclusion
Acknowledgments
References
Chapter 13: Workplace “Digital Culture” and the Uptake of Digital Solutions: Personal and Organizational Factors
Introduction
Understanding and Measuring Technology Acceptance Factors
Survey and Analysis Methods
Defining Digital Solutions
Sample and Analyses
The Extent to which UK Organizations and Sectors Are Digitizing
Presence and Number of Digital Roll-Outs, by Organizational Size and Sector
Increase in Digital Solutions Being Used, by Organizational Size and Sector
Reasons for Digital Roll-outs, by Organizational Size and Sector
Summary
Digital Efficacy: Digital Skills at Home and in the Workplace
Confidence and Use at Home and at Work
Summary
Experiences of Digital Technology Roll-Outs
Knowledge Workers and Digital Roll-Outs
Attitudes toward Number and Success of New Digital Solutions Rolled Out
Experiences and Opinion of Roll-Outs by Job Position
Other General Features of UK Workforce Attitudes to Digital Technology
Organizational Challenges and Communication
Challenges, by Organization Size, Sector, and Successful Roll-Outs
Communication and Leadership
Summary
Building a Model of Workplace Digital Culture
Measures of Organizational Culture
A Model of Factors Leading to Perceived Success in Digital Technology Implementation
Conclusions for Organizations from the Model
Conclusion
Overall Summary
Culture and Strategy
Final Conclusion
References
Section 5: COMMUNITIES, IDENTITIES, AND CLASS
Chapter 14: ESRC Review: Communities and Identities
Introduction
Initial Comments
Literature Analysis
Topics
Theory, Method, and Approach
Delphi Review
Future Research and Scoping Questions
Research Challenges
Conclusion
Note
References
Chapter 15: Digital Engagement and Class: Economic, Social, and Cultural Capital in a Digital Age
Introduction
Defining Digital Inequality
Distinctions among Digital Divides
Why Do We Need to Shift Academic Research away from Questions of Access and Skills?
Digital Inequality and Social Distinctions
The Idea of Digital Capital
Digital Inequalities in the Fields of Everyday Life
Class, Capital, and Digital Media Use
Economic Capital
Cultural Capital
Social Capital
Conclusion
References
Section 6: CITIZENSHIP, POLITICS, AND PARTICIPATION
Chapter 16: ESRC Review: Citizenship and Politics
Introduction
Initial Comments
Literature Analysis
Topics
Theory, Method, and Approach
Delphi Review
Future Research and Scoping Questions
Key Topics
Research Challenges
Conclusion
Note
References
Chapter 17: Digital Ecology of Free Speech: Authenticity, Identity, and Self-Censorship
Introduction
Methodology
Findings
Beliefs, Opinions, and “Alternative Facts”
Content Sharing as a Speech Act
Privatization of Censorship
Deliberate Ambiguity, Voluntary Invisibility, and Self-Censorship
Conclusion: Where from Here?
Central Implications
Avenues for Future Research
References
Section 7: DATA, REPRESENTATION, AND SHARING
Chapter 18: ESRC Review: Data and Representation
Introduction
Initial Comments
Literature Analysis
Topics
Theory, Method, and Approach
Delphi Review
Future Research and Scoping Questions
Research Challenges
Conclusion
Note
References
Chapter 19: Digital Citizenship in the Age of Datafication
Introduction
Citizenship
Definitions of Citizenship
Challenges and Alternatives
From Digital Acts to Digital Citizenship
Digital Acts
Digital Citizenship
Empowerment and New Affiliations
Digital Restrictions
Digital Citizenship and Datafication
The Datafication of Life and Governance
Implications of Datafication
Datafication and Citizenship
Digital Citizenship between Empowerment and Control
Conclusion
Note
References
Chapter 20: Digitizing Cultural Complexity: Representing Rich Cultural Data in a Big Data Environment
Introduction
Defining Pre-Data and the Origins of Data
Pre-Data
Native Data
Raw Data
Source Data
Source Data and Data Quality
Summary
Data Definitions: Theory and Practice
An Introduction to the Challenges of Defining Data
Data Processing and Data Interpretation
Data Definitions and Polysemy
Data Cleaning
Data Variants and Data Treatment
Metadata
Conclusion
Acknowledgments
References
Chapter 21: Motivations for Online Knowledge Sharing
Introduction
Framework
Distinctions
Public Goods and Knowledge Sharing Systems
Self-Oriented Motivations
Enjoyment and Entertainment
Altruism
Expertise
Feedback
Reputation
Incentives
Expected Individual Benefits and Costs
Other-Oriented Motivations
Reciprocity
Social Comparison
Social Loafing
Group Belonging and Sociality
Group Identity
Expected Collective Costs and Benefits
Contextual Factors
Self-Efficacy
Trust
Venue
Directions for Future Research
Conclusion
References
Section 8: GOVERNANCE AND ACCOUNTABILITY
Chapter 22: ESRC Review: Governance and Security
Introduction
Initial Comments
Literature Analysis
Topics
Theory, Method, and Approach
Delphi Review
Future Research and Scoping Questions
Research Challenges
Conclusion
Note
References
Chapter 23: Governance and Accountability in Internet of Things (IoT) Networks
Introduction
Principles of Internet of Things Governance
Considering Governance
Levels of Governance
Legitimacy and Representation
Accountability
Transparency
Use of Themes
Case Studies: Regional/National IoT Governance
Top-down IoT Governance in the European Union
Top-down IoT Governance in the United States
Top-down IoT Governance in the United Kingdom
Case Studies: Local IoT Deployment
Chicago, United States
Songdo, South Korea
New York, United States
Conclusion
Notes
References
Section 9: SYNTHESIS
Chapter 24: ESRC Review: Future Research on the Social, Organizational, and Personal Impacts of Automation: Findings from Two Expert Panels
Introduction
Social and Economic Context
Method and Project Context
Definitions
Proposed Research Areas
Identified Research Topics
Social and Cultural Attitudes toward AI and Automation
Technology Development, System Design, and Adoption
Trust in Automated Systems; Oversight and Governance
Complexity and the Scale of the Topic
Evidence and Research Methods
Global Environments
Education, Skills, and Employment/Organizations, Professions, and Work
Inequalities/Community and Social Issues/Social Impacts
Embodiment and Cognitive Demands/System Design for Being (in) Digital
Ethics
Impactful Social Science
Conclusion
Appendix
1 Detail of the ESRC-DSTL Research Clusters and Questions
1.1 Social and Cultural Attitudes to Automation
1.1.1 Question Set 1: Social Benefits and Attitudes
1.1.1.1 what evidence will this generate? what could this be used for?
1.1.1.2 which disciplines need to be involved?
1.1.2 Question Set 2: Technology Implementation Attitudes
1.1.2.1 what evidence will generate this? what could it be used for?
1.1.2.2 which disciplines need to be involved?
1.2 Community and Social Issues
1.2.1 Question Set 1: Macro-Level Issues (Society)
1.2.1.1 what evidence will this generate? what could this be used for?
1.2.1.2 what disciplines need to be involved?
1.2.2 Question Set 2: Meso-Level Issues (Community)
1.2.2.1 what evidence will this generate? what could this be used for?
1.2.3 Question Set 3: Micro-Level Issues (Individuals and Workplaces)
1.2.3.1 what evidence will this generate? what could it be used for?
1.2.3.2 what disciplines need to be involved?
1.3 System Design for Being (in) Digital
1.3.1 Question Set 1
1.3.1.1 what evidence will this generate? what could this be used for?
1.3.1.2 which disciplines need to be involved?
1.4 Organizations, Professions, and Work
1.4.1 Question Set 1
1.4.1.1 what evidence will this generate? what could this be used for?
1.4.1.2 which disciplines need to be involved?
1.5 Trust and Accountability
1.5.1 Question Set 1
1.5.1.1 what evidence will this generate? what could this be used for?
1.5.1.2 which disciplines need to be involved?
1.6 What Is Human?—What Is the Role of Humans in a Future Society?
1.6.1 Question Set 1
1.6.1.1 what evidence will this generate? what could this be used for?
1.6.1.2 which disciplines need to be involved?
1.7 Technological Limitations
1.7.1 Question Set 1
1.7.1.1 what evidence will this provide? what could this be used for?
1.7.1.2 which disciplines need to be involved?
1.8 “Refuse-nicks”
1.8.1 Question Set 1
1.8.1.1 what evidence will this generate?
1.8.1.2 which disciplines need to be involved?
2 Detail of the ESRC-NSF Research Clusters and Questions
2.1 Trust
2.1.1 Citizen Knowledge and Skills
Original questions
2.1.2 Understanding and Addressing Human–Machine Trust
Original questions
2.1.3 Social Impacts and Consequences of Trust in Algorithms
Original questions
2.2 Complexity and the Scale of the Topic
2.2.1 (Inter/Multi) Disciplinary Issues
original questions
2.2.2 Collating Data, Cases, and Methods
original questions
2.2.3 Managing Scale and Rate of Change
original questions
2.2.4 Undertaking Responsive and Timely Research
original issues
2.2.5 Policy or Managing the Social Impacts and Asking Relevant Questions
original questions or comments
2.3 Evidence and Methods
2.3.1 Methods
2.3.1.1 Valuing different disciplinary perspectives
original questions or comments
2.3.2 Supporting Research Policy and Commercial R&D
original questions or comments
2.3.3 Data and Bias
original questions or comments
2.4 Global Environments
2.4.1 Environment
original questions or comments
2.4.2 Culture
original questions or comments
2.4.3 Economics
original questions or comments
2.4.4 Policy
original questions or comments
2.5 Changing Education, Skills, and Employment
2.5.1 Education
original questions or comments
2.5.2 Training and Skills
original questions or comments
2.5.3 Careers and Employment
original questions or comments
2.5.4 Nature of Digital Work
original questions or comments
2.5.5 Workers’ Rights, Rewards, and Trust
original questions or comments
2.5.6 Disruption and Change
original questions or comments
2.6 Inequalities
2.6.1 Growth in Inequalities
original questions or comments
2.6.2 Using Digital Technologies to Address Inequalities
original questions or comments
2.6.3 How to Support All Citizens Post-Automation
original questions or comments
2.7 Embodiment and Cognitive Demands
2.7.1 Cognitive Demands
original questions or comments
2.7.2 Capabilities
original questions or comments
2.8 Ethics and Research Challenges
2.8.1 Ethics and Social Consequences of Automation
original questions or comments
2.8.2 How Do We Research and Develop an Ethics for Automation?
original questions or comments
2.9 Impactful Social Science
original questions or comments
2.10 Technology Development and Adoption
original questions or comments
References
Chapter 25: Conclusion: Cross-Cutting, Unique, and General Themes in the Oxford Handbook of Digital Technology and Society
Introduction
Cross-Cutting Topics and Challenges in the ESRC Review Chapters
Co-occurring Terms and Cross-cutting Topics
Cross-cutting Research Questions
Cross-cutting Challenges
Missing Areas and Gaps
Cross-cutting and Unique Topics and General Themes in the non-ESRC Chapters
Common and Unique Themes
More General Themes Emerging from Relationships among the Chapters
Conclusion
Note
References
Index


The Oxford Handbook of
DIGITAL TECHNOLOGY AND SOCIETY

Edited by
SIMEON J. YATES
and
RONALD E. RICE

Oxford University Press is a department of the University of Oxford. It furthers the University’s objective of excellence in research, scholarship, and education by publishing worldwide. Oxford is a registered trade mark of Oxford University Press in the UK and certain other countries.

Published in the United States of America by Oxford University Press, 198 Madison Avenue, New York, NY 10016, United States of America.

© Oxford University Press 2020

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the prior permission in writing of Oxford University Press, or as expressly permitted by law, by license, or under terms agreed with the appropriate reproduction rights organization. Inquiries concerning reproduction outside the scope of the above should be sent to the Rights Department, Oxford University Press, at the address above. You must not circulate this work in any other form and you must impose this same condition on any acquirer.

Library of Congress Control Number: 2020938283
ISBN 978–0–19–093259–6

1 3 5 7 9 8 6 4 2
Printed by LSC Communications, United States of America

Preface

Introduction

This book is based upon work undertaken as part of the UK Economic and Social Research Council (ESRC) commissioned project “Ways of Being in a Digital Age”—for which Simeon was Principal Investigator and Ronald was a member of the Steering Group. The primary goal of the project was to identify the upcoming research questions and challenges facing the social sciences as they address the impacts that digital media and technologies are having and may have. This included a systematic review of prior work and a “horizon scanning” exercise derived from expert opinion. This book is therefore as full of questions as it is of findings or answers. In particular, it identifies the topics that the social sciences, often in interdisciplinary collaboration, will need to tackle—probably sooner rather than later.

The book is structured around the themes of the project—slightly reworked in the light of the findings. We have called these “domains.” The following list presents their initial descriptions, while the last part of Chapter 1 describes the ESRC project and subsequent conference and workshops in more detail, the final domains, and their main questions.

Initial Domains and Scoping Questions

1. Citizenship and politics
How does digital technology impact our autonomy, agency, and privacy—illustrated by the paradox of emancipation and control? Is our understanding of citizenship evolving in the digital age, and if so how—for example, does technology help or hinder us in participating at individual and community levels?

2. Communities and identities
How do we define and authenticate ourselves in a digital age? What new forms of communities and work emerge as a result of digital technologies—for example, new forms of coordination including large-scale and remote collaboration?

3. Communication and relationships
How are our relationships being shaped and sustained in and between various domains, including family and work?

4. Health and well-being
Does technology make us healthier, better educated, and more productive?

5. Economy and sustainability
How do we construct the digital to be open to all, sustainable, and secure? What impacts might the automation of the future workforce bring?

6. Data and representation
How do we live with and trust the algorithms and data analysis used to shape key features of our lives?

7. Governance and security
What are the challenges of ethics, trust, and consent in the digital age? How do we define responsibility and accountability in the digital age?

Challenges

Interdisciplinary Views of the Digital Society

The project, the book, and research on the social impacts of digital media and technology have faced and will continue to face a number of key challenges. One of the great challenges of working in this field is that of avoiding simplistic “technological determinism”—or what Grint and Woolgar (2013) call “technism”—an inherent or implied reliance on “obvious or intrinsic” features of the technology in explanations of technology development, use, or effects. Technism falls short of “technological determinism”—an approach that Grint and Woolgar argue is very rarely fully taken—but implies the assumption that technologies have intrinsic features that determine outcomes. We hope that we have sought to avoid this as much as we can, and to have captured the reflective and reflexive nature of the interactions among technologies, social systems and structures, and people.

Another major challenge is that of interdisciplinary collaboration. Many questions require multiple disciplinary perspectives—across the social sciences, into health and engineering, but very often in collaboration with computer science and information studies colleagues. How does one understand the uses, implications, and role of the smartphone in any social domain without also understanding the telecommunications infrastructure, hardware, software, security, and design issues underlying the device? As a result, many of the contributions to this book are from very different disciplines (see Chapter 1), and this has enriched the perspectives and critical analyses presented here.

The interdisciplinary perspectives and the way new technologies have been developing have also introduced new ethical challenges in our research objects as well as our practices as researchers. Questions around automation, security, surveillance, and privacy (chapters 10, 12, 13, 16, 17, 18, 19, and 22) have complicated how we think about the relationship between humans and machines, and what the roles of governments, technology companies, and civil society are in their design, use, and regulation. When it comes to conducting research on these subjects or using digital technologies to examine them, researchers also face new ethical challenges about what they can and should access, collect, analyze, and then present or publish. Digital media and technologies, then, have complicated how we do research, how we think about our research objects and subjects, and who is involved in these processes.

Volume of Literature and Digital Tools

Another challenge is the volume of work out there needing to be reviewed and assessed. As we note in chapter 2, it is a feature of our contemporary world that the volume of academic work continues to increase at a far greater rate than can be followed. As Petticrew and Roberts note:

The problem is not just one of inconsistency, but one of information overload. The past 20 years have seen an explosion in the amount of research information available to decision makers and social researchers alike. With new journals launched yearly, and thousands of research papers published, it is impossible for even the most energetic policymaker or researcher to keep up-to-date with the most recent research evidence, unless they are interested in a very narrow field indeed. (Petticrew & Roberts, 2008, p. 7)

In the ESRC project, and in many of the non-ESRC chapters, we have turned to digital tools to help manage this mass of literature—to extract topics and concepts from thousands of articles in a few hours rather than in tens of thousands of hours. This is one example of how digital tools are transforming many aspects of social research. The project therefore provided an opportunity to experiment with several of these tools and methods (see chapter 2). Thus, two characteristics of this Handbook of Digital Technology and Society are the broad range of literature covered in most chapters and the frequent use of computer-based programs and techniques for collecting, analyzing, and displaying the results of that literature.
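As a concrete illustration of the kind of computational pass involved, the sketch below runs a small topic model over a handful of invented article abstracts. It is a minimal example using scikit-learn's LDA implementation, not the project's actual pipeline (the ESRC reviews used tools such as WordStat topic modelling and the concept mapping described in chapter 2); the abstracts, topic count, and term cut-offs here are assumptions for illustration only.

```python
# Minimal, illustrative topic-modelling sketch (not the project's pipeline).
# Assumes scikit-learn is installed; the "abstracts" are invented placeholders.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "Older adults adopt smartphones for social contact and health monitoring.",
    "Social media use and political participation among young citizens.",
    "Algorithmic governance, datafication, and questions of accountability.",
    "Digital inclusion initiatives and information literacy in rural communities.",
    "Trust in automation and human-robot collaboration in the workplace.",
    "Online communities, identity work, and self-presentation on platforms.",
]

# Bag-of-words matrix, dropping very common English words.
vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(abstracts)

# Fit a deliberately small LDA model; a real review would use thousands of
# documents, more topics, and manual inspection and merging of the output.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(doc_term)

# Print the five highest-weighted terms for each topic.
terms = vectorizer.get_feature_names_out()
for topic_idx, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {topic_idx}: {', '.join(top_terms)}")
```

At the scale of the actual reviews, such automated output is only a starting point, feeding into manual concept coding, Delphi review, and expert workshops.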

Constant Change

Another key challenge for work in this field is the constantly changing nature of the artefacts, contexts, and social practices as technologies develop and change, and are adapted and socially constructed. This is often clearly a two-way street, as social change and regulation change systems, and new systems create new opportunities, debates, and challenges. This influences research as scholars seek to address current issues, new technologies, new behaviors, and new implications. The ESRC chapters have reflected on this by contrasting the development of concepts and topics over the period of the sampled literature and through the reflections of the experts involved in the ESRC project.

This constant change generates a specific set of challenges for theory and methods: Do we need new theory and methods? Or do our existing tools still work—if slightly modified? We have found in exploring the literature a highly varied mix of work. In many cases the work is inductive, documenting and evidencing digital media and technology use and impacts but not testing or evaluating theory—with a good number of papers being “theory free.” Having said this, many papers draw on key social theories—with the notable re-use and revision of older theory. Uses and Gratifications is one notable “older” social theory that has been given new life by examinations of digital media use (see chapters 8, 9, 18, and 21). In other cases, new theory has had to be developed and honed to explore specific issues or to address challenges specific to digital technology use. An example here would be “unified user acceptance” theory or models (see chapter 13). Questions concerning theory that we might address include:

• How is the digital socially and technically conceptualized?
• Which theories are predominant in which domains?
• What new theory has been developed, and/or is “old theory” adequate to the task of explaining the social impacts and use of the digital?
• To what extent is digital research theoretically or empirically driven?
• Which concepts and key themes cluster and link regardless of theoretical or empirical approach?
• Can a new “theoretical framework” for understanding the digital be generated, and is this needed?
• To what extent have interdisciplinary approaches modified or developed theory?
• Which methods predominate in which domains of work?
• Does the availability of large volumes of digital data change how the digital is studied and/or the approaches taken to the social in a digital world?
• Are certain methods intrinsically linked to certain domains or theories? How are methods tied to the social contexts around digital research?
• Have interdisciplinary approaches modified or prioritized certain methods in the study of the digital?

We hope that by documenting issues of theory and method the book can help colleagues reflect on issues of theory selection and testing, as well as appropriate methodology. In this way, the book provides a snapshot from a brief period in time, assembling what has been studied in the area of digital media and technologies with an eye to the future of this research.


Chapters in the Book

The chapters in the book either present the outcomes from the respective domains of the ESRC project or were developed from responses to an open call as part of the Ways of Being Conference held in 2017 (both of which are described more fully in chapter 1). The non-ESRC contribution chapters represent reviews, reflections on the state of the art, or the application of reviews to various kinds of evidence in each of the domains. The Table of Contents provides detailed listings of the sections of each chapter as a guide to their content and focus. The online versions of each chapter also provide an abstract. All of the chapters should help provide overviews for, and spark ideas for and debates about, future research directions in the broad and evolving area of digital technology and society. As such, we would hope that current and future scholars can draw upon these as a resource when planning their work, using these chapters as foundations and baselines for literature reviews, as well as identifying central concepts and topics and larger research areas that need more attention and explication. We have also provided a range of supporting materials and visualizations via the project website at https://waysofbeingdigital.com.

The production of the materials presented in the ESRC chapters was a complex process involving contributions of the core research team, the project post-doctoral researchers, and other colleagues. The ESRC chapters follow a fairly similar and standard initial format developed by the core team—especially Sim Yates, Jordana Blejmar, and Elinor Carmi—and revised and finalized by Ron Rice. In listing the authorships of these chapters, we have tried to reflect as accurately as we can the contributions to these chapters from core project team members, either directly or via Delphi or workshop materials. We more generally acknowledge the contributions made by all across the project.

Ron conceptualized the structure and flow of chapters, worked with Oxford University Press to develop a shared approach to the book, worked with the chapter authors through multiple versions of all 25 chapters, and developed and continually updated the surrounding material to ensure consistency of text and reference format, correspondence of terms, cross-referencing, and style. Ron and Sim engaged in multiple iterations of the materials, raising questions, resolving questions, and sharing detailed descriptions of all the things going on in our personal and academic lives that continually got in the way of completing the book. Irony, dark humor, encouragement, promises, arcane analysis details, Byzantine university politics, strikes, Brexit, floods, fires, emergency administration meetings, changes in contributors’ affiliations, reshuffling of chapter order, and debates about the proper use of “concept” or “topic” pervaded these email and Skype conversations.


Potential Audiences

Primary audiences for the Oxford Handbook of Digital Technology and Society are researchers, faculty, and graduate students in one or more of the seven theme areas. The entire book, and certainly specific chapters, provides required reading for anyone interested in the multifaceted nature of relationships between digital technology and society. Secondary audiences are policymakers, research funding agencies, libraries, and upper-level college students working on academic projects. The chapters should provide exceptional resources for those working on projects needing literature background and sources for deeper insights, research results, and theoretical foundations.

Readers will benefit from this book’s disciplinary, multidisciplinary, and interdisciplinary perspectives. Given the seven theme sections—Health, Age, and Home; Communication and Relationships; Organizational Contexts; Communities, Identities, and Class; Citizenship, Politics, and Participation; Data, Representation, and Sharing; Governance and Accountability—as well as a Synthesis section, different portions of the book could be of interest to diverse audiences, including, for example, those interested in sociology, political science, communication, psychology, media policy, management, organizational communication, community studies, environment, economics, public administration, political communication, digital design, socio-technical systems, public health, and media research. No prior training or expertise is required to read or benefit from the chapters.

Conclusion

This book was born out of a need to understand what the future research challenges will be for social research in understanding the relationships between digital technology and society. It is not meant as a definitive guide, but rather as a set of starting points and provocations to fellow scholars (and ourselves) as to the next steps in research, practice, and policy.

Simeon J. Yates, University of Liverpool, United Kingdom
Ronald E. Rice, University of California, Santa Barbara, United States

Acknowledgments

Particular thanks need to go to our colleagues Jordana Blejmar and Elinor Carmi. Jordana was instrumental in organizing the conference from which the non-ESRC chapters came. Both Jordana and Elinor managed many of the practicalities of getting the ESRC chapters together, and they were quick and detailed in providing additional literature reviews and editorial suggestions for all the ESRC chapters. We obviously need to thank all the contributors to the project and to the book, without whose input neither the research nor the contributions to this volume would have been possible.

Sim thanks the ESRC, which funded the project, and the UK Defence Science and Technology Laboratory and the US National Science Foundation, which funded the workshops. He also thanks Ron, Jordana, and Elinor for their patience as he got distracted over the life of the project and book writing by role changes, a secondment to government, and a university promotion. As ever, thanks to his family: Rachel, Ciaran, Ethan, and Niamh, for just being there (and typing up workshop “yellow stickies”—if only for extra pocket money!).

Ron thanks the Arthur N. Rupe Foundation for funding his UC Santa Barbara endowed professorship, which supported travel and other resources involved in this project. He also thanks Claire for her ongoing tolerance of his frequent late-night editing work, and our cats Tinker and Belle for their frequent occupation of his desk and keyboard during those times.

We very much appreciate the enthusiastic initial response to our book proposal by Hallie Stebbins at Oxford University Press, and the ongoing support of Oxford University Press editor Sarah Humphreville. Thanks, too, to the copyeditor Suzanne Copenhagen, the production team at SPi Global, and indexer Robert Swanson.

References

Grint, K., & Woolgar, S. (2013). The machine at work: Technology, work and organization. New York, NY: John Wiley & Sons.

Petticrew, M., & Roberts, H. (2008). Systematic reviews in the social sciences: A practical guide. New York, NY: John Wiley & Sons.

Table of Contents

List of Figures xxi
List of Tables xxiii
About the Contributors xxix

Section 1: OVERVIEW

1. Introduction to the Oxford Handbook of Digital Technology and Society: Terms, Domains, and Themes 3
Ronald E. Rice, Simeon J. Yates, and Jordana Blejmar
Introduction 3
Terms and growth of these developments 6
Main digital technology and society issues and contexts in recent books 9
Purpose and origins of this book 27
References in main text 31
References of books for issues and contexts analysis 31

2. ESRC Review: Methodology 36
Simeon J. Yates, Iona C. Hine, Michael Pidd, Jerome Fuselier, and Paul Watry
Introduction 36
Participants 36
Initial outline for the scoping areas 39
Approaches for the review 40
Conclusion 52
References 53

Section 2: HEALTH, AGE, AND HOME

3. ESRC Review: Health and Well-Being 57
Simeon J. Yates, Leanne Townsend, Monica Whitty, Ronald E. Rice, and Elinor Carmi
Introduction 57
Initial Comments 57
Literature Analysis 58
Future Research and Scoping Questions 72
Research Challenges 75
Conclusion 76
References 77

4. Computer-Mediated Communication and Mental Health: A Computational Scoping Review of an Interdisciplinary Field 79
Adrian Meier, Emese Domahidi, and Elisabeth Günther
Introduction 79
Computer-mediated communication and mental health 80
The present study: Foci, hypotheses, and research questions 83
Method 85
Results 89
Discussion 98
References 101
Appendix: Publications analyzed from the topic modeling dataset (N = 1780) used for topic description 105

5. Digital Inclusion and Women’s Health and Well-Being in Rural Communities 111
Sharon Wagg, Louise Cooke, and Boyka Simeonova
Introduction 111
Methods 114
Description of the reviewed literature 115
Theory and methods 116
Terminology 117
Approaches to digital inclusion initiatives 119
Digital inclusion, information literacy, health, and well-being 125
Discussion 127
Conclusion 129
References not in review database 130
Appendix: Publications analyzed: Journal articles (N = 66) and grey literature (N = 16) 131

6. Digital Technology for Older People: A Review of Recent Research 136
Helen Petrie and Jenny S. Darzentas
Introduction 136
Scope of the review 137
Uses of mainstream technologies by and for older people 139
Reflections on the research on uses of digital technology for older people 150
Conclusion 154
References 154
Appendix: Publications analyzed 160

7. A Digital Nexus: Sustainable HCI and Domestic Resource Consumption 186
Nicola Green, Rob Comber, and Sharron Kuznesof
Introduction: Digital systems and natural resources 186
The development of sustainable HCI 189
Investigating physical resource use 192
Investigating rational choice and behavior change 194
Investigating attitudes, values, and lifestyles 198
Investigating practices and networks 200
Revisiting sustainable HCI in a WEF context 203
Conclusion: Resource sustainability, resilience, and security 206
References 209

Section 3: COMMUNICATION AND RELATIONSHIPS

8. ESRC Review: Communication and Relationships 221
Simeon J. Yates, Rich Ling, Laura Robinson, Catherine Brooks, Adam Joinson, Monica Whitty, and Elinor Carmi
Introduction 221
Initial comments 221
Literature analysis 223
Topics 223
Theory, Method, and Approach 238
Delphi Review 239
Conclusion 245
References 247

9. Media Mastery by College Students: A Typology and Review 250
Ronald E. Rice, Nicole Zamanzadeh, and Ingunn Hagen
Introduction 250
The concept of media mastery 251
Materials and coding 256
Review: Co-occurrences of media mastery components with social and individual aspects 266
Conclusion 278
References from introductory material 279
Cited analyzed references 281
Appendix: Publications analyzed 286

10. Boundary Management and Communication Technologies 299
Marta E. Cecchinato and Anna L. Cox
Introduction 299
Terminology 300
Work-home boundaries 301
How aspects of communication technologies affect work-home boundaries 304
Managing boundaries in the digital age 312
Conclusion 315
References 315

Section 4: ORGANIZATIONAL CONTEXTS

11. ESRC Review: Economy and Organizations 323
Simeon J. Yates, Paul Hepburn, Ronald E. Rice, Bridgette Wessels, and Elinor Carmi
Introduction 323
Initial comments 323
Literature analysis 324
Delphi review 336
Conclusion 341
References 342

12. The Changing Nature of Knowledge and Service Work in the Age of Intelligent Machines 344
Crispin Coombs, Donald Hislop, Stanimira Taneva, and Sarah Barnard
Introduction 344
What are intelligent machines (artificial intelligence and robotics)? 346
Literature review methods 348
Changing human relations with intelligent machines 349
Adoption and acceptance of intelligent machines 352
Ethical issues associated with machine-human collaboration 353
Agenda for future research 356
Conclusion 359
References 359
Appendix: Publications analyzed 362

13. Workplace “Digital Culture” and the Uptake of Digital Solutions: Personal and Organizational Factors 369
Simeon J. Yates and Eleanor Lockley
Introduction 369
Understanding and measuring technology acceptance factors 371
Survey and analysis methods 372
The extent to which UK organizations and sectors are digitizing 374
Digital efficacy: Digital skills at home and in the workplace 380
Experiences of digital technology roll-outs 383
Organizational challenges and communication 388
Building a model of workplace digital culture 394
A model of factors leading to perceived success in digital technology implementation 396
Conclusion 399
References 401

Section 5: COMMUNITIES, IDENTITIES, AND CLASS

14. ESRC Review: Communities and Identities 405
Simeon J. Yates, Jordana Blejmar, Bridgette Wessels, and Claire Taylor
Introduction 405
Initial comments 406
Literature analysis 407
Conclusion 421
References 423

15. Digital Engagement and Class: Economic, Social, and Cultural Capital in a Digital Age 426
Simeon J. Yates and Eleanor Lockley
Introduction 426
Defining digital inequality 428
Why do we need to shift academic research away from questions of access and skills? 430
Class, capital, and digital media use 435
Conclusion 442
References 445


Section 6: CITIZENSHIP, POLITICS, AND PARTICIPATION

16. ESRC Review: Citizenship and Politics 451
Simeon J. Yates, Bridgette Wessels, Paul Hepburn, Alexander Frame, and Vishanth Weerakkody
Introduction 451
Initial comments 452
Literature analysis 453
Delphi review 460
Conclusion 465
References 468

17. Digital Ecology of Free Speech: Authenticity, Identity, and Self-Censorship 471
Yenn Lee and Alison Scott-Baumann
Introduction 471
Methodology 473
Findings 476
Conclusion: Where from here? 485
References 488

Section 7: DATA, REPRESENTATION, AND SHARING

18. ESRC Review: Data and Representation 501
Simeon J. Yates, Liz Robson, Ronald E. Rice, and Elinor Carmi
Introduction 501
Initial comments 501
Literature analysis 502
Conclusion 523
References 524

19. Digital Citizenship in the Age of Datafication 526
Arne Hintz
Introduction 526
Citizenship 527
From digital acts to digital citizenship 530
Digital restrictions 535
Digital citizenship and datafication 536
Conclusion 541
References 542

20. Digitizing Cultural Complexity: Representing Rich Cultural Data in a Big Data Environment 547
Georgina Nugent-Folan and Jennifer Edmond
Introduction 547
Defining pre-data and the origins of data 549
Data definitions: Theory and practice 554
Metadata 565
Conclusion 568
References 570

21. Motivations for Online Knowledge Sharing 573
Kristin Page Hocevar, Audrey N. Abeyta, and Ronald E. Rice
Introduction 573
Framework 574
Self-oriented motivations 576
Other-oriented motivations 582
Contextual factors 588
Directions for future research 590
Conclusion 591
References 593

Section 8: GOVERNANCE AND ACCOUNTABILITY

22. ESRC Review: Governance and Security 605
Simeon J. Yates, Gerwyn Jones, William H. Dutton, and Elinor Carmi
Introduction 605
Initial comments 605
Literature analysis 606
Delphi review 621
Conclusion 626
References 627

23. Governance and Accountability in Internet of Things (IoT) Networks 628
Naomi Jacobs, Peter Edwards, Caitlin D. Cottrill, and Karen Salt
Introduction 628
Principles of Internet of Things governance 630
Case studies: Regional/national IoT governance 636
Case studies: Local IoT deployment 643
Conclusion 649
Notes 651
References 651

Section 9: SYNTHESIS

24. ESRC Review: Future Research on the Social, Organizational, and Personal Impacts of Automation: Findings from Two Expert Panels 659
Simeon J. Yates and Jordana Blejmar
Introduction 659
Social and economic context 661
Method and project context 663
Definitions 665
Proposed research areas 668
Identified research topics 671
Conclusion 680
Appendix 681
References 697

25. Conclusion: Cross-Cutting, Unique, and General Themes in the Oxford Handbook of Digital Technology and Society 699
Ronald E. Rice, Simeon J. Yates, and Jordana Blejmar
Introduction 699
Cross-cutting topics and challenges in the ESRC review chapters 700
Cross-cutting and unique topics and general themes in the non-ESRC chapters 708
Conclusion 716
References 719

Index 721

List of Figures

1.1 Trends over time in mention of four major digital terms in books through 2008, based on Google Ngram Viewer. 11
1.2 Hierarchical clustering of main codes based on co-occurrence (correlation) of main and subcodes within each source text. 25
2.1 Delphi process. 41
2.2 Bubble map of concept pairs. 48
2.3 Tree map of concept pairs. 49
2.4 Interactive topic modelling graph–topic. 50
2.5 Interactive topic modelling graph–keyword. 51
2.6 WordStat topic modelling. 52
3.1 Health and Well-Being 2000–2004: Most frequent concept pairs. 60
3.2 Health and Well-Being 2012–2016: Most frequent concept pairs. 61
4.1 Distribution of nine core topics over time. 94
4.2 Top 20 journals. 94
4.3 Distribution of articles per discipline over time. 95
4.4 Distribution of mental health concepts over time. 96
8.1 Communication 2000–2004: Most frequent concept pairs. 226
8.2 Communication 2012–2016: Most frequent concept pairs. 227
11.1 Economy 2000–2004: Most frequent concept pairs. 327
11.2 Economy 2012–2016: Most frequent concept pairs. 328
13.1 Digital roll-outs (or not) by company size. 375
13.2 Number of digital roll-outs by organization size (area represents proportion of cases). 375
13.3 Roll-outs or not by sector. 376
13.4 Digital solution roll-outs by sector (area represents proportion of cases). 377
13.5 Increase in roll-outs over the last two years by sector. 378
13.6 Reasons for digital roll-outs. 379
13.7 Knowledge worker and number of roll-outs. 384
13.8 Proportion of digital roll-outs UK workforce thought successful. 384
13.9 Level of employment and number of roll-outs experienced. 385
13.10 Positive impacts of new digital tools. 386
13.11 Reasons for a negative attitude. 387
13.12 Organization size and challenges to implementation of digital solutions. 389
13.13 Levels of organizational challenge and successful digital roll-outs. 389
13.14 Communication channels used. 390
13.15 Adequate communication and communication channel. 391
13.16 Communications channels and successful roll-outs. 392
13.17 Leadership and successful roll-outs. 392
13.18 Leadership by sector. 393
13.19 Regression model of perceptions of successful digital roll-outs. 397
14.1 Communities and Identities 2000–2004: Most frequent concept pairs. 409
14.2 Communities and Identities 2012–2016: Most frequent concept pairs. 410
15.1 Mean of frequency of social media use by social class (NS-SEC). 437
15.2 Type of Internet user by social class (NRS). 438
15.3 MCA analysis—overall results. 440
15.4 Mean number of social media platforms used by class. 443
15.5 Social media platforms used by social class (NS-SEC). 444
16.1 Citizenship 2000–2004: Most frequent concept pairs. 453
16.2 Citizenship 2012–2016: Most frequent concept pairs. 454
18.1 Data and representation 2000–2004: Most frequent concept pairs. 506
18.2 Data and representation 2012–2016: Most frequent concept pairs. 507
21.1 Summary of review of motivations for online knowledge sharing. 592
22.1 Governance and security 2000–2004: Most frequent concept pairs. 610
22.2 Governance and security 2012–2016: Most frequent concept pairs. 611
23.1 Governance tools and their application at different levels of IoT activity. (Smith, 2012) 632
24.1 Productivity graph. 662
24.2 Clustering of ideas: ESRC-NSF workshop. 665
24.3 Political, economic, social, technical, legal, and environmental clustering: ESRC-NSF workshop. 666
24.4 Final research topic template: ESRC-DSTL workshop. 667
25.1 All seven domains 2000–2004: Most frequent concept pairs. 701
25.2 All seven domains 2012–2016: Most frequent concept pairs. 702
25.3 Hierarchical clustering of non-ESRC chapters based on co-occurrence of coded themes. 716

List of Tables

1.1 First Appearances of Four Major Digital Terms in Web of Science 7
1.2 First Appearances of Four Major Digital Terms in ScienceDirect 7
1.3 First Appearances of Four Major Digital Terms in Nexis Uni (News) 8
1.4 First Appearances of Four Major Digital Terms in Proquest Periodicals Index Online 9
1.5 First Substantive Entries of Four Major Digital Terms in Books, from Google NGram 10
1.6 Themes, Main Codes and Subcodes Used to Identify Issues and Concerns in Recent Books on Digital Technology and Society 12
2.1 Steering Group 37
2.2 Initial Scoping Questions 39
2.3 Example Concept Mapping by Digital Humanities Institute at the University of Sheffield 47
3.1 Analysis Concepts Ranked 58
3.2 Concept Pairings—Main and Secondary Concepts 59
3.3 Wordstat Analysis of Topics 62
3.4 Comparison between Concepts and WordStat Topics 63
3.5 Epistemological Approach 70
3.6 Empirical Approach 71
3.7 Analytic Approach 71
3.8 Research Method 72
3.9 Study Population 72
3.10 Delphi Review Scoping Questions 73
3.11 Delphi Review Scoping Questions Ranked by Importance 73
3.12 Key Topics Ranked by Percentage of Delphi Survey Responses 74
3.13 Key Topics Ranked by Importance from Delphi Survey 74
3.14 Challenges Ranked by Percent of Cases 75
3.15 Challenges Ranked by Importance from Delphi Survey 76
4.1 Search Terms, Databases, and Concept Operationalization 87
4.2 CTM with 15 Manually Selected Topics Merged into Nine Thematically Overlapping Topic Clusters, Sorted by Aggregated Frequencies (k = 110, N = 1780, Max. 2 topics/Document, Prob ≥ 0.1) 90
4.3 Mental Health Concepts Distributed over Disciplines 97
4.4 Mental Health Concepts Distributed over Topics 97
5.1 Range of Theories and Methods Identified in Review 117
6.1 Mainstream and Specialist Outlets Included in the Review 138
6.2 Terms Related to Older People Used to Select Papers for Inclusion in the Review 139
6.3 The 16 Topics of the Research in the Papers Reviewed 140
7.1 Summary of Approaches, Key Concepts, Methodologies, and Studies in Each Section 190
8.1 Scoping Questions 222
8.2 Analysis Concepts Ranked 224
8.3 Concept Pairings—Main and Secondary Concepts 224
8.4 WordStat Analysis of Topics 225
8.5 Epistemological Approach 238
8.6 Empirical Approach 238
8.7 Research Method 238
8.8 Study Population 239
8.9 Delphi Review Scoping Questions 240
8.10 Scoping Questions Ranked by Number of Cases 240
8.11 Scoping Questions Ranked by Importance 240
8.12 Consultation Workshop Scoping Categories and Example Questions 241
8.13 Key Topics Ranked by Percent of Cases 242
8.14 Key Topics Ranked by Importance from Delphi Survey 243
8.15 Challenges Ranked by Percentage of Cases 243
8.16 Challenges Ranked by Importance from Delphi Survey 245
9.1 Media Mastery Typology Codes and Sublevels 257
9.2 Co-occurrences of Media Mastery Subcodes with Social and Individual Aspects Subcodes 260
11.1 Analysis Concepts Ranked 324
11.2 Concept Pairings—Main and Secondary Concepts 325
11.3 WordStat Analysis of Topics 326
11.4 Epistemological Approach 335
11.5 Empirical Approach 335
11.6 Research Method 336
11.7 Study Population 336
11.8 Delphi Review Scoping Questions 338
11.9 Key Topics Ranked by Percent of Cases 339
11.10 Key Topics Ranked by Importance from Delphi Survey 339
11.11 Challenges Ranked by Percent of Cases 340
11.12 Challenges Ranked by Importance from Delphi Survey 340
13.1 Defining Digital Solutions 373
13.2 Organization Size and Number of Digital Roll-Outs 376
13.3 Confidence at Home 380
13.4 Access to Technology at Home 381
13.5 K-Means Clustering with a Target of Six Clusters 382
13.6 Correlations of Age, Personal Confidence, and Work Confidence 382
13.7 Factor Analysis 394
13.8 Key Predictors of UK Workforce Perceptions of Successful Digital Roll-Outs 397
14.1 Analysis Concepts Ranked 407
14.2 Concept Pairings—Main and Secondary Concepts 408
14.3 Wordstat Analysis of Topics 411
14.4 Comparison between Concepts and WordStat Topics 412
14.5 Epistemological Approach 416
14.6 Empirical Approach 416
14.7 Research Method 417
14.8 Study Population 417
14.9 Delphi Review Scoping Questions 418
14.10 Key Topics Ranked by Percent of Delphi Survey Responses 419
14.11 Key Topics Ranked by Importance from Delphi Survey 419
14.12 Challenges Ranked by Percent of Cases 420
14.13 Challenges Ranked by Importance from Delphi Survey 420
15.1 NRS Social Grades and NS-SEC Classifications 436
15.2 Key Features of Non-users 439
15.3 Key Features of Limited Users 439
15.4 MCA Clustering of Arts Attendance 441
16.1 Analysis Concepts Ranked 455
16.2 Concept Pairings—Main and Secondary Concepts 455
16.3 WordStat Analysis of Topics 456
16.4 Comparison between Concepts and WordStat Topics 457
16.5 Empirical Approach 461
16.6 Research Method 461
16.7 Study Population 461
16.8 Analytic Approach 461
16.9 Delphi Review Scoping Questions 462
16.10 Delphi Review Scoping Questions Ranked by Number of Cases and by Importance 463
16.11 Key Topics Ranked by Percent of Delphi Survey Responses 464
16.12 Key Topics Ranked by Importance from Delphi Survey 465
16.13 Challenges Ranked by Percent of Cases 466
16.14 Challenges Ranked by Importance from Delphi Survey 466
17.1 List of Keywords Used in the Process of Literature Review 474
17.2 Free Speech Challenges Posed by Digital Technologies and Practices 486
18.1 Analysis Concepts Ranked 502
18.2 Concept Pairings—Main and Secondary Concepts 503
18.3 WordStat Analysis of Topics 504
18.4 Comparison between Concepts and WordStat Topics 505
18.5 Epistemological Approach 516
18.6 Empirical Approach 516
18.7 Analytic Approach 517
18.8 Study Population 517
18.9 Delphi Review Scoping Questions 519
18.10 Delphi Review Scoping Questions Ranked by Importance 520
18.11 Key Topics Ranked by Percent of Delphi Survey Responses 521
18.12 Key Topics Ranked by Importance from Delphi Survey 521
18.13 Data-focused Topics and Challenges 522
18.14 Challenges Ranked by Percent of Cases 522
18.15 Challenges Ranked by Importance from Delphi Survey 523
20.1 Analysis of Journal of Big Data, 2014–2017 559
20.2 NASA’s Earth Observing System Data Information System (EOS DIS) 563
22.1 Analysis Concepts Ranked 606
22.2 Concept Pairings—Main and Secondary Concepts 607
22.3 WordStat Analysis of Topics 608
22.4 Comparison between Concepts and WordStat Topics 609
22.5 Empirical Approach 620
22.6 Research Methods 621
22.7 Analytic Approach 621
22.8 Study Population 621
22.9 Delphi Review Scoping Questions 622
22.10 Delphi Review Scoping Questions Ranked by Importance 623
22.11 Key Topics Ranked by Percent of Delphi Survey Responses 623
22.12 Key Topics Ranked by Importance from Delphi Survey 624
22.13 Challenges Ranked by Percent of Cases 625
22.14 Challenges Ranked by Importance from Delphi Survey 625
23.1 Key Emergent Themes and the Case Studies to Which They Particularly Relate 636
23.2 Mapping of Themes in EU Governance 638
23.3 Mapping of Themes in United States Governance 640
23.4 Mapping of Themes in UK Governance 642
24.1 Expertise Represented at the Two Workshops 663
24.2 ESRC-NSF Workshop: Topics by Issues 669
24.3 ESRC-DSTL Workshop: Topic Areas by Level of Impact 670
24.4 ESRC-DSTL Workshop: Social and Cultural Perceptions by Level of Impact 671
24.5 ESRC-DSTL Workshop: Technology Acceptance and Systems Design by Level of Impact 672
24.6 ESRC-DSTL Workshop: Trust and Automation by Level of Impact 673
24.7 ESRC-DSTL Workshop: Work and Organizational Topics by Level of Impact 678
24.8 ESRC-DSTL Workshop: Areas of Inequality by Level of Impact 679
24.9 ESRC-DSTL Workshop: Research Impact Questions by Level of Impact 681
25.1 Main Themes of Concept Pairs, 2000–2004 703
25.2 Main Themes of Concept Pairs, 2012–2016 704
25.3 Cross-cutting Topics in ESRC Themes 705
25.4 Most Frequent Cross-cutting Challenges in ESRC Themes 705
25.5 Non-ESRC Chapters: Number of Themes and Subthemes by Chapters and by Total Instances 709
25.6 Non-ESRC Chapters Including at Least One Instance of Each Theme 715

About the Contributors Editors Simeon J. Yates (PhD, Open University UK, 1993) is Professor of Digital Culture and Associate Pro-Vice-Chancellor Research Environment and Postgraduate Research at University of Liverpool. His research on the social, political, and cultural impacts of digital media includes a long-standing focus on digital media and interpersonal interaction. More recently, he has worked on projects that address issues of digital inclusion and exclusion. He was seconded to the UK Government’s Department of Digital, Culture, Media, and Sport (DCMS) in 2017 to act as research lead for the Digital Culture team. He remains the joint-chair of the DCMS Research Working Group on Digital Skills and Inclusion. His prior work covered topics such as the use of digital technologies in the workplace, digital media use during crises, and ICT use by the security services. The majority of his research has been funded by the Economic and Social Research Council (ESRC), the Arts and Humanities Research Council (AHRC), EU, and industry. Simeon’s work has often been interdisciplinary and has predominantly involved creative and digital industry partners. He led on a major Engineering and Physical Sciences Research Council (EPSRC) funded interdisciplinary program (Engineering for Life) while at Sheffield Hallam. Simeon has been researching the impacts of the internet and digital media on language and culture since 1990. His PhD thesis (1993) is a large-scale linguistic comparison of speech, writing, and online interaction. Subsequent published work has covered analyses of gender differences in computer-mediated communication (CMC), gender and computer gaming, email and letter writing, and science in the mass media. Simeon has written text books on social research methods—in particular, linguistic and discourse analytic methods. https://www.liverpool.ac.uk/communication-and-media/staff/simeon-yates/ Ronald E. Rice (PhD, Stanford University, 1982) is the Arthur N. Rupe Chair in the Social Effects of Mass Communication in the Department of Communication at University of California, Santa Barbara. Dr. Rice has been awarded an Honorary Doctorate from University of Montreal (2010), an International Communication Association (ICA) Fellow, selected President of the ICA (2006–2007), awarded a Fulbright Award to Finland (2006), and appointed as the Wee Kim Wee Professor at the School of Communication and Information and the Visiting University Professor, both at Nanyang Technological University in Singapore (Augusts 2007–2009 and June 2010).

xxx   about the contributors His co-authored or co-edited books include Organizations and unusual routines: A systems analysis of dysfunctional feedback processes (2010); Media ownership: Research and regulation (2008); The Internet and health care: Theory, research and practice (2006); Social consequences of internet use: Access, involvement and interaction (2002); The Internet and health communication (2001); Accessing and browsing information and communication (2001); Public communication campaigns (1981, 1989, 2001, 2012); Research methods and the new media (1988); Managing organizational innovation (1987); And The new media: Communication, research and technology (1984). He has published over 150 refereed journal articles and 70 book chapters. Dr. Rice has conducted research and published widely in communication science, public communication campaigns, computer-mediated communication systems, methodology, organizational and management theory, information systems, information science and bibliometrics, social uses and effects of the Internet, and social networks. http://www.comm.ucsb.edu/ people/ronald-e-rice

Authors Audrey N. Abeyta (MA, UCSB) is a doctoral candidate at the University of California, Santa Barbara and an instructor in the Department of Communication at the University of Missouri. Her research explores the creation and consumption of online information, focusing specifically on individuals’ motivations to share information online and their assessment of that information. Audrey teaches courses in public speaking, group communication, research methods, and statistics. Sarah Barnard is an Assistant Professor in Sociology of Contemporary Work and a member of the Centre for Professional Work in Society in the School of Business and Economics at Loughborough University, United Kingdom. Her research focuses largely on gender, organizations, sociology of higher education, and sociological research in Science, Engineering, and Technology (SET). Her research investigates inequalities in society; explores the social impact of construction and engineering; how digital technology can inform and influence professional working practices; and gender and higher education. She has extensive experience applying quantitative and qualitative social research methods over a range of research and consultancy projects. She has written and published 20 conference papers, 7 journal articles and 11 reports on these subjects. She is a member of the British Sociological Association and the Women in Higher Education Management (WHEM) network. Jordana Blejmar (MPhil, PhD as a Gates Scholar, University of Cambridge) is Lecturer in Visual Media and Cultural Studies in the School of the Arts, University of Liverpool, after previously working on an Arts and Humanities Research Center–funded project on Latin American Digital Art. Before Liverpool, she was Lecturer in Hispanic Studies at the Institute of Modern Languages Research, University of London. Her research is

about the contributors   xxxi situated at the meeting point of Latin American visual cultures, memory studies, and digital humanities. She is the author of Playful Memories: The Autofictional Turn in Post-Dictatorship Argentina (Palgrave Macmillan, 2017). She has co-edited several books and has also published articles and book chapters on contemporary Latin American, especially Argentine, literature, art, photography, theater, digital artworks, and film. Catherine Brooks (PhD, University of California) is the Founder and Director of the Center for Digital Society and Data Studies (CDSDS), Director of Arizona’s iSchool, and an Associate Professor in the School of Information. Catherine’s primary research interests focus on issues of language and culture, with particular concern about data privacy and digital exclusion. She established the CDSDS as an interdisciplinary research center meant to explore today’s grand challenges related to a digital society and data-driven culture. Catherine has spent more than 20 years in higher education, she developed the new Information Science and eSociety degree program for the School of Information at UA, and has published work on a variety of topics to include supporting faculty online and training students for life and work in a digital society. Elinor Carmi (PhD, Media and Communications Department at Goldsmiths, University of London) is a digital rights advocate, feminist, researcher, and journalist who has been working, writing, and teaching on deviant media, internet standards, feminist-technoscience, sound studies, internet history, and internet governance. Currently, she is a postdoctoral research associate in digital culture and society at Liverpool University (UK), where she works on several ESRC and AHRC projects around digital ways of being, digital inclusion, and digital literacies. In addition to writing her book about spam, she is also working on two special journal issues: One about “sonic publics,” together with Ram Sinnreich for the International Journal of Communication, and the other about (re)designing time, together with Britt Paris, for Theory, Culture & Society. Marta  E.  Cecchinato is an Senior Lecturer at Northumbria University, working in human-computer interaction (HCI). Prior to this, she has worked at the UCL Interaction Centre and at Microsoft Research in Cambridge (UK). She has a BS and MS in Psychology from University of Padua (Italy) and has a PhD in HCI from the UCL Interaction Centre. Her current research focuses on understanding complexities of dealing with digital technologies in everyday life especially for work-life balance, and has been investigating strategies that support people in feeling in control of their digital lives. Her work has been consistently published in top tier HCI conferences and has been featured in the New Scientist, The Conversation, and The Psychologist. Rob Comber is a human-computer interaction researcher working at the Swedish Institute for Computing Science at RI.SE, where he is an ERCIM Fellow. His research explores the ethics, methods, and tools to promote citizen participation in social and civic issues. His current research examines topics such as activism, citizen science, community education, and food and technology, all through a lens of designing for community.

xxxii   about the contributors Louise Cooke is Professor of Information and Knowledge Management in the School of Business and Economics at Loughborough University. Her main research interests focus on the ethical aspects of information, data and knowledge use, and the societal value of access to information. In particular, her work has focused on challenges to freedom of expression in the online environment. She led the Arts and Humanities Research Center–funded MAIPLE (Managing Access to the Internet in Public Libraries) and JISC-funded staff access to Information and Communication Technology in UK Further Education and Higher Education projects. Her PhD thesis investigated the impact on freedom of expression of measures taken to regulate internet access and content. She has published widely in the field of information science. Crispin Coombs is a Reader in Information Systems (Associate Professor) and Head of the Information Management group in the School of Business and Economics at Loughborough University, UK. He is an expert in the organizational impacts of new technologies, their successful implementation, and people’s attitudes and behaviors towards IT. Particular interests include the robotization of knowledge and service work, the behavioral impacts of new technologies, and benefits realization management from information systems. He has led several externally funded research projects from Engineering and Physical Sciences Research Council (EPSRC), Chartered Institute of Personnel and Development (CIPD), British Academy, National Institute for Health Research (NIHR), Economic and Social Research Council (ESRC), and Department of Health. He has published over 80 outputs and is a senior editor for Information Technology and People and associate editor for the European Journal of Information Systems. He was appointed to the Board of the UK Academy of Information Systems in 2015 and is a Visiting Professor at the University of Sao Paulo, Brazil. Caitlin  D.  Cottrill is a Senior Lecturer in the Department of Geography and Environment at the University of Aberdeen. Her primary research interests span the interrelated topics of transport, individual behavior, technology, and data, linked by an underlying commitment to encouraging sustainable and efficient mobility. Her work has a strong focus on facilitating data sharing between transport service providers and travelers in a privacy-preserving manner, in order to encourage better decision making. She has, additionally, worked to ensure that this research takes place in a multidisciplinary context, with collaborators from the areas of computing science, engineering, statistics, and information sciences. Anna L. Cox is Professor of Human-Computer Interaction at the University College London Interaction Centre. Her research focuses on productivity at work, work-life balance, and well-being. She has published nearly 200 papers, many of which in toptier HCI conferences and journals. She co-edited the first textbook on Research methods for human-computer interaction. Her work has been featured, among others, in The Conversation, The Psychologist, Men’s Health, BPS Occupation Digest, and most recently in the Guardian.

about the contributors   xxxiii Jenny S. Darzentas was the Marie Curie Advanced Researcher Fellow in the Department of Computer Science at the University of York 2016–2018 during the writing of the chapter. She is currently Assistant Professor at the Department of Product and Systems Design Engineering, University of the Aegean, Greece. Her research interests are in accessibility, service design and systems thinking, and information design. She has worked on collaborative research projects funded by the European Union on HCI, intelligent tutoring, decision support, library and information systems, and universal design. She also has an interest in accessibility issues in international (ISO) and European (CEN/CENELEC) standardization efforts through her voluntary work with ANEC (www.anec.gr). She has published widely on all these subjects. Emese Domahidi is an Assistant Professor for Computational Communication Science at the Technische Universität Ilmenau in Germany. Her research focuses on the psychosocial consequences of online media use and on (biased) information processing in digital media. Emese is especially interested in computational communication science methods and their use to gain insights into her research questions. She is an expert in computational systematic reviews and meta-analysis. William H. Dutton is an Emeritus Professor at the University of Southern California, Senior Fellow of the Oxford Internet Institute, and Oxford Martin Fellow with the Global Cyber Security Capacity Centre, Department of Computer Science at the University of Oxford, and Visiting Professor in the School of Media and Communication at the University of Leeds. He was the Quello Professor of Media and Information Policy in the Department of Media and Information, College of Communication Arts and Sciences, Michigan State University, where he was also Director of the Quello Center. Jennifer Edmond is Associate Professor of Digital Humanities at Trinity College Dublin and the co-director of the Trinity Center for Digital Humanities. She holds a PhD in Germanic Languages and Literatures from Yale University, and applies her training as a scholar of language, narrative, and culture to the study and promotion of advanced methods in and infrastructures for the arts and humanities. Jennifer is President of the Board of Directors of the pan-European research infrastructure for the arts and humanities, DARIAH, and was the Principal Investigator for the European Commission-funded KPLEX Project. Peter Edwards is Professor of Computing Science at the University of Aberdeen. Between 2009 and 2015 he was Director of the RCUK Digital Economy Hub dot. rural—a large interdisciplinary research effort which explored how digital technologies could transform rural life; from 2006 to 2012 he was Director of the ESRC Digital Social Research Node, PolicyGrid—exploring the role of computational models of provenance in documenting social policy formulation. He has over 25 years of experience of research into distributed information systems and their applications, working in domains as diverse as transport, health care, environmental modelling, and food safety.

xxxiv   about the contributors Alexander Frame is an Associate Professor in Communication Science at the Languages and Communication Faculty of the University of Burgundy (Dijon, France), where he runs the MA course in Intercultural Management. Born in Britain, he graduated from the University of Oxford in 1998, before settling in France and completing his PhD in Communication Science at the University of Burgundy, in 2008. He is a member of the TIL (“ Text, Image, Language”) research group (EA 4182), where he specializes in intercultural communication, political communication on Twitter, organizational communication, and comparative cross-cultural communication studies. Recent publications include Citizen participation and political communication in a digital world (Routledge, 2015). Jerome Fuselier has been an Associate Researcher at the University of Liverpool since 2008. Before that he was a Postdoc at Xerox Research Centre Europe. He was awarded his PhD in 2006 at the Université Savoie Mont Blanc. Nicola Green is a Research Associate with the OpenLab, Newcastle University. She is a sociologist by trade and a qualitative interdisciplinary researcher by inclination. Her background has run the gamut of social sciences, HCI, science and technology studies, media and cultural studies, and surveillance studies; all intersecting via projects on digital media technologies and/or sustainabilities of various sorts. Her projects have included work on virtual reality technologies; mobile devices and everyday mobilities; the rise and spread of mobile data and “big data”; digital trust, risk, and privacy; and lifestyles, consumption, and environment. Issues explored across these projects have included embodiment and identity, organization and discourse, popular media and culture, as well as the development of qualitative research methodologies and their use in both HCI research and within social sciences more generally—particularly in respect of ethnographic, mixed, feminist, and participatory methodologies. Elisabeth Günther is a PhD candidate at the Department of Communication, University of Münster, Germany. Her research interests are in computational methods, especially topic modeling, and online journalism. Elisabeth works as a data scientist at Axel Springer Digital. Ingunn Hagen (PhD) is a Professor in Psychology at the Department of Psychology, Norwegian University of Science and Technology (NTNU), Trondheim, Norway. Her main research interests include topics related to media and communication psychology, such the role of media and ICT in children and young people’s lives. She has been involved in research projects on Internet-related risks (EU Kids Online). Her research also includes such fields as audience reception studies, political communication, consumption of popular culture, children and consumption, and yoga and well-being. See https://www.ntnui.edu/employees/ingunn.hagen Paul Hepburn is a Research Associate at Heseltine Institute for Public Policy and Practice, University of Liverpool. His research interests lie in exploring the potential of the new digital media to enhance local democracy and local governance. He is also

about the contributors   xxxv interested in methods and tools for analyzing and explaining the structure of online networks. Prior to pursuing an academic career, Paul worked in local government conducting research, developing policy, and, lately, implementing an e-government program. Iona C. Hine is a postdoctoral researcher at the Urban Institute at the University of Sheffield. Together with Digital Humanities developers and colleagues in the School of English, she has modelled discursive concepts in text collections ranging from the earliest English print to comments on YouTube videos. She has a particular interest in context and translation, as well as the challenges of unruly metadata. Her work spans several disciplines, including biblical studies, early modern literature, and translation studies. Arne Hintz is Senior Lecturer at the School of Journalism, Media and Culture at Cardiff University, where he leads the MA Digital Media and Society, and co-directs the Data Justice Lab. His research addresses questions of digital democracy, datafication, and communication policy. He has led several collaborative research projects, including Digital Citizenship and Surveillance Society: State-Media-Citizen Relations after the Snowden Leaks and Towards Democratic Auditing: Civic Participation in the Scoring Society. His publications include, among others, Beyond WikiLeaks (Palgrave, 2013) and Digital Citizenship in a Datafied Society (Polity, 2018). Donald Hislop is Professor in the Business School at the University of Aberdeen. Prior to this he worked at Loughborough University and Sheffield University. His research interests are in two main areas: knowledge management and mobile working. He has published on knowledge management in a range of journals, including Management Learning, Journal of Information Technology, Technology Analysis & Strategic Management, and the Journal of Knowledge Management. He is also the author of a popular and well-regarded textbook called Knowledge management in organizations: A critical introduction (now in its fourth edition, published in 2018). He is on the editorial board of the journal New Technology, Work and Employment. Kristin Page Hocevar (PhD, UC Santa Barbara) is an Assistant Professor at Southern Oregon University. She has worked in television, documentary film, and web production for multiple Public Broadcasting Service stations and affiliated organizations. Her current research focuses on online health information sharing, selection, and evaluation, and the social and health implications of the interactions, communities, and pooled information facilitated by the Internet. Naomi Jacobs is a Research Fellow currently based at the University of Aberdeen, whose interdisciplinary work focuses on social impacts of technology for interaction in digital and physical spaces. Her research to date has included examining the nature and impacts of the digital public space, developing new tools for interdisciplinary collaboration and knowledge exchange, and using design ethnography and speculative design to investigate factors affecting trust by citizens and communities with regard to the Internet of Things.

Adam Joinson is Professor of Information Systems at the University of Bath. He conducts interdisciplinary research on the interaction between human behavior and technology, with specific foci on how the design of systems influences behavior, ranging from privacy and self-disclosure to cyber-security, social relations, and patterns of influence. He is a program lead for the national Centre for Research and Evidence on Security Threats, as well as currently running funded projects on individual susceptibility to malevolent influence techniques (e.g., scams, phishing), communication accommodation, and behavioral change and technology. Adam's work has been funded by the ESRC, EPSRC, EU, British Academy, DSTL, and UK Government. He also has an interest in "big data" generally and the use of computational social science to gain insights into social and workplace behaviors.

Gerwyn Jones is a Senior Research Fellow working at Liverpool John Moores University's Screen School. He is currently program leader for the MA in Cities, Culture, and Creativity. Gerwyn has over 15 years' academic and consultancy experience relating to urban policy, governance, and regeneration. In recent years, Gerwyn has undertaken ESRC-funded research and published articles on the impact of austerity on the cities of Liverpool and Bristol.

Sharron Kuznesof is a Senior Lecturer and applied qualitative social scientist working in an interdisciplinary environment in the School of Natural and Environmental Sciences, Newcastle University. Her research focuses on conceptual exploration of the behaviors and practices of food consumers and innovative research methods to support that endeavor. Related research includes Food Standards Agency–funded research with HCI staff at Newcastle University's OpenLab to examine domestically situated food safety practices.

Yenn Lee (PhD, University of London) is a widely published researcher in the sociology of digital technologies, participation, and social change, with a special interest in the Asia–Pacific region. She has also long collaborated with various activist and non-profit organizations outside academia, including Freedom House for its annual report Freedom on the Net since its first edition in 2011. In her current position as Senior Lecturer in Research Methodology at SOAS University of London, she teaches PhD students interdisciplinary and technology-enhanced research methods.

Rich Ling (PhD, University of Colorado) has focused his work on the social consequences of mobile communication. He was a professor at the IT University of Copenhagen, where he has served in department management, and he works at Telenor near Oslo, Norway. Ling has been the Pohs visiting professor of communication studies (2005) at the University of Michigan in Ann Arbor, where he has an adjunct position. He is the author of the book Taken for grantedness (2012, MIT Press), which was recently the subject of a complimentary review in the journal Science. He has also written New tech, new ties (2008, MIT), The mobile connection (Morgan Kaufmann) and, along with Jonathan Donner, he has written the book Mobile phones and mobile communication (2009, Polity). Ling is a founding co-editor of the Sage journal Mobile Media and

about the contributors   xxxvii Communication. He is the co-editor of the Oxford University Press series Studies in Mobile Communication with Gerard Goggin and Leopoldina Fortunati. Along with Scott Campbell he is the founding editor of The Mobile Communication Research Series and he is an associate editor for The Information Society, The Journal of Computer Mediated Communication, and Information Technology and International Development. Eleanor Lockley is Research Fellow and Associate Lecturer at Sheffield Hallam University. Her research falls broadly under communication studies and information studies, and working in C3RI means that she has worked on a variety of different interdisciplinary projects since 2008. One day she can be a human-computer interaction researcher—the next she can be investigating issues associated with user centered design! Her previous role in C3RI involved engaging with knowledge transfer activity— meaning that she has worked on commercial consultancy, as well as on academic projects. She has recently worked on several European-funded projects; two of note are COURAGE (2014–2016) and ATHENA (2013–2016). The former involved developing a research agenda for cybercrime and cyberterrorism based upon user-centered research. Her role in the latter focused upon human factors and best practices for crisis sensemaking and communication and, in particular, how social media can be best used for crisis and disaster management. ATHENA is creating a prototype to enhance the ability of Local Education Agencies of police, first responders, and citizens in their use of mobile and smart devices in crisis situations. Adrian Meier is a PhD candidate at the Department of Communication, Johannes Gutenberg-University Mainz, Germany. His research revolves around the question of whether and how communication technologies can improve or impair mental health and well-being. Specifically, he investigates the relationship between technology usage and mental health through the lens of self-regulation processes, using intensive longitudinal surveys (e.g., diaries, experience sampling), and systematic review methodology. Georgina Nugent-Folan is Assistant Professor of Modern English Literature, Department of English and American Studies, Ludwig Maximillians University of Munich, Germany. She completed her PhD on the works of Samuel Beckett and Gertrude Stein at Trinity College Dublin in 2016. Georgina is currently preparing a digital genetic edition of the Compagnie/Company module as part of the Beckett Digital Manuscript Project (forthcoming, 2020). Articles on Beckett, Stein, and/or James Joyce have been published in the Journal of Beckett Studies, The Southern Review, Samuel Beckett Today/Aujourd’hui, and the James Joyce Quarterly. Her article, “Samuel Beckett: Going On in Style,” received a Special Mention in the 2017 Pushcart Prize. Helen Petrie is Professor of Human Computer Interaction in the Department of Computer Science at the University of York in the UK. Her research centers on the use of new technologies for people with disabilities and older people, particularly the web. She has been involved in many British and international projects and has published extensively. She has advised numerous private and public sector organizations on web

accessibility and accessibility issues of other new technologies. She directed the largest study in the world on web accessibility for the Disability Rights Commission of Great Britain and a similar study for the UK Museums, Libraries, and Archive Council, and she has conducted many smaller studies of web accessibility. In 2009 she was awarded an Association for Computing Machinery (ACM) Award for the social impact of her research, and in 2017 she was honored with a Lifetime Achievement Award from the Royal National Institute for Blind People.

Michael Pidd is Digital Director of HRI Digital at the Humanities Research Institute, University of Sheffield, one of the United Kingdom's leading Digital Humanities centers. Michael has over 20 years of experience in developing, managing, and delivering large collaborative research projects in the humanities and heritage subject domains.

Laura Robinson is Associate Professor in the Department of Sociology at Santa Clara University. She earned her PhD from UCLA, where she held a Mellon Fellowship in Latin American Studies and received a Bourse d'Accueil at the École Normale Supérieure. In addition to holding a postdoctoral fellowship on a John D. and Catherine T. MacArthur Foundation–funded project at the USC Annenberg Center, Robinson has served as Visiting Assistant Professor at Cornell University and Affiliated Faculty at the UC Berkeley Institute for the Study of Societal Issues. She is a series co-editor for Emerald Studies in Media and Communications and previously served as the Chair of CITAMS (formerly CITASA). Her research has earned awards from CITASA, AOIR, and NCA IICD. Robinson's current multi-year study examines digital and informational inequalities. Her other publications explore interaction and identity work, as well as new media in Brazil, France, and the United States.

Liz Robson is a Research Associate at the University of Newcastle. She has a background in economic development with expertise in understanding labor markets, employment, and skills. Liz Robson joined the Center for Urban and Regional Development Studies (CURDS) in September 2000 as a research associate, leaving in 2004 to work for the Regional Development Agency as a skills and employment analyst. She returned in 2011 as a Visiting Fellow supporting the work of Ranald Richardson and the SIDE (Social Inclusion through the Digital Economy) project to better understand how young people might access the life-changing benefits offered by digital technologies. Her recent research at CURDS has focused on the digital age, which throws up all kinds of questions regarding how technology, social media, and the so-called fourth industrial revolution will impact on institutional and organizational arrangements. In June 2017, she joined the department of sociology to work on a prestigious AHRC (Arts and Humanities Research Council) project, which is investigating the different ways audiences engage with specialized film outside of London. Research questions encompass the range of specialized film venues and events within regional provision, as well as how digital platforms feature in the venue and event-based film experience.

Karen Salt is Director of the Centre for Research in Race and Rights (C3R) and Assistant Professor at the University of Nottingham. She is an interdisciplinary scholar with

about the contributors   xxxix strong interests in transnational American studies and Afrodiasporic studies. A significant portion of her work investigates how black nation-states have fought for their continued existence within a highly racialized world. As this work has developed, Dr. Salt has considered the relationship of sovereignty and race to environmental consumption and protection, enabling her to craft new research on racial ecologies. In addition to this work, she currently leads or co-leads projects on reparative trust, collective activism, racial equity, and transformative justice politics. Alison Scott-Baumann is Professor of Society and Belief in the Department of Religions and Philosophies at SOAS University of London. She is a scholar with an international reputation in Islam in Britain, and her recent book Islamic Education in Britain, with Cheruvallil-Contractor (2015), is highly regarded in British Muslim communities. She recently completed her leadership of Re/presenting Islam on Campus (2015–2018), a major project funded by the Arts and Humanities Research Council of the United Kingdom. In 2017 she gave evidence to the Joint Committee on Human Rights in their investigation of freedom of speech in universities. Boyka Simeonova is Lecturer in Information Management at Loughborough University, United Kingdom. Boyka is Director of the Knowledge and the Digital Economy Network and Deputy Director of the Centre for Information Management at Loughborough University. Boyka is a Fellow of the Higher Education Academy. Boyka is the recipient of the Dean’s Early Career Researcher Award at Loughborough University and has published in Information Systems and Management. Stanimira Taneva is currently Senior Researcher and REF Impact Officer, School of Sociology and Social Policy, University of Nottingham, United Kingdom. During the work on her chapter, she was a Senior Research and Enterprise Associate and a member of the Centre for Professional Work and Society at the School of Business and Economics at Loughborough University. Her background is in developmental and work/ organizational psychologies, as well as psychometrics. Stanimira’s work experience is a combination of academia and practice—she has been in academic, research, management, and expert roles in the public, private, and third-sector. Stanimira has conducted a variety of academic and applied research programs in areas such as developing and managing careers, (age-) diversity, well-being, and performance in organizations. In 2013 she was awarded a Marie Curie Fellowship from the European Commission for her cross-cultural research on successful aging at work. Stanimira’s most recent research interests include cross-disciplinary research impact and the exploration of the impacts of new technology (e.g., AI) on work. She is a fellow at the UK Research and Innovation Future Leaders Fellowships program Peer Review College. Claire Taylor is Gilmour Chair of Spanish and Professor of Hispanic Studies at the University of Liverpool. She is a specialist in Latin American literature and culture and has published widely on a range of writers, artists, and genres from across the region. Her particular geographical areas of interest are Colombia, Argentina, and Chile, although she also worked on literature, art, and culture from other regions. Within

xl   about the contributors Latin American Cultural Studies, she takes a particular interest in the varied literary and cultural genres being developed online by Latin(o) Americans, especially hypertext novels, e-poetry, and net art. She has published numerous articles and book chapters on these topics, and she is the co-author of the recent volume Latin American identity in online cultural production (New York: Routledge, 2012) and author of the recent monograph Place and politics in Latin America digital culture: Location and Latin American net art (New York: Routledge, 2014). She is currently working on an AHRCfunded project focusing on memory, victims, and representation of the Colombian conflict. Leanne Townsend is a Senior Social Scientist working within the Social, Economic, and Geographical Sciences Group at the James Hutton Institute, Aberdeen, Scotland. Leanne leads research on a number of projects exploring digitization and innovation in various rural contexts, including agriculture, rural entrepreneurship, and rural community development. Sharon Wagg is a doctoral researcher in the Centre of Information Management, part of the School of Business and Economics at Loughborough University, United Kingdom. She is the recipient of the Mark Hepworth PhD scholarship, and her research interests include digital inclusion and social change, information literacy, and lifelong learning. Sharon worked as part of the research team at the digital inclusion charity Good Things Foundation, and has a master’s degree in Librarianship (Distinction) from the University of Sheffield. Her PhD dissertation investigated digital inclusion initiatives in the context of rural communities in the United Kingdom. Paul Watry is Principal Investigator for the Multivalent Digital Preservation Architecture project and the Cheshire digital library system. His primary area of interest is in computational linguistics and in bibliographic analysis. A core activity is to develop and implement a strategy which will embrace both electronic and traditional information resources and address the needs of both research and learning. Vishanth Weerakkody joined the School of Management at University of Bradford in March 2017 as Professor in Management Information Systems and Governance. He was previously a Professor of Digital Governance at the Business School in Brunel University, London, where he held several leadership roles. Prior to his academic career, Prof. Weerakkody worked in a number of multinational organizations, including IBM UK. He has a successful track record of research and enterprise and has secured numerous research grants from funding bodies such as the European Commission (FP7 & H2020), Economic and Social Research Council, Qatar Foundation, and UK Local Government. His R&D expertise spans several disciplines, including management decision making, ICT evaluation, public administration, social innovation, and process transformation. He is the editor-in-chief of the International Journal of Electronic Government Research and a handling editor for Information Systems Frontiers. He is a chartered IT professional and fellow of the UK Higher Education Academy.

about the contributors   xli Bridgette Wessels is Professor of Social Inequality, Department of Sociology, at the School of Social and Political Sciences, University of Glasgow. Her research focuses on the innovation, development, and use of digital technology and services in social and cultural life. Recent books include Open data and knowledge society (2017, Amsterdam University Press) and Communicative civic-ness: Social media and political culture (2018, Routledge). She is a co-investigator on the ESRC project Ways of Being in the Digital Age, and she is Principal Investigator on the AHRC funded project “Beyond the Multiplex: Audiences for Specialized Film in English Regions,” which is using digital humanities methods. Other examples of funded work include research on telehealth, social media, digital social research methodologies, women, work and technology (NordWit project), journalism in the digital age (REGPRESS project), and mobile networks (COST network: Social Networks and Travel Behaviour). Monica Whitty is Professor of Human Factors in Cyber Security at the University of Melbourne, Australia and the University of Warwick, WMG, United Kingdom. She is also on the Global Futures committee for cybersecurity for the World Economic Forum. Her research over the last 20 years has focused on the ways individuals behave in cyberspace. Her work, in particular, examines identities created in cyberspace, cyberscams, online security risks, behavior in cyberspace, insider threat, as well as detecting and preventing cybercrimes. Monica is the author of over 100 articles, and five books, the latest being Cyberpsychology: The study of individuals, society and digital technologies (Wiley, 2017, with Garry Young). She is currently leading an interdisciplinary project funded by TIPS (ESPRC) titled, Detecting and Preventing Mass-Marketing Fraud. Nicole Zamanzadeh received her PhD from the University of California, Santa Barbara. Her research interests include new media, stress, and family resilience. Her current work investigates questions about media use habits such as media multitasking as a potential source of stress or resilience for individuals and the family system.

Section 1

OVERVIEW

Chapter 1

Introduction to the Oxford Handbook of Digital Technology and Society: Terms, Domains, and Themes

Ronald E. Rice, Simeon J. Yates, and Jordana Blejmar

Introduction

Many developed countries have become information or knowledge societies, whereby cognitive activities, symbolic and data analysis, and information resources are replacing agriculture and manufacturing as the primary sectors of their economies. This idea of the information society or economy has been identified, discussed, and analyzed since the 1960s. For example, Machlup's (1962) analysis of the US economy identified an information sector, primarily devoted to information activities necessary to produce physical goods and services. Porat (1971) reanalyzed Machlup's data to define the key components of the growing information society. Bell (1973) explained the post-industrial economy, whereby knowledge becomes the primary resource, allowing freedom from constraints of labor, land, and machines. But we can argue that the concept of an information society does not, in itself, require computers or digitization (Beniger, 1989). The extensive collection and analysis of information, especially about transactions and inventory, but also about local and regional administration, has existed from early civilizations (Egypt, Mesoamerica, Mesopotamia). Nevertheless, it can be argued that the information society as we see it now has roots in the growth of the British Empire and systematic organizational management (Yates, 1993); the need to control and market industrial revolution technologies and products

(Beniger, 1989); and even the development of dictionaries, maps, and classification schemes during the "age of enlightenment" (Headrick, 2002). The core argument is that the basis of wealth is shifting to the collecting, management, analysis, and application of data and information (Daley, 2015; Nonaka & Takeuchi, 1995). This shift is also a manifestation of the rise of information capitalism and the exploitation of knowledge labor (Castells, 2000; Curtin & Sanson, 2016; Fuchs, 2014). Thus, information is a crucial aspect of modern economies, as well as everyday life, in most countries, although with considerable disparities across countries and even within regions, cities, and towns.

But the digital society involves additional dimensions. While mainframe digital computers had played major roles in WWII and after in telephone switching, office automation, and manufacturing in the 1960s and 1970s, the advent of end-user computing, widescale social uses of computing, and networked communication such as email and the Internet required the interaction of several factors. Although the literature on the history of computing, transmission networks, the Internet, and programming is vast, we need to note only four basic components that underpin the transformative nature of the digital world.

The first is digitization, or more specifically, the encoding of information into bits (binary digits). Negroponte (1995) was an early (but not the first) popularizer of the understanding that "being digital" was the foundation for widespread, pervasive, and unique changes in our social, economic, and political world. He articulated the main difference between the analog and digital world: the first, traditional, world was based on atoms (physical material), while the second, digital, world was based on bits, standing for "binary digits": symbolic or electronic signals indicating presence or absence, "on" or "off," or more colloquially, "0s" and "1s." By converting information about a process or artefact from analogue to digital representation (Zuboff, 1985), one could "free" that information from its material "packaging." For example, in the analog world, a book means the paper-based bound set of pages on which words and images are printed. However, in the digital world, the content becomes independent of any particular physical form. So digital information is freed from the analog, material world. Though he had earlier raised this point at a 1984 Hacker's Conference (http://www.rogerclarke.com/II/IWtbF.html), Brand (1987) claimed that "Information wants to be free," in a slightly different way. First, its cost approaches zero because it is freed from material resources and so is easily storable, copyable, and distributable—although today we can recognize the infrastructural and environmental costs of moving data around. He also noted at the same time that information wants to be expensive because it may require exceptional resources to create initially, and can be extremely valuable if held privately or used in combination with other information. Importantly, he noted that there seemed to be a tension between these two tendencies, which may rise or fall in different contexts. He also noted that information seeks to be politically free, as access to high-quality information and free speech fosters political freedom—though we might today note that propaganda and misinformation are just as easily distributed.
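To make the encoding step concrete, here is a minimal Python sketch (purely illustrative; the sample sentence, tone frequency, sampling rate, and bit depth are invented for the example) that turns both a line of text and a few samples of an "analog" waveform into the same currency of 0s and 1s.

```python
# Illustrative sketch only: representing very different kinds of "content" as bits.
# The sentence, tone frequency, sampling rate, and bit depth are invented examples.
import math


def text_to_bits(text: str) -> str:
    """Encode a string as a stream of 0s and 1s via its UTF-8 bytes."""
    return "".join(f"{byte:08b}" for byte in text.encode("utf-8"))


def sample_to_bits(value: float, bit_depth: int = 8) -> str:
    """Quantize an 'analog' sample in [-1, 1] to an integer and emit its bits."""
    levels = 2 ** bit_depth - 1
    quantized = round((value + 1) / 2 * levels)  # the analog-to-digital conversion step
    return f"{quantized:0{bit_depth}b}"


if __name__ == "__main__":
    sentence = "In the digital world, content is independent of its physical form."
    print(text_to_bits(sentence)[:64], "...")  # a printed sentence, now just bits

    # Eight samples of a 440 Hz tone at an 8 kHz sampling rate: sound, also now just bits.
    samples = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8)]
    print(" ".join(sample_to_bits(s) for s in samples))
```

Once both forms are bits, the same storage, copying, and transmission machinery applies to each, which is the sense in which content is "freed" from its material packaging.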
The key point is that digitizing information—converting frequencies, symbols, material dimensions, etc., to bits in a flow of information—means that content in general is wholly or partly independent from the material used to convey the information (such as the text in a printed book). This is one of the foundations for digital convergence: the same or different portions of content can appear via multiple devices, at the same or different times; and all digital media can partake of the same content in different forms.

The second component is computing. Digitization also means that this content can now be treated as data by computer processes, vastly increasing what can be done with, or by, that content. For example, instead of one or two "see also" cards in a library's card catalog associating a given book with other books, now almost any content in a digitized source can be searched and associated with similar or related content in the same or other sources. At present this is not perfect for all content (such as images, sounds, smells), but iterations of improved computing power and algorithms make it easier and easier. Thus, any kind of content (information in various forms) can be processed through computing programs. Information becomes a powerful raw resource and can be transformed, combined, integrated, and analyzed. This is the essence of datafication and digitization: anything that can be formally and systematically represented in a digital form can then be processed, combined, and analyzed, for a vast and growing range of purposes.

Practically related but conceptually distinct, the third component is microprocessors via integrated circuits, the increasingly small and powerful devices for performing a wide array of computer processes. The integration of basic computing functions into one chip increased computing functionality, speed, and power, yet reduced computing size and computing power cost, leading to the ability to embed computing power into ever smaller objects. Current smartphones and games consoles easily outperform supercomputers of the 1980s and early 1990s (as measured by calculations per second). Recent developments in massive multiprocessing and quantum computing, and embedding of computing power onto and into tiny devices and our bodies, will extend this growth in power and decrease in size.

The fourth component is digital networking, or transmission of digitized information among nodes that are themselves computers. Information can be conveyed in analog form, through material carriers (books, photographs), and amplitude or frequency modulation (pre-digital radio, television). But digital networks allow much faster, more error-corrected, more distant, and more robust ways of processing, accessing, and distributing information (content of any form). Digital networks can interconnect with local, "last-mile" analog transmission lines. The Internet is a vast interconnected set of subnetworks, using various protocols to standardize and facilitate exchange of digital information from source to receiver. Wireless networking allows devices and people to communicate with each other without constraints of physical wiring, also enabling computing power to be distributed throughout space (such as in the Internet of things, radio-frequency identification [RFID], and mobile phones).

The transformative power of the digital comes from combining these elements. If we put these together, we can move artefacts and ideas around the globe, and at increasing speed.
We can undertake a 3D scan of a contemporary artwork or new product, send the data around the world, and print it out on a 3D printer minutes later. Citizen journalists can live-stream news events as they happen. Doctors can diagnose patients from thousands of miles away. Consumers can watch almost any film or listen to almost any music ever recorded. Grandparents can see and talk to the grandkids in Australia. Politicians can directly message followers to their heart's content. The examples are ever expanding.
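The card-catalog contrast in the computing component above lends itself to a small worked example. The sketch below is a toy illustration (the catalog snippets and the crude word-overlap measure are invented, not drawn from any real dataset): once text is digital, any item can be associated with any other by computation rather than by a fixed "see also" card.

```python
# Toy illustration of content treated as data: ranking "see also" candidates by
# shared vocabulary. The snippets and the overlap measure are invented examples.

def tokens(text: str) -> set[str]:
    """Crudely normalize a snippet into a set of lowercase words."""
    return {word.strip(".,").lower() for word in text.split() if len(word) > 3}


catalog = {
    "Being Digital": "bits replace atoms as information is freed from its physical form",
    "The Control Revolution": "information processing and control in the industrial economy",
    "The Information Economy": "measuring the information sector of the national economy",
    "Gardening Basics": "soil seeds watering and sunlight for healthy plants",
}


def related(title: str, top: int = 2) -> list[tuple[str, int]]:
    """Rank the other items by how many normalized words they share with `title`."""
    query = tokens(catalog[title])
    scores = [(other, len(query & tokens(text)))
              for other, text in catalog.items() if other != title]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)[:top]


print(related("Being Digital"))  # e.g., [('The Control Revolution', 1), ('The Information Economy', 1)]
```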

Terms and Growth of These Developments

The smart mobile phone has become the most general exemplar of the integration of these four transformative components in one device. However, our focus in this book is not on these four crucial components of digital technologies, or any one technology, but, rather, on how their integration shapes, and is shaped by, social factors. Hence the title is the Handbook of digital technology and society. While technology has developed, and continues to evolve over time, we become more aware of the implications of these changes—positive and negative, intended and unintended, short-term and long-term, individual and collective, and straightforward and contradictory—for digital technology, individuals, groups, communities, organizations, societies, nations, and the world. So along with increasing mention of the technologies in the academic and general literature, ways of characterizing the role and implications of digital technologies have also arisen. The four primary terms that have been used to refer to such changes are digital age, digital era, digital society, and digital technology. In the spirit of some of the literature analyses to follow that utilize digital tools to extract and evaluate academic discussions about the social impacts of digital, we have used databases to explore how these terms were first used. Tables 1.1 through 1.4 show the first entries that used these terms in major academic reference, news, and periodicals databases (Web of Science, Science Direct, Nexis Uni News, and Proquest Periodicals Index Online, respectively). While in general all four terms began being mentioned in publications covered by these four sources between 1972 and 1983, the earliest terms used were "digital technology" (1967) and "digital society" (1968), followed by "digital era" (1970) and then "digital age" (1982). Naturally, most were mentioned in reference to the growth and development of computers and digitization. For example, "digital technology" typically referred to computers, computerized control, data flow, and the computer-based telephone switching network. "Digital society" noted the diffusion of technology use in everyday contexts. "Digital era" highlighted the introduction of the personal computer, digital satellites, and industrial automation; while the "digital age" referenced the transition of technology to digital forms, and twice with specific reference to analog models. However, not all mentions of these terms related only to new technological developments at the time. Both "digital technology" and "digital society" were associated with new training and education, and the first discussion of "digital technology" (in 1967) specifically emphasized its potential social and economic impacts.
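The kind of first-appearance search summarized in Tables 1.1 through 1.4 is straightforward to approximate once records have been exported from a database. The sketch below is a minimal, hypothetical version (the field names and the three sample records, which echo Table 1.1, stand in for a full export from services such as Web of Science): it simply finds the oldest record mentioning each term.

```python
# Minimal sketch of a first-appearance search over exported bibliographic records.
# The field names and sample records are placeholders standing in for a full export.

TERMS = ["digital age", "digital era", "digital society", "digital technology"]

records = [
    {"year": 1982, "title": "Even in a digital age, scopes remain the instrument"},
    {"year": 1970, "title": "Bold new inroads for computer as digital era gets under way"},
    {"year": 1968, "title": "Rapid switching circuits in digital technology"},
]


def first_appearance(term, recs):
    """Return the oldest record whose title mentions the term, or None."""
    hits = [record for record in recs if term in record["title"].lower()]
    return min(hits, key=lambda record: record["year"]) if hits else None


for term in TERMS:
    print(term, "->", first_appearance(term, records))
```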

Table 1.1  First Appearances of Four Major Digital Terms in Web of Science Term

First-year entries

digital age

Zenman, M. J. (1982). Even in a digital age, scopes remain the instrument. Electronic Design, 30(18), 129ff.

digital era

Electronics. (1970). Bold new inroads for computer as digital era gets under way. Electronics, 43(1), 105 ff.

digital society

Cazes, B. (1984). The digital society: New technologies in everyday use. Quinzaine Litteraire, 421, 8–9.

digital technology

Fenik, F., & Stopper, H. (1968). Rapid switching circuits in digital technology. Elektrotechnische Zeitschrift B-Ausgabe, 29(7–8), 229ff. Ulrich, G. (1968). Comparison between analogue and digital technology in information flow. Periodica Polytechnica Electrical Engineering, 12(2), 145ff.

Note: Based on Topics (in title)

Table 1.2  First Appearances of Four Major Digital Terms in ScienceDirect Term

First-year entries

digital age

Geballe, T. H. (1984). Materials: Analogue answers in a digital age. Physica B&C, 172(1–3), 50–58.

digital era

Dement, D. K. (1980). Developing the next phase in NASAs satellite communications program. Acta Astronautica, 7(11), 1275–1286. “In the coming digital era, maximizing the use of frequency spectrum allocations will require special techniques for transmitting television signals within allowable bandwidths . . . ” Latour, P. R. (1980). S2: Requirements for successful closed-loop optimization of petroleum refining processes. IFAC Proceedings Volumes, 13(9), 11–23. “DIGITAL ERA: Adequate Hardware and General: Software for Automation of Industrial Plants. Costs are still Dropping. . . . ”

digital society

Delorme, J-C. (1985). Education in a digital world. Education and Computing 1(2), 117–124. “These questions not only have pertinence from the perspective of or as a consequence of the emergence of the digital society . . . ”

digital technology

Beaverstock, M. C. & Bernard, J. W. (1977). Advanced control: Ready able accepted? IFAC Proceedings Volumes, 10(16), 335–341. “Further application of more advanced control systems to industrial processes is limited by acceptance of the newer digital technologies.” Rony, P. R. & Larsen, D. G. (1977). Teaching microcomputer interfacing to non-electrical engineers. Euromicro Newsletter, 3(2), 57–62. “Rather, we are providing them with specific training in digital technology that may be useful to them professionally.” Benvenuto, F., DiTomaso, C., Donati, L. F., Sbragia, D., & Valcada, A. (1977). Digital control system for uninterruptible power supply. IFAC Proceedings Volumes, 10(10), 973–979. “The most peculiar characteristic of the system just described consists in the fact that the control has been carried out using the digital technology in almost every part of it.” (continued )

Table 1.2  Continued Newstead, A. (1977). Australia’s telecom 2000. Telecommunications Policy, 1(2), 158–162. “Continue network studies on optimum rate of transition to digital technologies in transmission and switching . . .” Schnieder, E. (1977). Control of DC-drives by microprocessors. IFAC Proceedings Volumes, 19(10), 603–608. “The current signal is the only analogue variable requiring A/D conversion, since speed measurement can be performed by digital techniques thus disposing of analogue transmitters such as DC tachometers.” Fujii, K., Takeda, N., Kogure, Y., Neda, T., . . . Abe, M. (1977). Recent computerized power generation plant automation and advanced man-machine interface system. IFAC Proceedings Volumes,10(1), 16–20. “In the future, their reliability will have to be highly improved using microcomputerized digital technology for example.” Nyborg, P. S. (1977). Computer technology and US communications law. Telecommunications Policy, 1(5), 374–380. “Significant technologies, among others, are large-scale integration, software control of switching devices and terminals, digital technology, and new services and techniques relating to audio transmission (including satellite).” Owen, E. W. & Moseley, E. C. (1977). A user-compatible terminal for medical applications. Computers in Biology and Medicine, 7(2), 165–176. “He is currently working on the application of microprocessor and related digital technologies to these fields.” Note: Based on Abstract, Title, Keywords, Text in Research Articles

Table 1.3  First Appearances of Four Major Digital Terms in Nexis Uni (News)

Term / First-year entries

digital age

Williams, E. (1982). Comdial system ready for switch to the digital age. Financial Times.
Mulligan, H. A. (1982). [no headline]. The Associated Press. “Urchins raised in this digital age do not know which direction is clockwise.”
Safire, W. (1982). Watch what you say. The New York Times. “ . . . moving finger has written that we are now in the Digital Age.”

digital era

Heine, C. (1970). This diabolical hipster hoodwinker almost sold a bag of Brooklyn air for $20,000. Adweek. “ . . . scenario could be that it was modern art for the digital era, an existential exhibit, if you will.”

digital society

Salisbury, D. F. (1983). Life in the computer age: Social choices in a futuristic world. The Christian Science Monitor. “In its extreme form, a ‘Digital Society’ would become simply a giant, clean, well-ordered Disneyworld, Vallee warns . . . ”

digital technology

Chapman, W. (1978). High stakes race: Japanese search for breakthrough in field of giant computers. The Washington Post. “It [Japan] took transistors and digital technology, added automation and superior quality control, and transformed those innovations into profitable exports.”
Anon. (1978). Scientific and technical exchanges in China. Xinhua General Overseas News Service (China). “. . . the first conference on digital technology was recently held in Kochiu in Yunnan province.”
Ostry, B. (1978). The Mermaid Inn: The wiring of Canada: A danger, a challenge, a certainty. The Globe and Mail (Canada). “One writer insists digital technology will turn the international telephone network into the biggest blooming computer the world has ever seen.”

Table 1.4  First Appearances of Four Major Digital Terms in ProQuest Periodicals Index Online

Term / First-year entries

digital age

Julesz, B. (1983). The role of analog models in our digital age. The Behavioral and Brain Sciences, 6(4), 668–669.

digital era

Stauffacher, J. (1985). The Transylvanian Phoenix: The Kis-Janson types in the digital era. Visible Language, 19(1), 61–76.

digital society

Bixby, J. L. (1968). Public opinion and school music. Music Journal, 26(3), 48–53. “Effects of reduced privacy, restrictions on individualism (in a computerized digital society; shall we tattoo the Social Security number on the newborn?)”

digital technology

Baran, P. (1967). The future computer utility. The Public Interest, 8(summer), 75–87. “These new developments in computer technology are of such significance as to affect materially the nature of our economic and social life.”

impacts of the “digital age” were quite novel: for example, an early concern about the switch to digital clocks was that people would no longer know what “clockwise” meant. So we see that, right from the beginning, social aspects and implications (both positive and negative) were part of the discussion, even though the very first uses of these four terms were stimulated by technological developments.

Table 1.5 lists the earliest mentions of these same four terms in books that Google has digitized and indexed, retrieved through its Ngram Viewer (https://books.google.com/ngrams). For some terms the plots do indicate entries before 1967, but those entries either retrieve nothing, retrieve something other than a book, or are snippets from journals whose publication run began in that period although the document containing the term appeared in a much later issue. Ngram Viewer provides results through 2008. Figure 1.1 shows the trends in these four sets of terms over time. In books, “digital technology” is the most frequently mentioned term over time, increasing sharply during 1975–1995. However, the terms “digital society” (1965) and “digital era” (1969) appeared a bit earlier. The term “digital age” was the last to be introduced, around 20 years later, but quickly became the most used term indicating the societal aspects of digital technology.
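For readers who wish to reproduce or extend this kind of trend comparison, the following is a minimal sketch of one way to plot such term frequencies. It is not the procedure used here (the trend data in this chapter come directly from the Ngram Viewer), and the file name and column names are hypothetical; the sketch assumes the yearly relative frequencies for each phrase have already been exported to a local CSV.

# A minimal sketch (not the authors' procedure) for a Figure 1.1-style trend plot,
# assuming Ngram Viewer results for each phrase are saved in a local CSV with a
# "year" column plus one column per phrase (file and column names are hypothetical).
import pandas as pd
import matplotlib.pyplot as plt

TERMS = ["digital technology", "digital age", "digital era", "digital society"]

df = pd.read_csv("ngram_digital_terms.csv")  # columns: year, plus one per term

fig, ax = plt.subplots(figsize=(8, 4))
for term in TERMS:
    ax.plot(df["year"], df[term], label=term)

ax.set_xlim(1950, 2008)  # the chapter's Ngram analysis ends in 2008
ax.set_xlabel("Year")
ax.set_ylabel("Share of all Ngram entries (%)")
ax.set_title("Mentions of four digital terms in books (Google Ngram)")
ax.legend()
plt.tight_layout()
plt.show()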

Main Digital Technology and Society Issues and Contexts in Recent Books

Method

Presuming that books integrate and distill considerable prior work, and represent topics that are important, salient, and likely of broad and timely concern, we turn to recent books for indicators of the main issues and contexts about digital technology and society.

Table 1.5  First Substantive Entries of Four Major Digital Terms in Books, from Google Ngram

Term / First substantive entry / Percentage of all Ngram entries that year; in 2008; and times greater

digital age

Watkinson, J. (1990). Coding for digital recording. Focal Press. https://books.google.com/books?id=cjBTAAAAMAAJ “Perhaps some future historian will classify this as the digital age, when everyday processes increasingly came to be performed using discrete numbers.”

.00000040; 0.00000770; 18.25

United States. Congress. Senate. Committee on Commerce, Science, and Transportation. Subcommittee on Communications. (1990). Hearings on . . . Digital Audio Tape Recorder Act of 1990. U.S. Government Printing Office. https://books.google.com/books?id=DTGqJhRf4jsC “With the passage of S. 2358, such synergy will extend into the digital age, to the benefit of everyone.”

digital era

Parrish, L. (1969). Space flight simulation technology. H. W. Sams. https://books.google.com/books?id=k7NZAAAAYAAJ “ . . . the accomplished simulation designer . . . will, of necessity, have qualified as a digital-computer programmer (this latter accomplishment being a forced requirement of the digital era).”

.00000002; 0.00000140; 69

digital society

White, T. H. (1965). The making of the president 1964. Atheneum. https://books.google.com/books?id=NIVkmNgX7_UC “The emotions of normal people resist the general condition of a Digital Society—digits for the boys who are drafted, digits for Social Security and income-tax people, digits on credit cards and union cards, digits replacing familiar telephone exchanges, the electronic recordings that answer the telephone at airports and railway stations.”

.00000000; 0.00003800; 3800+

digital technology

Canada. Department of Communications. (1971). The future of communications technology. Department of Communications. https://books.google.com/books?id=lBe4AAAAIAAJ “6.5 DIGITAL TECHNOLOGY The evolving carrier network will be composed mainly of digital sub-systems, which can offer the complete range of digital and analogue capability required by any user on a switched network basis.”

.00000100; 0.00004700; 46

Using the same four sets of terms, we searched Amazon Books for relevant titles, and relevant recommended titles, in the past decade. We ended up with 89 books, published from 2009 to 2018 (M = 2015.2). There are, of course, many more books related to various aspects of digital technology and society, both those retrievable through other terms and those from earlier years, but this seems a reasonable sample (in both size and source) to represent the most frequent and important issues and contexts.

[Figure 1.1 is a line chart plotting, for 1950–2008, the percentage of all Ngram entries for “digital technology (All),” “digital age (All),” “digital era (All),” and “digital society (All).”]

Figure 1.1  Trends over time in mention of four major digital terms in books through 2008, based on Google Ngram Viewer.

Within this sample, 77 books were (co)authored and 12 were edited (including one encyclopedia); 16 were general (i.e., textbooks, overviews, or coverage of many issues), and 73 were specific (about an identifiable issue or topic, e.g., youth and media, or Internet governance). Again, as our goal was to identify main issues and contexts, we did not analyze the text of each book; rather, we collected information about each book, including summaries, reviews, prefaces, and tables of contents; that is, what do the authors and others think the book is “about,” or what topics do they emphasize? We combined all those materials into a file for each book. The total text across all books constituted around 29,000 words.

We read each file, compiling a list of possible issues and contexts from each one. We then reviewed that compilation, reorganizing, regrouping, and combining terms into an alphabetized list of main codes and subcodes. This grouped list provided our initial a priori coding scheme. Needless to say, others might have developed a somewhat different list, many of the subcodes could have been included under other main codes, and different main codes could have been developed. We will return to a possible different grouping of the main codes later on. The purpose of the following overview is to identify some of the primary themes and topics of recent books in this domain.

We entered those 89 book summary files and the initial coding taxonomy into NVivo 11. Then we re-read each book file and coded the materials using the initial codes, adding new codes as they arose. After coding all files, we revisited the coding taxonomy, again reorganized, regrouped, and combined terms, and then re-coded the book files. For the following overview of the issues and contexts covered in these books, we re-organized the main codes by general themes: (A) Theory and Conceptualization, (B) Digital Technology, (C) Issues, (D) Contexts, and (E) Effects. Table 1.6 lists the general themes, their main codes, and their subcodes, along with the number of sources that used each code and the number of times (references) that code was used.
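The source and reference counts reported in Table 1.6 amount to tallying, for each code, how many book files it appears in and how many times. The coding itself was done by hand in NVivo 11; the sketch below is only a rough automated analogue based on keyword matching, with a hypothetical directory of book files and hypothetical code names and keyword lists.

# A rough automated analogue (not the authors' NVivo workflow) of tallying, for each
# code, how many book files mention it ("sources") and how many times ("references").
# The directory, code names, and keyword lists below are hypothetical.
import re
from pathlib import Path
from collections import defaultdict

CODEBOOK = {
    "C2 Big data": ["big data", "data mining", "analytics"],
    "C5 Inclusion, exclusion": ["digital divide", "inequality", "discrimination"],
    "E1 Effects negative": ["addiction", "cyberbullying", "overload"],
}

sources = defaultdict(int)     # number of book files in which the code appears
references = defaultdict(int)  # total number of keyword hits across all files

for book_file in Path("book_summaries").glob("*.txt"):
    text = book_file.read_text(encoding="utf-8").lower()
    for code, keywords in CODEBOOK.items():
        hits = sum(len(re.findall(re.escape(kw), text)) for kw in keywords)
        if hits:
            sources[code] += 1
            references[code] += hits

for code in CODEBOOK:
    print(f"{code}: {sources[code]} sources, {references[code]} references")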

Table 1.6  Themes, Main Codes, and Subcodes Used to Identify Issues and Concerns in Recent Books on Digital Technology and Society

Theme, Code, and Subcodes

A. Theory and Conceptualization

B. Technology

A1. Theory [17] Actor-network theory (48) Critical studies, theory (9,22) Diffusion (39) Digital divide (38) Digital media & social change (51) Mediation theories (23, 31, 33) Model of digital coping with illness (68) Network theory (35, 39, 51) Public good (23) Science & technology studies (33) Social capital (54) Sociological (4, 21) Socio-technical (4, 10, 26) Various (51, 53)

B1. Technology venues [68] 3D printing, fabrication (32, 77) Algorithms (10, 19, 24, 29, 33, 51, 63, 84) Artificial Intelligence (13, 30, 49) Blockchain (81) Cloud computing (4, 11, 39, 42, 59) Constant change and development of technology (4, 11) Data storage (24) Drones (59, 77) Gaming (57, 69, 71, 82) Internet of things (11, 15, 26, 56, 59, 81) Mobility (11, 26, 51, 84) Robots & social robots (30, 49, 59, 82) Search engines (63) Smart homes, cities, e-government (5, 11, 37, 39, 41, 77, 81) Social media-networking sites (4, 11, 39, 51, 57, 79, 80, 85) Ubiquitous computing (26, 59) Wearable computing, devices, sensors (26, 59, 77, 79)

A2. Names for new digital technology and society era [49] Age of big data (56) Attention economy (89) Culture of connectivity (85) Digital age/society/revolution (6, 7, 9, 51) Digital natives, immigrants (12, 64) Ecosystem of connective media in a culture of connectivity (85) Fourth industrial revolution (77) Fourth wave (digital health) (79) Global information society (58) Global network society (17, 48) Information society (14, 38) Integrate technological, social, political, cultural, economic dimensions (4, 9, 13, 17, 18, 21, 26, 28, 33, 39, 41, 47, 57, 58, 69) Marketplace of attention (86) Mass surveillance society (75) Media ecologies (44) Mediation (interrelated technical, social, biological processes) (4, 22, 23) Network as “defining concept of our era” (20) New mobile age (49) Next Internet (59) Participatory condition (5) Second machine age (13) Softwarization of society (9) Superconnected society (18) Third wave of computing technologies (25, 32)

B2. Technology characteristics [9] Affordances (68, 71, 82) Habitual, updating (20, 82) Materiality (33, 42) Mediation vs. objects, devices, apps (47) C. Issues C1. Content, creation [33] Art, performance (5, 10, 23, 40, 52, 54, 81) Collective intelligence (70) Creative production, industry, digital media production (44) Crowdfunding (7) Design (5, 71) Humor & memes (46, 66, 78) Online expression (66) Producers, users, produsers (2, 5, 28, 43, 69, 76) Public, online debate (66) Storytelling (5, 7, 66) C2. Big data, data mining, data storage, analytics, user data [56] Attention industry, marketplace, merchants, customers (2, 13, 24, 39, 50, 75, 83, 84, 86, 89)

Audience behaviors and meaning changing, fragmentation, overlap (7, 33, 62, 86) Big data, data mining, data analytics (5, 11, 14, 29, 39, 51, 56, 59, 71, 84) Data user, personal, online, digital traces (22, 35, 51, 55, 61, 73, 75, 83, 84) Privacy, surveillance, security, anonymity (5, 10, 12, 15, 18, 19, 26, 39, 40, 45, 48, 53, 59, 61, 64, 69, 82, 84, 87)

Families (38, 44, 54, 72, 80) Friendship (44) Identity, selfhood (12, 18, 19, 22, 38, 40, 43, 44, 51, 54, 64, 65, 82, 87) Individual, collective; public, private (35, 70, 87) Intimacy (44, 82) Sex, sexuality (69) Social (interactions, relationships, networks (6, 18, 35, 46, 54, 70, 72, 82)

C3. Civic issues [50] Civic media, citizenship, democracy, public sphere, the news press (3, 4, 5, 7, 11, 29, 33, 39, 40, 51, 59, 65, 69, 81, 87) Digital countercultures, underground (52) Engagement, participation civic (3, 5, 27, 37, 46, 53, 59, 65) Political, politics (17, 21, 25, 39, 55, 69) Power (5, 7, 21, 42, 51) Social movements & digital activism (incl. feminist activism, play as resistance), collective action (5, 17, 25, 31, 37, 39, 51, 69, 87)

D4. User groups [19] African-Americans (38) College students (72) Elderly (68, 71) LGBTQ (31, 37) Worshippers (8) Youth (6, 12, 39, 44, 45, 53, 54, 64, 69, 80)

C4. Participation, engagement [7] (5, 12, 27, 45, 48, 70) C5. Inclusion, exclusion, discrimination [26] Digital divide (6, 38, 39) Disability (22, 69) Discrimination (19, 29, 63) Gender (25, 39, 63, 69) Inclusion, exclusion; equality, inequality (7, 18, 37, 53, 54, 59, 69, 81, 88) Race (39, 63, 69)

D5. Culture, everyday life, education, learning [35] Culture (6, 10, 25, 37, 39, 51, 69, 88) Domesticity (26) Education, learning (5, 13, 30, 37, 41, 44, 53, 54, 64, 69) Everyday life, practice (4, 9, 21, 26, 38, 39, 53, 54, 58, 80) Literacy (46, 53, 70, 88)

C6. Ethics, ethical issues [6] (28, 29, 45, 47, 64)

D6. Work and organizations [37] Business models (7, 24, 49, 81, 89) Innovation (11, 27, 37, 71, 77, 79) Labor, creative, digital, employment (7, 13, 30, 33, 39, 76, 77) Organizations & business (10, 21, 24, 39, 41, 57, 71, 79, 81) Work, work-life boundaries (1, 44, 46, 71, 77, 82)

C7. Manage digital experience [8] (23, 29, 35, 38, 39, 41, 53, 57, 60, 79, 81)

D7. Law, policy, regulation [12] (23, 29, 35, 38, 39, 41, 53, 57, 60, 79, 81)

D. Contexts

E. Effects

D1. Digitization of self & others [13] Biosensing, quantified self & animals (5, 10, 47, 49, 52, 55, 59, 61, 67, 84) Qualified self (45) D2. Health [12] Digital health (13, 30, 61, 71, 79) End of life (68) Healthspan and lifespan (49) Online information, interventions (68) Support, coping (68, 82) D3. Relationships [40] Community (4, 51)

E1. Effects negative [25] Addiction (1, 82) Attention, brain, overload (6, 16, 64, 70, 82) Cyberbullying (6, 12) Danger, harm, risk (6, 12, 46, 53, 56, 82) Disconnection (among people) (54) Multitasking (82) Online hate and shaming (72) Pressure for access, connectedness, response (82) Wasting time (34, 82) E2. Effects positive [13] Collaboration, cooperation, sharing (5, 12, 13, 34, 70, 71)

Connectivity, connectedness (1, 2, 38) Creativity (34) Safety (12, 82) Social capital (38) E3. Effects societal [23] Crime (36) Economy, economics (13, 30, 39, 41, 58, 71, 81, 87)

Environment implications of digital media (23, 42, 59, 69) Global impacts (17, 18, 22, 58, 69, 71) ICTs for development (41, 57, 87) Institutions (18, 41) E4. Effects contradictions, paradoxes, tensions, unintended [21] (1, 6, 9, 10, 31, 32, 34, 35, 36, 41, 48, 50, 52, 66, 69, 82, 86)

Note: Numbers in square brackets give the number of times each code was referenced, aggregated from the number of times each of its subcodes was referenced; numbers in parentheses identify the specific publications referenced. See References of Books for Issues and Contexts Analysis for the correspondence between number and reference.

A.  Theory and Conceptualization

A1. Theory. Theoretical perspectives are not likely to be highlighted in book summaries or reviews, especially in edited books. However, of those mentioned, several appear multiple times. A socio-technical approach appears in Athique’s (2013) overview of digital media and society; Beyes, Leeker, and Schipper’s (2017) analysis of digital performance; and Dourish and Bell’s (2011) discussion of ubiquitous computing. Network theory is an obvious framework for considering digital technologies because of their networked nature, the rise of a global network society, and the ways they connect people through social networks and social networking sites (González-Bailón, 2017; Graham & Dutton, 2014; Krieger & Belliger, 2014; Lindgren, 2017). Similarly, as digital technologies provide mediated communication, representation, and interaction, mediation and materiality theories are relevant (Cubitt, 2016; Fotopoulou, 2017; Gillespie, Boczkowski, & Foot, 2014). Other specified theoretical approaches include critical studies, sociological theories, and others ranging from public goods and social capital to diffusion of innovations and the digital divide. Several books emphasize multiple theories, related to areas such as social media, cyber-optimism, social interaction, social change, identity, development, education, and participation (Lindgren, 2017; Livingstone, 2009).

A2. Names for new social era. Based on the article and Ngram analysis, we might refer to the emergence of relationships between digital technology and society as the “Digital Age.” Indeed, several books use some variant of that term (Bauerlein, 2011; Bennett, Chin, & Jones, 2015; Berry, 2015; Lindgren, 2017). However, in recent books authors have used a wide variety of terms. Some refer to a broad phenomenon (culture of connectivity, digital age/society/revolution, ecosystem of connective media in a culture of connectivity, fourth industrial revolution, global information/network society, the familiar term

information society, next Internet, the participatory condition, second machine age, superconnected society, and third wave of computing technologies). Several of these emphasize the increased opportunities for connecting, communicating, and networking (Barney et al., 2016; Castells, 2015; Chayko, 2017; Chun, 2017; James, 2014; Van Dijck, 2013). Barney et al. (2016) refer to this as the “participatory condition,” whereby participation has become the theme for most everyday activities; similarly, Van Dijck (2013) shows how social media in particular have created a “culture of connectivity,” and Chayko (2017) argues that the Internet and digital media in general have made social life “superconnected.” Others indicate more specific aspects (age of big data, the attention economy, mass surveillance society, new mobile age, and the “softwarization” of society; Berry, 2015; Kvedar, Colman, & Cella, 2017; Lynch, 2016; Schneier, 2015; Wu, 2017). Note that all but one of these are concerned with how large-scale collection of user behavior data both fuels the economy and enables individual, corporate, and government surveillance. More common, however, is a general reference to the integration of biological, cultural, economic, environmental, political, psychological, social, and technological dimensions. Athique (2013) and Berry (2015), for example, discuss how digital systems pervade all aspects of our lives, especially given vast global information and communication networks (Castells, 2015). Many of these broader approaches emphasize how the technological and the social are interrelated (e.g., Graham & Dutton, 2014).

B.  Digital Technology

B1. Technology venues. Many other books explain and study specific digital technologies, but those mentioned in the context of societal concerns in these books range from 3D printing/fabrication to online gaming and ubiquitous computing. More frequent are associations with algorithms, cloud computing, the Internet of things, mobility, robots (including social robots), smart homes and cities, social media, and wearable computing devices/sensors. Algorithms are both based on, and influence, our search behaviors, news viewing, online friends, and shopping (Cheney-Lippold, 2017; Turow, 2017); shape who gets access to social services (Eubanks, 2018); and affect how race and gender are portrayed in search engine results (Noble, 2018). Cloud computing seems abstract and ethereal, yet requires massive infrastructure and energy (Hu, 2015), and raises issues of privacy (Graham & Dutton, 2014). The miniaturization of computing power and the increasing reach and strength of wireless networking provide the foundations for the Internet of things (Bunz & Meikle, 2017), ranging from interactive refrigerators to worldwide tracking of shipping containers (Dourish & Bell, 2011), as well as “smart” cities and governments (Barney et al., 2016; Hanna, 2016). Robots have already replaced many manufacturing jobs (Ford, 2015), while social robots can provide physical and emotional support to patients, the elderly, and coworkers (Mosco, 2017). Turkle (2011) argues that mobile phones are social robots.

B2. Technology characteristics. While not mentioned much at the general level of our source documents, some work does approach digital technologies not from their purely physical or technical aspects but, rather, from their main functions or capabilities, the ways in which they mediate. The argument here is that particular technologies and their manifestations are always changing, so a more conceptual approach is more enduring and generalizable. This approach is variously labeled here as affordances, habit, materiality, and mediation. For example, from a patient’s or a physician’s perspective, persistent awareness of the patient’s condition is more crucial than a particular medical device (Rains, 2018); from a project team’s perspective, the searchability of a database for sharing knowledge is critical independent of the system used (Rossignoli, Virili, & Za, 2017). Continuing, everyday use of a digital technology may, however, make these capabilities and affordances become taken-for-granted, habitual, and invisible (Chun, 2017).

C. Issues

C1. Content, creation. Digital technologies make it possible for all kinds of people to find, create, share, reshape, and link a dizzying array of existing and unforeseeable content. Familiar content issues include produsage, crowdfunding/sourcing, digital media production, and online expression. Yet less common topics are discussed as well. Software, computing, devices, and networks enable new kinds of, and new ways of presenting, art, performance, dance, music, and design (Beyes, Leeker, & Schipper, 2017; Gronlund, 2016). New kinds of digital images (from space and the deep sea) may foster more engaged responses to environmental threats (Cubitt, 2016). A major motivation for creating, viewing, and sharing online content is its humorous nature (Phillips & Milner, 2018), while considerable research attempts to understand the power and rapid diffusion of certain memes (Shifman, 2014). Digital storytelling can be multi-modal, collaborative, and interactive (Barney et al., 2016).

C2. Big data, data mining, data storage, analytics, user data. Transforming information and communication from analog to digital has a major, inherent implication: the content is (potentially, depending on the context and laws) now easily captured, stored, analyzed, associated, separated, (re)combined, transmitted, and networked. Thus, there is considerable recent coverage of issues relating to both specific and very large scale (big) user data. Such data capture and mining provide the economic model for much digital media, marketers, and vendors (think social media and search engines, especially), leading to terms such as the attention economy or the attention market, and fundamental changes in the nature of media audiences (Anand, 2016; Daley, 2015; Napoli, 2011; Schneier, 2015; Turow, 2012, 2017; Webster, 2014). But big data, from personal biosensors to Google searches, also allow both scientific and commercial analyses of topics otherwise not possible (González-Bailón, 2017; Graham & Dutton, 2014; Lupton, 2016; Rudder, 2014). For example, Webster (2014) argues that the expansion of multiple media sources and content allows audiences to

both concentrate attention on, as well as overlap with other audiences across, some outlets and content. The generation, access, analysis, and selling of such personal data also lead to concerns about anonymity, privacy, and surveillance. Several have noted the irony that while digital technology inspires so much participation and sharing, that very participation generates information that may be used to control, influence, or otherwise shape us and our possibilities (Barney et al., 2016; Bunz & Meikle, 2017). As Turkle (2011, p. 243) wrote, “Facebook looks like ‘home,’ but you know that it puts you in a public square with a surveillance camera turned on.”

C3. Civic issues. For some, civic issues are at the heart of debates about digital technology and society. Awareness, participation, freedom of speech, and exposure to diverse ideas are crucial for the practice and maintenance of democracy. The public sphere is now online, but not necessarily civil. Digital and online technology can both facilitate and constrain, improve and harm, these activities. It provides opportunities for political engagement and citizen marketing, as well as tools for political message targeting and opinion control by both governments and corporations (Anduiza et al., 2012; Athique, 2013; Penney, 2017; White, 2014), but increasingly also by interest groups and individuals. Offline divides by class, disability, ethnicity, gender, and race potentially may be overcome online, but often are reinforced (Reed, 2014). Online spaces provide a meeting ground, support, and solidarity for countercultural communities (Lingel, 2017), and for citizen and political activism and collective action, from small towns to governments, and from nations to global regions, sometimes successful, sometimes not (Castells, 2015; Graham & Dutton, 2014). Online citizen engagement also represents opportunities for and means of identity expression (Penney, 2017). Some work discusses the implications of site and system design, accessibility, and use for the nature of civic engagement (Gordon & Mihailidis, 2016). Underlying these civic issues are questions about how power and politics shape and are shaped by new forms of participation and their actors (Barney et al., 2016; Hu, 2015).

C4. Participation, engagement. As noted in the overview about labeling this changing social condition, a central underlying theme is the increased amount, diversity, forms, and actors in online participation in general (i.e., other than civic or political). Social media in particular enable people to engage in communication and activities in multiple ways, continuously (boyd, 2014), often leading to over-dependence and disconnection from ethical and moral behavior (James, 2014). Yet features and designs, as well as online attitudes and behavior, still limit participation by those with disabilities (Ellcessor, 2016).

C5. Inclusion, exclusion, discrimination. Thus an explicit or implicit thread running throughout much of the discussion about digital technology concerns inclusion, exclusion, and discrimination, both in terms of accessing and using these technologies and in how designs, data mining, site features or policies, and other users affect which people and what content are allowed online.
The major throughline here is about the general digital divide (Graham, 2014, discussing African Americans’ digital practices; Graham & Dutton, 2014); but other specific distinctions appear too, such as disability, ethnicity, gender, and race (Ellcessor, 2016; Reed, 2014). And a wide variety of factors

affects forms of inclusion and exclusion, including algorithms shaping data mining and search engine results (Cheney-Lippold, 2017; Eubanks, 2018). Noble (2018), for example, shows how algorithm design, commercial interests, and oligopolies of search engine and social media companies serve to privilege whiteness while discriminating against people of color (especially women). Young users, while nearly continuously online, nonetheless experience exclusion and disconnections due to their pre-existing networks, digital literacy, and attitudes (Livingstone & Sefton-Green, 2016; Wiesinger & Beliveau, 2016). However, online communities and social media provide opportunities for digital activism on the basis of gender, feminism, and LGBTQ identity, among others (Dey, 2018; Fotopoulou, 2017; Gordon & Mihailidis, 2016), and are empowering people around the world (Mosco, 2017). Blockchain technology may both increase and circumvent exclusion (Tapscott & Tapscott, 2018), by concentrating wealth and increasing energy demands, while bypassing control by financial institutions and intermediaries.

C6. Ethics, ethical issues. As most commenters note, ethical issues receive limited coverage in digital technology discussions. The online environment presents extensive and new challenges to professional journalism ethics (Elliott & Spence, 2017), data mining and algorithms distance consequences from ethical criteria (Eubanks, 2018), and youth users seldom make connections between their online behaviors and more general moral and ethical implications, experiencing ethical blind spots (James, 2014). The very process of mediation often distances awareness or knowledge of ethical implications (Palfrey & Gasser, 2016), and the beneficial use of social robots nonetheless has ethical implications such as emotional dependency and privacy (Kvedar, Colman, & Cella, 2017).

C7. Managing the digital experience. While nearly all books on digital technology and society have sections on policy, implementation, and individual recommendations, some specifically focus on research-based advice on how to manage and improve one’s digital experience. James (2014), for example, explicates the concept of conscientious connectivity, which involves both ethical thinking and awareness of and sensitivity to online dilemmas. Other approaches include exercises for mindful technology use (Levy, 2016), and thriving online (Rheingold, 2012). Johnson (2015) develops the idea of an information diet, or how to evaluate and balance one’s online behaviors and use of information. Other guides are designed for parents interested in protecting their family from the negative aspects of the digital age (Steiner-Adair & Barker, 2013), attempting to protect one’s online data and identity (Schneier, 2015), and reducing online shame and hate (Scheff & Schorr, 2017). Rowan-Kenyon, Aleman, and Savitz-Romer (2018) specifically home in on how universities can improve the experience and retention of first-generation college students through their engagement with digital technology.

D. Contexts

D1. Digitization of self and others. As devices become smaller, more powerful, wireless, and connected, very personalized uses have developed, creating the quantified

self movement. People use biosensors (such as fitness trackers, smartphones, eye- and face-scanners, and even implants) to record their activities and responses, both for personal interest and health monitoring (Kvedar, Colman, & Cella, 2017; Lupton, 2016), but also for advertising and analyses of consumer behavior (Turow, 2017). Interestingly, while in one way this allows people to develop a more detailed sense of their own identity, sharing quantified-self data also creates online communities whose members compare and even compete (including, for example, brain scans; Barney et al., 2016). To some extent, this is one form of the cyborg or the singularity, the melding of humans and machines (Beyes, Leeker, & Schipper, 2017; Kember & Zylinska, 2012; Mosco, 2017). Further, large-scale collection of such data can be used for medical diagnoses and genetic and epidemiological analyses, but also poses possible threats to insurance and employment. As a form of the Internet of things, these applications extend to animals as well, for tracking their diet and health, as well as provenance, ownership, location, and migration, from cats to cattle to whales (Pschera & Lauffer, 2016). As a complement to this digital data collection via bodily devices and social media, Humphreys (2018) shows how people have been recording and commenting on their personal information for ages, through baby books, photo albums, pocket diaries, and postcards, to account for their everyday lives.

D2. Health. Health is another major context for digital technology, including computerized medical instrumentation, digital device implants, data collection and analysis, health information monitoring, digital records, network sharing of medical information, online health information seeking, and mediated communication within support communities and between patients and caregivers (Rains, 2018; Turkle, 2011). Such technologies can be used for large-scale as well as personalized health interventions, can improve one’s lifespan and healthspan (Kvedar, Colman, & Cella, 2017), and can help manage end-of-life care and bereavement (Rains, 2018).

D3. Relationships. From a communication and interaction perspective, personal and social relationships are a major context for the use and implications of digital technologies. A frequent focus is on how people create, manage, promote, and try to protect their (multiple) online identities and selfhood (Lindgren, 2017). This is especially salient for youth users (boyd, 2014; Ito et al., 2009; Palfrey & Gasser, 2016; Turkle, 2011), in their social and classroom lives (Livingstone & Sefton-Green, 2016). Other central foci are online ethnic, gender, racial, and sexual identities (Graham, 2014; Reed, 2014), and how data mining entities construct and constrain our online and even offline identities (Cheney-Lippold, 2017). Online communities and services also hold the promise of bringing together individual identities to create a more powerful and positive (or negative) collective identity (González-Bailón, 2017; Rheingold, 2012), but also blur the distinctions between public and private, and offline and online, identities (White, 2014). Digital technology and society of course involve far more than just individual-level identity. More relational contexts include engaging in online intimacy even though the content may be public (Ito et al., 2009).
Yet our technologies may be distorting intimacy; as Turkle (2011) notes, immersion in social media and mobile phones may create an illusion of intimacy while distancing actual personal relationships. The mobile phone

and social media may help maintain family relationships, especially when children move away to college (Rowan-Kenyon, Aleman, & Savitz-Romer, 2018), but they also serve to wrest control away from parental monitoring and socialization (Graham, 2014; Ito et al., 2009; Livingstone & Sefton-Green, 2016), both by youth and by their peers and marketing companies (Steiner-Adair & Barker, 2013). And, of course, much work concerns the nature of, engagement in, and effects of online communities, ranging from political to health and culture (Lindgren, 2017).

D4. User groups. Different kinds of audiences, groups, or users have different motivations for and experiences with online and digital technology, so some books focus on specific user groups. These include how African Americans use such technologies to deal with inequalities (Graham, 2014), how students engage with technology to manage their transition from home to their first year at college (Rowan-Kenyon, Aleman, & Savitz-Romer, 2018), how the elderly can manage the end of their lifespan (Rains, 2018), how LGBTQ members engage in media activism, promote visibility, and work to combat suicide (Fotopoulou, 2017; Gordon & Mihailidis, 2016), and how worshippers participate in mediated liturgy practices, such as digital prayer chapels and live-streaming of religious services (Berger, 2017). Much work looks at how youth use digital media, with both positive and negative implications (Bauerlein, 2011). For example, boyd (2014) considers why youth share so much online and why they are so obsessed with social media, Graham and Dutton’s book (2014) includes chapters on children’s Internet use and next-generation digital divides, and Steiner-Adair and Barker (2013) consider how the digital age is significantly affecting childhood.

D5. Culture, everyday life, education, learning. More general daily concerns include the ways in which ubiquitous computing might affect the form and meaning of domesticity (Dourish & Bell, 2011) and everyday life practices, such as how society is becoming embedded in software (Berry, 2015), the digital experiences of African Americans (Graham, 2014), and how the Internet is interwoven throughout children’s lives at home, school, and play (Livingstone, 2009). Research investigates the role of digital technology in education and learning, such as the participatory potential for education (Barney et al., 2016; Brynjolfsson & McAfee, 2014), the need for greater civic education (Gordon & Mihailidis, 2016), how young people do, or do not, learn through digital media within their daily class contexts (Livingstone & Sefton-Green, 2016), the need for greater digital literacy (Johnson, 2015; Rheingold, 2012), and ways in which educational technologies such as MOOCs and scholarly publishing are changing the nature of teaching and research (Reed, 2014). Concerns about digital culture include how digital technology shapes and is influenced by broad societal culture (Wiesinger & Beliveau, 2016), artistic and creative culture (Reed, 2014), and the culture(s) associated with particular media, such as mobile phones (Lindgren, 2017).

D6. Work and organizations. Another major context, work and organizations, is considered much more in the management and information systems literature. However, recent digital technology and society books discuss how business models, industries, and economies are being transformed, such as through crowdsourcing, crowdfunding,

and microcelebrity (Bennett, Chin, & Jones, 2015); the “gig” economy, such as Uber and Airbnb (Daley, 2015); and the ability to identify and monetize attention (Wu, 2017). They also discuss how the very nature of organizations and industries is changing, toward more networked, virtual, and distributed forms (Graham & Dutton, 2014). Note that these opportunities and challenges also apply to contexts that are not necessarily commercial or for-profit, such as health provision and caregiving (Kvedar, Colman, & Cella, 2017), and smart cities and technology parks (Hanna, 2016). Not only are innovation processes crucial to the development and diffusion of new digital technologies, but such technologies are also necessary for implementing other innovations. For example, innovations in medical technologies can transform experiences and possibilities for the disabled (Ellcessor, 2016), patients (Sonnier, 2017), and caregivers (Rossignoli, Virili, & Za, 2017), and innovative designs for e-government (such as ways to visualize data, involvement in open policymaking, and engaging young and feminist activists) can foster greater civic engagement (Gordon & Mihailidis, 2016). Digital innovations are making work-life boundaries more permeable (Alter, 2017; Schwab, 2017), increasing the ability to share and manage knowledge (Botto & Resende, 2017), and threatening the loss of traditional and even knowledge work through artificial intelligence, algorithms, blockchain technology, and robots (Ford, 2015; Tapscott & Tapscott, 2018).

D7. Law, policy, regulation. As many have noted, technology develops and diffuses faster than laws, policies, and regulations can keep up. Should the Internet and social media be regulated as common carriers, or be subject to the same regulations and liabilities as other publishers (Graham & Dutton, 2014)? Who should govern which aspects of the Internet (Mueller, 2010)? What are the effects of supporting or removing net neutrality? Should algorithms affecting search results and service provision be regulated and made explicit (Eubanks, 2018)? What policies and regulations best stimulate public ICT provision (Hanna, 2016)? Is HIPAA sufficient to protect personal digital medical records (Sonnier, 2017)?

E. Effects

E1. Negative effects. An enduring research, policy, and popular topic is the extent to which digital technologies are associated with negative or positive effects. The list of possible implications is endless; the books included here refer to just a few. Addiction is at the top of many people’s list, both alphabetically and behaviorally (Alter, 2017). As Turkle (2011, p. 154) notes, “Always on and (now) always with us, we tend the Net, and the Net teaches us to need it.” But she argues that addiction is not inherent to the technology; rather, it lies in how we practice the use of that technology. For example, social pressures to be constantly accessible and to respond quickly create stress and reinforce dependencies (Turkle, 2011). Excessive use also ends up wasting considerable time, often fostering feelings of guilt (Goldsmith, 2016). Watching YouTube music videos and endlessly scrolling through friends’ text messages do not strengthen personal relations or get one’s (home)work done.

As noted in the terms associated with digital technology and society, attention has gained a lot of attention, not only in regard to the commercial focus on collecting and analyzing user attention, but also regarding the cognitive effects of excessive screen use and multitasking on attention span (Bauerlein, 2011; Carr, 2011). Further, excessive attention to our devices reduces our attention to those people around us (Turkle, 2011, p. 268). Research has identified a wide array of possible dangers, harms, and risks, including cyberbullying (Bauerlein, 2011), information overload (Johnson, 2015), threats to children (Livingstone, 2009), and loss of control over one’s identity in the present and the future. Turkle (2011), for example, notes that people’s vulnerability is not limited to their own communication or site content; they are also vulnerable to anyone taking a photo of them or posting comments about them. There is thus a constant worry about one’s offline behavior being recorded and distributed. This leads some to self-censor and self-surveil both their online and offline comments and behavior. Another kind of harm is online harassment, shaming, hating, and trolling (Scheff & Schorr, 2017), negatively affecting everyone from children to CEOs and celebrities.

E2. Positive effects. Needless to say, digital technologies are also associated with many benefits. Chief among these is the ability to connect and communicate with others, from family and friends to fellow group members, and with people and organizations otherwise unknown and inaccessible, allowing the co-creation of meaning and sharing of resources, from emotional support to complex information (Barney et al., 2016). Computing and networking support new and distributed forms of collaboration and cooperation, increasingly between humans and machines, necessary for accomplishing tasks, creating content, and generating innovative ideas (Brynjolfsson & McAfee, 2014; Rheingold, 2012; Rossignoli, Virili, & Za, 2017). Tools such as mobile phones and GPS also improve one’s safety (boyd, 2014), and keep others aware of one’s location and activities (Turkle, 2011). This support for connectivity and relationships also develops social and cultural capital (Graham, 2014).

E3. Societal effects. Negative effects at the societal level include the uses of digital technologies for crime, including hacking and identity theft, fomenting hate crimes, and drug and sex trafficking, among many others (Goodman, 2015). Also little understood are the increasingly devastating environmental implications of digital technology, including cloud computing (with its need for massive server farms consuming increasingly more energy) and toxic materials recycling (Cubitt, 2016; Hu, 2015; Mosco, 2017). Digital, networked ICTs have a wide range of negative and positive implications for economies and economics, such as facilitating rapid and global financial crises, and threats to particular industries and jobs, but also making information and transactions more transparent and efficient, and supporting micro-economic and entrepreneurial activities and produsage (Graham & Dutton, 2014; Hanna, 2016; Martin, 2017). ICTs have held great promise for the developing world, from markets and health to farming and education (Hanna, 2016; White, 2014). Other global impacts include occasions for (more or less successful) citizen participation (Castells, 2015), and broader intercultural communication (Cover, 2015).

E4. Contradictions, paradoxes, tensions, and unintended effects. Interwoven throughout discussions of the effects of digital technology is an awareness of contradictions, paradoxes, tensions, and unintended consequences. The very concept of online expression is ambivalent, indicating helpful as well as harmful intent and content (Phillips & Milner, 2018). Both positive and negative implications may be associated with a particular technology or use, often simultaneously, and paradoxically. Online feminist and queer activist communities can use the same technologies that mis-portray or exclude them as ways to construct and promote valued identities (Fotopoulou, 2017). Digital technologies are both highly useful and entertaining, but also create stress, overload, and complications (Levy, 2016); they blur boundaries between work and home, private and public (Krieger & Belliger, 2014). Online and social media communication can strengthen relationships and promote intimacy while also generating user data that are processed by other digital technologies and software around the world to group, categorize, and target audiences (Beyes, Leeker, & Schipper, 2017). For example, Turkle (2011, pp. 176, 280) identifies the following paradoxes:

• Connectivity brings us closer, but some use technology to hide from others
• In order to feel like themselves, users must be connected to their devices and others
• It is easy to find others to interact with, but also to become tired by demands to perform
• It is possible to make many new connections, but they are often tentative and temporary
• Mobile phones enable as well as inhibit separation from parents, partners, work
• Nonstop connection also means limited attention by self or others
• People can play with identity but are less free from their past
• People like online linkages and features that are based on knowledge about one’s use, but they are also concerned about the loss of privacy and are constrained by externally created online identity
• People reject real-time phone calls but get lost in real-time online gaming
• Online content one provides may be available to immediate and broad audiences, but the content is often depersonalized and abbreviated
• The ability to work from anywhere means one cannot escape work
• Users develop expectations of instant connections to and responses from others, but they themselves are then expected to always be available and to respond
• Users themselves acknowledge tensions between good and bad aspects, and often say they are resigned to this condition.

What some researchers or users may perceive as a positive aspect may be considered negative by others. For example, increased ability to participate online may take the form of reading and posting only to groups with the same interest or political position, thus limiting exposure to diverse ideas and strengthening polarization (though Webster, 2014, argues that audiences are fragmented, but also participate in various venues,

creating overlapping audiences). By accident or user intent, technology may be used in ways that designers, vendors, or implementers did not intend, expect, or imagine (Lingel, 2017). For example, González-Bailón (2017) shows how data mining and network analysis can reveal unintended consequences of individual behavior for collective outcomes. Even use of digital technology that might be critiqued as “wasting time” can provide a context for allowing thoughtfulness and creativity (Goldsmith, 2016). The same systems and features can promote support and caring (Rains, 2018) as well as international crime and terrorism (Goodman, 2015), democratization as well as authoritarianism (Berry, 2015), learning as well as fragmented attention (Bauerlein, 2011), and empowerment as well as addiction (Alter, 2017). ICTs may increase the pace of development and economic growth, but also increase inequities, work dislocation, and environmental degradation (Cubitt, 2016; Gershenfeld, Gershenfeld, & Cutcher-Gershenfeld, 2017; Hanna, 2016).

Summary

The preceding sections reviewed the main arguments and concerns about digital technology and society, organized by the emergent coding of subthemes and then the main codes of Theory and Conceptualization, Technology, Issues, Contexts, and Effects. This provides a subjective conceptual framework for identifying the most noted and discussed topics of 89 recent books.

Another way to identify general arguments and concerns is to assess how those coded themes co-occur across the material about the 89 books. Figure 1.2 shows the hierarchical clustering of the coded themes, based on the Jaccard similarity coefficients derived from the co-occurrence of the codes in each source text (provided through NVivo 11). What constitutes a cluster, or main theme, depends on the cutoff between sets of codes one wants to use. The most general distinction is among three primary clusters.

The first cluster includes Content, creation; Digitization of self & others; Ethics, ethical issues; Participation, engagement; and Manage digital experience. Given that many of the Ethics, ethical issues have to do with personal data privacy, much of the Content, creation material has to do with individual use or production of content, and Manage digital experience emphasizes how individuals can (more or less effectively) attempt to manage their own digital usage, this cluster could be considered to represent a major general theme of Individual uses.

The second cluster contains two subclusters. The first subcluster consists of Theory; Culture, everyday life, education, learning; Relationships; and User groups. This is a somewhat diverse grouping, but it seems to represent the central social and theoretical contexts of digital technology: groups, relationships, culture. The second subcluster reflects more societal issues, such as policy, societal effects, civic and public sphere behavior, inequalities, organizations and technology, and very broad issues of the nature of the developing societal changes, along with big data. Thus, the entire two-fold cluster might be labeled Societal and technological issues.

[Figure 1.2 dendrogram leaf order: C1 Content, creation; D1 Digitization of self & others; C6 Ethics, ethical issues; C4 Participation, engagement; C7 Manage digital experience; A1 Theory; D5 Culture, everyday life, education, learning; D3 Relationships; D4 User groups; D7 Law, policy, regulation; E3 Effects societal; C3 Civic issues; C5 Inclusion, exclusion, discrimination; B1 Technology venues; D6 Work and organizations; A2 Names for new social era; C2 Big data, data mining, data storage, analytics, user data; B2 Technology characteristics; D2 Health; E4 Effects contradictions, paradoxes, tensions, unintended; E1 Effects negative; E2 Effects positive]

Figure 1.2  Hierarchical clustering of main codes based on co-occurrence (correlation) of main and subcodes within each source text.

The third cluster is mostly about Effects, notably in the health arena, as well as aspects of technologies that might shape or influence those effects. Note that specific negative or positive effects are subsumed within the more complex topic of contradictory and unintended effects.
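For readers interested in the mechanics behind Figure 1.2, the sketch below illustrates the general approach: Jaccard similarities computed from code co-occurrence across sources, followed by agglomerative hierarchical clustering. The clustering reported in this chapter was produced in NVivo 11, not with this code, and the small code-by-source matrix here is a made-up example.

# A minimal sketch of the clustering idea behind Figure 1.2 (the chapter used NVivo 11,
# not this code): compute Jaccard similarities between codes from their co-occurrence
# across sources, then cluster hierarchically. The tiny binary matrix is hypothetical.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

codes = ["C1 Content", "C7 Manage experience", "D7 Law/policy", "E3 Societal effects"]
# Rows = codes, columns = sources (1 = the code was applied to that source's material).
presence = np.array([
    [1, 1, 0, 0, 1],
    [1, 1, 1, 0, 0],
    [0, 0, 1, 1, 1],
    [0, 1, 1, 1, 1],
])

# pdist with metric="jaccard" returns Jaccard distances (1 - similarity) between rows.
distances = pdist(presence.astype(bool), metric="jaccard")
tree = linkage(distances, method="average")

# Recover the dendrogram leaf order; dendrogram(tree, labels=codes) would draw the plot.
leaf_order = dendrogram(tree, labels=codes, no_plot=True)["ivl"]
print("Leaf order:", leaf_order)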

Related Work

A detailed analysis of the full content of these books would provide a comprehensive review of topics associated with digital technology and society, to say nothing of what individual articles and chapters discuss. There is, of course, a huge range of review articles, chapters, books, and handbooks on the many aspects of digital media and society. There are journals in specific disciplines that publish reviews, and there are handbooks in a wide variety of related research areas. Almost all of those, however, focus on one discipline (e.g., management, information systems, sociology), or one dimension (organizational communication, privacy, identity), or one technology (e.g., Internet, social media, videogames). Further, edited books or handbooks in these areas bring together diverse, expert authors who contribute on the topic of their own specialty, often without an underlying integrative foundation. Finally, many books on the “digital age” are more popular, applied, or oriented toward marketing, technology, management, or consulting practice.

For example, Salganik’s Bit by bit: Social research in the digital age (2017) is about the conduct and design of research in the digital environment, such as using big data, experiments, and collaborative studies. Baym’s Personal connections in the digital age (2015) emphasizes the communication discipline and relationships. Similarly, the book edited by Wright and Webb, Computer-mediated communication in personal relationships (2011), exclusively focuses on relational communication. Other books, such as Noble and Tynes’ (2016) The intersectional Internet: Race, sex, class, and culture online, do take a more interdisciplinary and multi-dimensional approach, applied to a range of digital media, platforms, and infrastructures, in more global and social contexts, but that book is primarily focused on the issue of intersectionality rather than on more general life in the digital age. Perhaps the single best overview is Mansell et al.’s (2015) International encyclopedia of digital communication and society, which covers 150 topics, ranging in length from 2,000 to 10,000 words. More relevant to this book, there are also handbooks on specific topics and media, such as The Oxford handbook of Internet studies (Dutton, 2014); Routledge handbook of Internet politics (Chadwick & Howard, 2009); Oxford handbook of Internet psychology (Joinson & McKenna, 2009); Internet studies (Consalvo & Ess, 2012), whose chapters do cover a number of similar topics, such as society and culture, but again focus on the Internet; and Economics of the Internet (Bauer & Latzer, 2016). The two-volume Handbook of research on computer-mediated communication (Kelsey & St.-Amant, 2008) also covers some similar areas, such as identity (though from a credibility perspective), community and information exchange, and culture, but also different areas, such as instruction, design, discourse, and libraries, as well as chapters on specific technology contexts. Similarly, more handbooks on the mobile phone are appearing, such as Handbook of mobile communication studies (Katz, 2008) and Research on human social interaction in the age of mobile devices (Xu, 2016), as well as handbooks for related new media, such as Handbook of digital games (Angelides & Angus, 2014) and Sage handbook of social media (Poell & Marwick, 2018). Some handbooks are focused on particular populations, such as Handbook of children and the media (Singer & Singer, 2011). Sundar’s (2015) Handbook of the psychology of communication technology covers a wider array of digital contexts than many of the other books but takes a primarily individual and group perspective (reasonable, given the title), though it also includes health issues. However, our book does not in any way overlap with the intriguing Handbook of porous media (Vafai, 2015).

The only recent book that provides a similar multi-dimensional, interdisciplinary, and thematic review of recent research on life in the digital age is Graham and Dutton’s (2014, Oxford University Press) edited book Society & the Internet: How networks of information and communication are changing our lives. That excellent book frames the work as a major foundation for the new field of Internet studies (along with Dutton’s 2014 Oxford handbook of Internet studies). Somewhat similar to the UK Economic and Social Research Council (ESRC) project theme chapters in our book (see the next section), the chapters in Graham and Dutton’s book evolved from research work and a lecture series at the Oxford Internet Institute.
Their 23-chapter book covers some of the same main areas as our book (with main sections called Internet studies of everyday life; Information and culture on the line; Networked politics and governments; Networked businesses, industries, and economics; and Technological and regulatory histories and futures). That book complements this book, but it is primarily focused on the Internet and does not have the organizing framework of the ESRC project reviews.

Purpose and Origins of This Book

Purpose and Domains

The purpose of the chapters in this handbook is to provide detailed reviews of central topics about digital technology and society within our seven domain sections. The book includes interdisciplinary, comprehensive reviews of central aspects of the current digital age. After the following chapter on project methodology, the sections move from more individual and relational domains (Section 2: Health, Age, and Home; Section 3: Communication and Relationships) to more organizational, community, and citizenship domains (Section 4: Organizational Contexts; Section 5: Communities, Identities, and Class; Section 6: Citizenship, Politics, and Participation), and then to more societal and governance domains (Section 7: Data, Representation, and Sharing; and Section 8: Governance and Accountability). The book ends with Section 9: Synthesis. The chapters within each section provide a solid foundation for understanding the current state of research and theory in each of these areas and for grounding future research, theory, and practice. They also bring to bear literature from a wide variety of disciplines, necessary for understanding the interrelationships between digital technology and society.

Origins

How did these chapters come to life? In 2016, the UK Economic and Social Research Council (http://www.esrc.ac.uk/) noted that "[t]he 21st century has witnessed significant changes due to digital technological advancements, which impacts the way we communicate, receive, consume and process information, travel, shop and do our work. The presence of digital technology mediates our perceptions, behaviours and practices across these areas and influences our ways of living, learning, sharing, engaging and seeing the world around us. This raises a number of fundamental questions about our ways of being in a digital age, the risks and opportunities associated with digital living, and our understanding of the individual, community and society [. . . .] It is apparent that there is a real need for meta-analytic work to synthesise and interpret the existing literature and data, to refine and consolidate existing understanding of the social, cultural, economic, political, psychological and other effects of digitalisation. This will enable the development of new insights, ideas and methods to be applied to a practical context. This approach will facilitate the exploitation of existing research, but also build new knowledge on synthetic work" (p. 2).

The University of Liverpool, in collaboration with a core project team and 17 other partner universities and organizations from the UK, EU, USA, and Singapore, led the UK Economic and Social Research Council (ESRC) scoping review on Ways of Being in a Digital Age (see https://waysofbeingdigital.com/ for details on the project, people, events, and reports). That scoping study developed a multi-domain, holistic view of how digital technology mediates our lives, and of the way technological and social change co-evolve and shape each other. The project involved an interdisciplinary research team across the social sciences, arts and humanities, engineering, physical sciences, and health. The final report included reviews and analyses in six domains: Communication, community, and identity; Citizens, politics, and governance; Understanding the platform economy; Data and digital literacies for engaged and included citizens; Everyday digital health and well-being; and Digital inequalities.

Conclusions and Recommendations from the ESRC Project

The final ESRC report recommended funding initiatives to emphasize these six core areas. The work should have a strongly social science focus, even where it is interdisciplinary. The topics should avoid areas that are already well researched or have been supported by recent or current funding programs. Research efforts need to look more holistically at the social, economic, political, cultural, and community impacts and roles of digital technologies. The ESRC report proposed the following six areas, each with associated research topics derived from the literature reviews and analyses, the Delphi surveys and discussions, and the stakeholder workshops and discussions.

Communication, community, and identity
• The norms and values of digital communication and relationships
• The "affordances" different platforms provide for digital communication and relationships
• The quality of relationships and communication supported by digital media and technologies
• The management of relationships via digital media and technologies
• Social and community aspects of everyday digital technology use
• Digital community exclusion/inclusion
• Digital community participation, action and social change
• Power in online communities
• Understanding global diasporas as digital communities
• Understanding the function of aspects of identity online (Gender/Race/Ethnicity/Sexuality)


Citizens, politics, and governance
• Digital technologies, radicalization, mobilization and political action
• Digital technologies and the disruption of current political institutions
• Digital technologies and new forms of citizenship
• Digital technologies, political communication, debate and media
• Digital technologies and state control, especially in non-democratic regimes
• Impact of social media on governance
• Success factors in digital governance at local, national and international level
• Privacy, citizenship, the state and surveillance in the digital age
• Regulation and governance of automated systems

Understanding the platform economy
• Role and impact of major corporate digital platforms (impacts of digital platforms on firms; role of digital monopolies and large corporations)
• Forms of digital labor (impacts of digital labor on people's life experience; the gig economy, linked to platforms)

Data and digital literacies for engaged and included citizens
• Citizen and community use of data
• Citizen interaction with data and algorithms
• Data literacy in everyday life
• Power and accountability for data and algorithms
• Social construction of data and algorithms
• Citizens' everyday life experiences and uses of data
• Understanding open data, algorithm transparency, and accountability
• Digital identity and data
• Data exclusion/inclusion/divides

Everyday digital health and well-being
• Understanding and addressing the governance of digital health technologies
• Need for detailed systematic evidence of the impact and lived experience of everyday health technologies (e.g., Fitbits)
• Questions of health and well-being in the digital workplace
• Digital technologies and health communication and health behavior change

Digital inequalities
• Digital community exclusion/inclusion
• The two-way interaction between digital inequities and other areas of social inequity
• Data exclusion/inclusion/divides
• Digital cultural capital and cultural exclusion/inclusion
• Digital governance, policy and inclusion
• Digital health inequalities

For this book, these have been reorganized into the following seven ESRC domain chapters:
• Chapter 3. ESRC Review: Health and well-being
• Chapter 8. ESRC Review: Communication and relationships
• Chapter 11. ESRC Review: Economy and organizations
• Chapter 14. ESRC Review: Communities and identities
• Chapter 16. ESRC Review: Citizenship and politics
• Chapter 18. ESRC Review: Data and representation
• Chapter 22. ESRC Review: Governance and security

Beyond the ESRC Project

As the culmination of this project, a conference to present project findings and provide a context for debate was held on October 10 and 11, 2017, at the University of Liverpool (https://waysofbeingdigital.com/conference/). To complement and critique the work of the ESRC project reviews, a call was publicized (through association email lists and websites, and the conference website) inviting others to submit extended abstracts for presentation at the conference. Each presentation was attended by at least one of the editors. After the conference, the editors went through the program and decided which of the other researchers to invite to prepare their abstracts, presentations, or papers as full reviews for the edited book. We were particularly interested in papers that built on reviews to offer analysis of research gaps and challenges for social research in the digital age. Contributions come from established, early career, and PhD scholars who systematically reviewed a research issue within one of the seven foci of the ESRC project. Two further workshops developed a closer focus on issues of work and automation. The joint UK ESRC and Defence Science and Technology Laboratory workshop held on October 7 and 8, 2016, at the University of Liverpool in London considered the topic of The Automation of Future Roles. This meeting brought together 33 academics, policy makers, and industry stakeholders to explore the likely future impact of digital tools in the workplace, in particular the possible implications of the continued "automation" of human tasks, roles, and jobs; knowledge, skills, and attributes; organizational structures, cultures, and development; workforce training, recruitment, engagement, and motivation; and decision-making in organizations. A joint UK ESRC and US National Science Foundation workshop was held on October 12 and 13, 2017, at the University of Liverpool on the topic of Changing Work, Changing Lives in the New Technological World. This brought together 35 experts from the academic and professional community, as well as top executives and program directors from the UK Economic and Social Research Council and the US National Science Foundation, to discuss shared programmatic research. In both cases the two days consisted of intensely interactive group activities, generating extensive information about issues, research programs, and timelines for possible impacts. The write-up and analysis of these insights provided the basis for Chapter 24, which synthesizes the implications of the domains for research and practice. Thus, two primary contributions of the book's review chapters are their unifying approach and review focus, as well as the diversity of the authors' expertise and disciplines. The central ESRC domain reviews are the product of extensive, multi-method, cumulative work, and provide a macro context for the associated, more focused reviews of specific areas within that section. The non-ESRC chapters home in on more specific topics within each of the domains, bringing to bear multi-disciplinary reviews and analyses. Overall, the sections and chapters provide a multi-dimensional perspective on one of the most consequential aspects of contemporary times: the relationships between digital technology and society.


References of Books for Issues and Contexts Analysis

[Numbers refer to their use in Table 6]
1. Alter, A. (2017). Irresistible: The rise of addictive technology and the business of keeping us hooked. London: Penguin.
2. Anand, B. (2016). The content trap: A strategist's guide to digital change. New York: Random House Group.

3. Anduiza, E., Perea, E. A., Jensen, M. J., & Jorba, L. (Eds.). (2012). Digital media and political engagement worldwide: A comparative study. Cambridge, UK: Cambridge University Press.
4. Athique, A. (2013). Digital media and society: An introduction. Hoboken, NJ: John Wiley & Sons.
5. Barney, D., Coleman, G., Ross, C., Sterne, J., & Tembeck, T. (Eds.) (2016). The participatory condition in the digital age. Minneapolis, MN: University of Minnesota Press.
6. Bauerlein, M. (2011). The digital divide: Arguments for and against Facebook, Google, texting, and the age of social networking. London: Penguin.
7. Bennett, L., Chin, B., & Jones, B. (Eds.). (2015). Crowdfunding the future: Media industries, ethics, and digital society (No. 98). Bern, Switzerland: Peter Lang.
8. Berger, T. (2017). @Worship: Liturgical practices in digital worlds. New York: Routledge.
9. Berry, D. M. (2015). Critical theory and the digital. New York: Bloomsbury Publishing USA.
10. Beyes, T., Leeker, M., & Schipper, I. (Eds.). (2017). Performing the digital: Performance studies and performances in digital cultures. Bielefeld, Germany: Transcript-Verlag.
11. Botto, R. & Resende, L. M. (2017). Digital transformations: Technological innovations in society in the connected future. Independently published via Amazon Digital Services.
12. Boyd, D. (2014). It's complicated: The social lives of networked teens. New Haven, CT: Yale University Press.
13. Brynjolfsson, E., & McAfee, A. (2014). The second machine age: Work, progress, and prosperity in a time of brilliant technologies. New York: WW Norton & Company.
14. Buckland, M. (2017). Information and society. Cambridge, MA: MIT Press.
15. Bunz, M., & Meikle, G. (2017). The Internet of things. Hoboken, NJ: John Wiley & Sons.
16. Carr, N. (2011). The shallows: What the Internet is doing to our brains. New York: WW Norton & Company.
17. Castells, M. (2015). Networks of outrage and hope: Social movements in the Internet age. Hoboken, NJ: John Wiley & Sons.
18. Chayko, M. (2017). Superconnected: The internet, digital media, and techno-social life. Thousand Oaks, CA: Sage Publications.
19. Cheney-Lippold, J. (2017). We are data: Algorithms and the making of our digital selves. New York: NYU Press.
20. Chun, W. H. K. (2017). Updating to remain the same: Habitual new media. Cambridge, MA: MIT Press.
21. Couldry, N. (2012). Media, society, world: Social theory and digital media practice. Cambridge, UK: Polity.
22. Cover, R. (2015). Digital identities: Creating and communicating the online self. Cambridge, MA: Academic Press.
23. Cubitt, S. (2016). Finite media: Environmental implications of digital technologies. Durham, NC: Duke University Press.
24. Daley, B. (2015). Where data is wealth: Profiting from data storage in a digital society. Stoke-on-Trent, UK: Play Technologies.
25. Dey, A. (2018). Nirbhaya, new media and digital gender activism. Bingley, UK: Emerald Group Publishing Ltd.
26. Dourish, P., & Bell, G. (2011). Divining a digital future: Mess and mythology in ubiquitous computing. Cambridge, MA: The MIT Press.
27. Ellcessor, E. (2016). Restricted access: Media, disability, and the politics of participation. New York: NYU Press.
28. Elliott, D., & Spence, E. H. (2017). Ethics for a digital era. Hoboken, NJ: John Wiley & Sons.

29. Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. New York: St. Martin's Press.
30. Ford, M. (2015). Rise of the robots: Technology and the threat of a jobless future. New York: Basic Books.
31. Fotopoulou, A. (2017). Feminist activism and digital networks: Between empowerment and vulnerability. New York: Springer.
32. Gershenfeld, N., Gershenfeld, A., & Cutcher-Gershenfeld, J. (2017). Designing reality: How to survive and thrive in the third digital revolution. New York: Basic Books.
33. Gillespie, T., Boczkowski, P. J., & Foot, K. A. (Eds.). (2014). Media technologies: Essays on communication, materiality, and society. Cambridge, MA: The MIT Press.
34. Goldsmith, K. (2016). Wasting time on the Internet. New York: Harper Perennial.
35. González-Bailón, S. (2017). Decoding the social world: Data science and the unintended consequences of communication. Cambridge, MA: The MIT Press.
36. Goodman, M. (2015). Future crimes: Inside the digital underground and the battle for our connected world. New York: Random House.
37. Gordon, E., & Mihailidis, P. (Eds.). (2016). Civic media: Technology, design, practice. Cambridge, MA: MIT Press.
38. Graham, M. & Dutton, W. H. (Eds.) (2014). Society & the Internet: How networks of information and communication are changing our lives. Oxford, UK: Oxford University Press.
39. Graham, R. (2014). The digital practices of African Americans: An approach to studying cultural change in the information society. Bern, Switzerland: Peter Lang.
40. Gronlund, M. (2016). Contemporary art and digital culture. New York: Routledge.
41. Hanna, N. K. (Ed.). (2016). Mastering digital transformation: Towards a smarter society, economy, city and nation. Bingley, UK: Emerald Group Publishing Limited.
42. Hu, T. H. (2015). A prehistory of the cloud. Cambridge, MA: MIT Press.
43. Humphreys, L. (2018). The qualified self: Social media and the accounting of everyday life. Cambridge, MA: MIT Press.
44. Ito, M., Baumer, S., Bittanti, M., boyd, d., Cody, R., Stephenson, B. H., Horst, H. A., . . . & Tripp, L. (2009). Hanging out, messing around, and geeking out: Kids living and learning with new media. Cambridge, MA: MIT Press.
45. James, C. (2014). Disconnected: Youth, new media, and the ethics gap. Cambridge, MA: MIT Press.
46. Johnson, C. A. (2015). The information diet: A case for conscious consumption. Sebastopol, CA: O'Reilly Media, Inc.
47. Kember, S., & Zylinska, J. (2012). Life after new media: Mediation as a vital process. Cambridge, MA: MIT Press.
48. Krieger, D. J., & Belliger, A. (2014). Interpreting networks: Hermeneutics, actor-network theory & new media (Vol. 4). Bielefeld, Germany: Transcript-Verlag.
49. Kvedar, J. C., Colman, C., & Cella, G. (2017). The new mobile age: How technology will extend the healthspan and optimize the lifespan. Amazon Digital Services.
50. Levy, D. M. (2016). Mindful tech: How to bring balance to our digital lives. New Haven, CT: Yale University Press.
51. Lindgren, S. (2017). Digital media and society. Thousand Oaks, CA: Sage.
52. Lingel, J. (2017). Digital countercultures and the struggle for community. Cambridge, MA: MIT Press.
53. Livingstone, S. (2009). Children and the Internet. Cambridge, UK: Polity.

54. Livingstone, S., & Sefton-Green, J. (2016). The class: Living and learning in the digital age. New York: NYU Press.
55. Lupton, D. (2016). The quantified self. Hoboken, NJ: John Wiley & Sons.
56. Lynch, M. P. (2016). The internet of us: Knowing more and understanding less in the age of big data. New York: WW Norton & Company.
57. Mansell, R., Ang, P. H., Steinfield, C., van der Graaf, S., Ballon, P., Kerr, A., . . . Grimshaw, D. J. (Eds.). (2015). The International encyclopedia of digital communication and society (3 volume set). Hoboken, NJ: Wiley Blackwell.
58. Martin, W. J. (2017). The global information society. New York: Routledge.
59. Mosco, V. (2017). Becoming digital: Toward a post-Internet society. Bingley, UK: Emerald Publishing Limited.
60. Mueller, M. L. (2010). Networks and states: The global politics of Internet governance. Cambridge, MA: MIT Press.
61. Nafus, D. (Ed.). (2016). Quantified: Biosensing technologies in everyday life. Cambridge, MA: MIT Press.
62. Napoli, P. M. (2011). Audience evolution: New technologies and the transformation of media audiences. New York: Columbia University Press.
63. Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York: NYU Press.
64. Palfrey, J., & Gasser, U. (2016). Born digital: How children grow up in a digital age. New York: Basic Books.
65. Penney, J. (2017). The citizen marketer: Promoting political opinion in the social media age. Oxford, UK: Oxford University Press.
66. Phillips, W., & Milner, R. M. (2018). The ambivalent Internet: Mischief, oddity, and antagonism online. Hoboken, NJ: John Wiley & Sons.
67. Pschera, A. & Lauffer, E. (translator) (2016). Animal Internet: Nature and the digital revolution. New York: New Vessel Press.
68. Rains, S. A. (2018). Coping with illness digitally. Cambridge, MA: MIT Press.
69. Reed, T. V. (2014). Digitized lives: Culture, power, and social change in the Internet era. New York: Routledge.
70. Rheingold, H. (2012). Net smart: How to thrive online. Cambridge, MA: The MIT Press.
71. Rossignoli, C., Virili, F., & Za, S. (Eds.). (2017). Digital technology and organizational change: Reshaping technology, people, and organizations towards a global society. New York: Springer.
72. Rowan-Kenyon, H. T., Alemán, A. M. M., & Savitz-Romer, M. (2018). Technology and engagement: Making technology work for first generation college students. New Brunswick, NJ: Rutgers University Press.
73. Rudder, C. (2014). Dataclysm: Love, sex, race, and identity—What our online lives tell us about our offline selves. New York: Crown.
74. Scheff, S., & Schorr, M. (2017). Shame nation: The global epidemic of online hate. Naperville, IL: Sourcebooks, Inc.
75. Schneier, B. (2015). Data and Goliath: The hidden battles to collect your data and control your world. New York: WW Norton & Company.
76. Scholz, T. (Ed.). (2012). Digital labor: The Internet as playground and factory. New York: Routledge.
77. Schwab, K. (2017). The fourth industrial revolution. New York: Crown Business.
78. Shifman, L. (2014). Memes in digital culture. Cambridge, MA: MIT Press.

79. Sonnier, P. (2017). The fourth wave: Digital health. https://storyofdigitalhealth.com/fourth-wave-book
80. Steiner-Adair, C. & Barker, T. H. (2013). The big disconnect: Protecting childhood and family relationships in the digital age. New York: Harper Business.
81. Tapscott, D. & Tapscott, A. (2018). Blockchain revolution: How the technology behind Bitcoin and other cryptocurrencies is changing the world. New York: Portfolio-Penguin.
82. Turkle, S. (2011). Alone together: Why we expect more from technology and less from each other. New York: Basic Books.
83. Turow, J. (2012). The daily you: How the new advertising industry is defining your identity and your worth. New Haven, CT: Yale University Press.
84. Turow, J. (2017). The aisles have eyes: How retailers track your shopping, strip your privacy, and define your power. New Haven, CT: Yale University Press.
85. Van Dijck, J. (2013). The culture of connectivity: A critical history of social media. Oxford, UK: Oxford University Press.
86. Webster, J. G. (2014). The marketplace of attention: How audiences take shape in a digital age. Cambridge, MA: The MIT Press.
87. White, A. (2014). Digital media and society: Transforming economics, politics and social practices. New York: Springer.
88. Wiesinger, S., & Beliveau, R. (2016). Digital literacy: A primer on media, identity, and the evolution of technology. Bern, Switzerland: Peter Lang.
89. Wu, T. (2017). The attention merchants: The epic scramble to get inside our heads. New York: Vintage.

Chapter 2

ESRC Review: Methodology

Simeon J. Yates, Iona C. Hine, Michael Pidd, Jerome Fuselier, and Paul Watry

Introduction

As noted in chapter 1 of this book, many of the chapters are developed from the findings of the "Ways of Being in a Digital Age" project commissioned by the UK Economic and Social Research Council (ESRC). This scoping review project was commissioned to provide a more holistic view of research on how digital technology mediates our lives, and of the ways technological and social change co-evolve and impact each other. A core goal of the project was to undertake a systematic literature review and synthesis of expert opinions so as to identify gaps in current research. This chapter sets out how the data and results presented in the ESRC overview chapters of this book were generated (chapters 3, 8, 11, 14, 16, 18, and 22). The methods used were in part defined by the nature of the challenge: as the project was commissioned to complete the review in just under 12 months, a considerable part of the project had to be automated in some manner. This provided the opportunity to examine a range of "digital" methods by which a large body of literature, data, and evidence could be summarized.

Participants

Project Team

The core project team consisted of staff who, at the time, were based at the University of Liverpool, University of Sheffield, and University of Newcastle. A broader group of UK co-investigators and non-UK advisors from 16 universities across the UK, EU, USA, and Singapore also supported the project. This provided expertise across a range of social science, arts, engineering, and science backgrounds (see Table 2.1). Overall these colleagues predominantly provided input to the Delphi elements of the project, workshops, and conferences. The key contribution from all these colleagues was the provision of initial inclusion criteria, key words, and key citations for the systematic reviews. The main technical partner for this project was the Digital Humanities Institute (DHI) at the University of Sheffield. In this project the DHI provided the technical and analytical skills to undertake the concept-modelling work needed to explore the full range of literature covered by the review chapters. The work of the DHI was complemented by University of Liverpool researchers using related methods.

Table 2.1  Steering Group

Group member | Role | Group | Institution | Discipline | Research
Simeon Yates | PI, Core | SG | University of Liverpool | Social science | Digital culture
Michael Pidd | Co-I, Core | DH | University of Sheffield | History | Digital humanities
Adam Joinson | Co-I | SG | University of Bath | Psychology | Computer-mediated communication
Ann Light | Co-I | SG | University of Sussex | HCI and design | Human computer interaction and design
Simon Maskell | Co-I | SG | University of Liverpool | Computer science | Data analytics
Claire Taylor | Co-I | SG | University of Liverpool | Modern languages | Digital culture and community
Leanne Townsend | Co-I | SG | University of Aberdeen | Sociology | Communities and digital
Vishanth Weerakkody | Co-I | SG | Brunel University | Information studies | e-Government
Bridgette Wessels | Co-I, Core | SG | University of Newcastle | Sociology | Internet studies
Monica Whitty | Co-I | SG | University of Leicester | Psychology | Identity and security online
Naomi Baron | | SG | American University, Washington, D.C. | Linguistics | Computer-mediated communication
Catherine Brookes | | SG | University of Arizona | Information studies | Identity online
William Dutton | | SG | Michigan State University | Communication studies | Internet studies
Alex Frame | | SG | University of Bourgogne, Dijon | Linguistics | Digital media and politics
Ellen Helsper | | SG | London School of Economics | Communication studies | Digital inclusion
Rich Ling | | SG | Nanyang Technological University, Singapore | Sociology | Media technology
Alison Preston | | SG | Ofcom | Media policy | Head of media literacy research
Ronald E. Rice | | SG | University of California, Santa Barbara | Communication | New media, diffusion
Laura Robinson | | SG | Santa Clara University / University of California, Berkeley | Sociology | Digital exclusion
Alison Vincent | | SG | Cisco | CDI sector | Chief technology officer for Cisco
Paul Watry | | DH | University of Liverpool | School of Histories, Languages, and Cultures | Digital humanities

Note: SG = steering group; DH = digital humanities group.

Stakeholder Engagement

In order for both academics and potential stakeholders to have an opportunity to inform the review within the short time frame, the team made use of key networks of which they were already members. As will be detailed in the section on the Delphi methods, these existing networks were key to the initial data collection. To connect with non-academics, the project worked with the Digital Leaders network as a route to engage private sector, public sector, and third sector partners (http://digileaders.com). Established by Martha Lane Fox, the Digital Leaders network provides access to around 40,000 corporate, small and medium-sized enterprise (SME), national government, local government, academic, and charity staff and organizations. The project lead (Yates) ran the Digital Leaders Research theme. Yates and Helsper are also members of the Department for Digital, Culture, Media and Sport's (DCMS) Digital Skills and Inclusion Research Working Group, which undertakes reviews of UK digital engagement strategy and policy research. All of the UK Steering Group members have been members of relevant ESRC, Arts and Humanities Research Council, or Engineering and Physical Sciences Research Council networks or funded programs, or have had senior roles in UK Research and Innovation policy and practice in regard to digital research. Other networks that the overall team are part of include the Communities and Cultures Network; New Social Media, New Social Science Network; European Sociological Association (Communities and Digital Cultures groups); International Communication Association (Communication and Technology Section); US Partnership for Progress on the Digital Divide; Digital Latin American Cultures Network; Centre for Research and Evidence on Security Threats; the Cabinet Office Behavioural Science Expert Group; ESRC Emoticon Network; EU COST Action on Social Media and Social Networks; ECREA Material Digital Cultures group; British Sociological Association Digital Sociology group; the EU e-forum; and the MeCCSA Policy Group.

Initial Outline for the Scoping Areas

Domains and Goals

The ESRC commission identified a number of potential questions for future research work. The scoping review took these as a starting point that could be added to, developed, and validated. The team separated these into seven major foci for the review (see Table 2.2). We have called these seven foci "domains." The goal of the review was to assess the following for each domain:

Table 2.2  Initial Scoping Questions

1. Citizenship and politics: How does digital technology impact our autonomy, agency, and privacy, illustrated by the paradox of emancipation and control? Is our understanding of citizenship evolving in the digital age; for example, does technology help or hinder us in participating at individual and community levels? If so, how?
2. Communities and identities: How do we define and authenticate ourselves in a digital age? What new forms of communities and work emerge as a result of digital technologies, for example, new forms of coordination including large-scale and remote collaboration?
3. Communication and relationships: How are our relationships being shaped and sustained in and between various domains, including family and work?
4. Health and well-being: Does technology make us healthier, better educated, and more productive?
5. Economy and sustainability: How do we construct the digital to be open to all, sustainable, and secure? What impacts might the automation of the future workforce bring?
6. Data and representation: How do we live with and trust the algorithms and data analysis used to shape key features of our lives?
7. Governance and security: What are the challenges of ethics, trust, and consent in the digital age? How do we define responsibility and accountability in the digital age?

• What existing literature addressed these domains and what central topics emerged from them
• How the reported research addressed these domains
• What experts viewed as the gaps in understanding in regard to these domains
• Some suggestions of future research directions and challenges for each of these domains

The reviews also sought to describe and assess the use of theory and of methods in each of the domains.

Use of Theory

This element of the analysis considered how theories are used, both deductively to set up empirical work and/or to provide explanation and conclusions from inductive work. Some key questions around theory included: How is the digital socially and technically conceptualized? Which theories are predominant in which domains? What new theory has been developed, and/or is "old theory" adequate to the task of explaining the social impacts and use of the digital? To what extent is digital research theoretically or empirically driven? Which concepts and key themes cluster and link regardless of theoretical or empirical approach? Can a new "theoretical framework" for understanding the digital be generated, and is this needed? To what extent have interdisciplinary approaches modified or developed theory?

Use of Methods

This element emphasized the range of methods, types of data, and research contexts in the examined literature. Some key questions that were addressed include: Which methods predominate in which domains of work? Does the availability of large volumes of digital data change how the digital is studied and/or the approaches taken to the social in a digital world? Are certain methods intrinsically linked to certain domains or theories? How are methods tied to the social contexts around digital research? Have interdisciplinary approaches modified or prioritized certain methods in the study of the digital?

Approaches for the Review

The project explored these questions for each domain through both established and new digital approaches to systematic reviewing and expert opinion elicitation:

• Delphi reviews of expert opinion for each domain
• Stakeholder engagement
• Digital examination of and systematic review of a citation-led sampling of the literature

Delphi Process

As a starting point the project undertook seven sets of Delphi process interviews (Linstone & Turoff, 1975). An eighth set, run with non-academic stakeholders, was undertaken via a series of workshops and "salon events." Round one of the Delphi process was undertaken with the project Steering Group. The results from this were used to develop a snowball sample of additional domain experts. Round two was undertaken with this identified sample. Round three consisted of a confirmatory survey of international scholars and a consultation workshop with the UK Steering Group and a set of invited UK academics.

Delphi methods have a long history going back to the 1950s and were initially designed as a method for forecasting or predicting outcomes in complex situations. More recently, the methods have been employed as a set of tools for systematic knowledge elicitation in complex domains. The Delphi method in most cases is a structured, iterative communication technique by which a panel of experts provide evidence and then review this evidence, looking to move to a broad consensus position (Figure 2.1). Delphi methods are based on the principle that knowledge, decisions, or forecasts from a selected group of individuals (experts) who iteratively review information are more accurate than those from unstructured groups. In a standard Delphi process the selected experts answer questionnaires or semi-structured surveys in two or more rounds. The results of each round are summarized by the team managing the process and provided back as an anonymized summary to the expert panel. The panel is then given the opportunity to revise their answers in light of these summaries, with the goal of reaching either an overall consensus, or a statistically acceptable mean or average where numeric predictions are being sought.

[Figure 2.1  Delphi process. Stages: definition of problem; selection of experts (selection methods, level of experience, panel size, groupings); questionnaire development (type and focus of questions, measurement and scoring); questionnaire administration; questionnaire analysis (coding, relevant statistical and qualitative methods); repeated up to three times, leading to consensus outcomes. From Sanchez (1999).]

We modified our Delphi process to incorporate the outcomes of the literature review work. Rounds one and two helped to provide the basis of both the literature work and potential research gaps. Round one was conducted with the project Steering Group (see Table 2.1). This included the opportunity for the team to identify key scholars in the field for round two of the Delphi process, as well as starting points for the literature review. Delphi reviews are often undertaken anonymously, in that the experts do not know who the other contributors are, and are also conducted remotely. In our case round one was conducted with the Steering Group, so this was not anonymous. Round two was undertaken anonymously among the experts identified in round one. Invitations to contribute were sent out by email, and the data were collected via an online survey tool. Experts were invited to contribute answers around one of the seven domains relevant to their backgrounds as identified by the core team. The team only changed this allocation when experts notably self-identified with a different domain to that which they had been allocated. Round three consisted of a consultation workshop based on the Delphi and literature review findings and a confirmatory survey. The survey was sent to all prior participants, asking them to rank the identified topics and challenges in terms of importance for future research.

The Delphi process therefore identified four sets of data for each domain:
1) Initial scoping questions for future programs of research
2) Key authors and key literature for each domain
3) Key topics to be addressed within these programs of work
4) Key challenges when undertaking these programs of research

Initial scoping questions for future programs of research. Though Table 2.2 details the initial scoping questions set by the ESRC, we utilized rounds one and two of the Delphi process to validate and expand these as required. In all cases, this led to the initial scoping questions being modified. The specific changes to each of these domains are detailed in the ESRC review chapters (Chapters 3, 8, 11, 14, 16, 18, and 22). In some cases, this involved expansion of the questions (e.g., in the Communication and Relationships domain), or focusing on specific interpretations of terms (e.g., in the Health and Well-being domain). Delphi respondents were asked to use their interpretation of the domain scoping questions as the basis for their answers to the Delphi survey.

Key authors and key literature for each domain. The experts were asked for key authors, key items of literature, and key search terms (derived from their scoping questions) for the collection of the domain literature.

This information was then used to systematically collect literature from key databases (Web of Science/ISI Web of Knowledge; Social Sciences Citation Index; Google Scholar).

Key topics to be addressed within these programs of work. The survey first explored which "topics" the experts believed needed further research within the domain. These might be areas where there is a lack of research, where further research was needed, or where specific questions needed to be unpacked further. The responses provided the basis for assessing future research areas in the domain. They could also be matched against and compared with the concepts and topics identified in the literature review.

Key challenges when undertaking these programs of research. The survey next asked experts to highlight the methodological, practical, and other challenges that might be faced when attempting to address the topic areas they had identified. These might be existing challenges for relevant research but also new ones due to the digital context of the research. The methodological challenges could be compared to and contrasted with the methods and approaches identified in the literature. One of the key features of the Delphi process results was the commonality of responses to the "challenges" questions across all seven domains. We have therefore reported these cross-cutting challenges as a separate chapter (Chapter 25) and sought to identify specific challenges when reporting on each domain.
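The round-by-round aggregation this process relies on can be illustrated with a minimal sketch. The following Python fragment is purely illustrative and is not the project's survey instrument; the example item labels, the 1–5 rating scale, and the interquartile-range consensus rule are assumptions made for the sake of illustration.

```python
# Illustrative only: summarize one Delphi round of expert ratings per item and
# flag items where the panel appears to have converged (small interquartile range),
# so an anonymized summary can be fed back to the panel for the next round.
import statistics

def summarize_round(ratings_by_item, iqr_threshold=1.0):
    """ratings_by_item maps an item label to a list of expert ratings (e.g., 1-5)."""
    summary = {}
    for item, ratings in ratings_by_item.items():
        q1, _, q3 = statistics.quantiles(sorted(ratings), n=4)
        summary[item] = {
            "median": statistics.median(ratings),
            "iqr": q3 - q1,
            "consensus": (q3 - q1) <= iqr_threshold,
        }
    return summary

# Hypothetical round-two ratings for two candidate research topics.
round_two = {
    "data literacy in everyday life": [5, 4, 5, 4, 5, 4],
    "regulation of automated systems": [2, 5, 3, 4, 1, 5],
}
print(summarize_round(round_two))
```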

Stakeholder Engagement: Workshops

The project organized a range of facilitated workshops to engage academic and non-academic stakeholder partners. The main one of these was a final consultation workshop to review the outcomes of the Delphi process. This was attended by the majority of the UK members of the Steering Group, as well as UK colleagues identified in the Delphi process, via the literature review, and through the other workshops. Two additional workshops explored the impacts of automation on work and society and were supported by the ESRC, the UK Defence Science and Technology Laboratory (DSTL), and the US National Science Foundation (NSF). A total of six workshop programs contributed to the project:
1) Salon events in collaboration with Digital Leaders (www.digileaders.com). Salon events involved short presentations to develop discussion, followed by open discussions under the Chatham House Rule (https://www.chathamhouse.org/chatham-house-rule) among academic, industry, and policy partners. Salon events were led by academics based on the domains, and the team attended industry-led salon events. These allowed for non-academic input to the Delphi process.
2) A joint ESRC- and DSTL-funded facilitated workshop to explore research topics around the social impacts of automation and augmentation in the workplace.
3) A further joint ESRC and NSF workshop on "Work at the Human Technology Interface."
4) A joint MeCCSA- and project-supported workshop on "digital policy," which examined the policy and policy-making issues arising from digital media.
5) A project and UK Department for Digital, Culture, Media and Sport workshop to explore the impacts of digital on the arts and cultural sector.
6) An academic symposium, conducted by the project just prior to the ESRC and NSF workshop, discussing the results from the project and seeking further invited review papers, some of which are in this volume.

Systematic Literature Reviews

Approach. As noted above, the Delphi process provided the literature survey with three initial starting points for the literature review:
• Key authors
• Key words (from the scoping questions)
• Key literature (as the starting point for citation searches)

The collection of literature was undertaken twice, following rounds one and two of the Delphi process. This produced two overlapping sets of key literature that were combined for the final analytical work and content analysis. Given the volume of published work within the seven domains, undertaking a meta-analysis to synthesise the quantitative results of available empirical studies (Blundell, 2013) was not possible. Nor, as the ESRC Review chapters (Chapters 3, 8, 11, 14, 16, 18, and 22) point out, were there enough empirical studies of similar design and focus (either deductive or inductive) to undertake such a process. Rather, to address the challenges of dealing with such a large body of data, a partly automated systematic narrative review (Popay et al., 2006) was undertaken with the goal of synthesizing primary studies and descriptively exploring the heterogeneity of work. This hopefully provides the basis for the targeted systematic literature reviews for hypothesis generation (Petticrew & Roberts, 2008) likely to be undertaken by future studies.

A key element defining the approach was the need to address the large volume of work in each domain within the limited timescale of a few months. The project had an overall database of just under 6,000 potential target publications from key authors and flowing from citations of key papers, identified by the two rounds of the Delphi work. The databases searched to collect this material were the ISI Web of Science (http://webofknowledge.com/), the International Bibliography of the Social Sciences, and Google Scholar. Google Scholar produced results outside the date range and some non-academic "grey literature." Other bibliographic studies, especially social studies of science undertaking citation analyses, have been able to gain formally agreed commercial access to publisher APIs (application programming interfaces) so as to be able to "scrape" searched-for items; this was not feasible within the budget and timescale of the project. As a result, the majority of papers were downloaded manually or by utilizing tools with limits on downloads. Of the initial 6,000 target cases, once non-academic "grey literature," papers published outside the main sampling frame date range (2000–2016), and items not available in digital format were removed, this left 3,971 publications included in the analysis. We estimated that to systematically read, review, and code these by hand for all of the various aspects discussed in the review chapters and the analyses below (that is, coding for all overlapping concepts and topics, theory, methods, and analytical approach) would have taken a minimum of 12,000 person-hours, or around seven person-years of work. This challenge is not unique to contemporary research in any one academic field, and it reflects a growing problem for academic work, as Petticrew and Roberts note:

The problem is not just one of inconsistency, but one of information overload. The past 20 years have seen an explosion in the amount of research information available to decision makers and social researchers alike. With new journals launched yearly, and thousands of research papers published, it is impossible for even the most energetic policymaker or researcher to keep up-to-date with the most recent research evidence, unless they are interested in a very narrow field indeed. (Petticrew & Roberts, 2008, p. 7)
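As a concrete, hedged illustration of the screening step described above (before the quotation), the following sketch assumes a simple tabular export of the candidate records with hypothetical column names ("year", "is_grey_literature", "has_digital_fulltext", "title"); it is not the project's actual database schema.

```python
# Illustrative screening of candidate records down to the analysis corpus:
# keep items inside the 2000-2016 sampling frame, drop grey literature, and
# drop items with no digital full text. All column names are assumptions.
import pandas as pd

def screen_corpus(records: pd.DataFrame) -> pd.DataFrame:
    keep = (
        records["year"].between(2000, 2016)
        & ~records["is_grey_literature"]
        & records["has_digital_fulltext"]
    )
    return records.loc[keep].drop_duplicates(subset=["title", "year"])

# e.g., screened = screen_corpus(pd.read_csv("candidate_records.csv"))
```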

We would also argue that this information-overload issue is compounded for inherently interdisciplinary work, such as the study of the social impacts of digital media and technologies. Relevant papers on a question such as the role of digital media in interpersonal interaction may be found in psychology, sociology, linguistics, computer science, information studies, philosophy, and health care publications. To solve this dilemma, we looked to digital technology solutions. The survey was therefore in part an "experimental" consideration of the use of digital tools developed in the digital humanities, social sciences, and linguistics to analyze large bodies of text. These tools supported the core team in undertaking the review through linguistic, content, and reflective methods. Similar methods, both automated and non-automated, have been applied to the contents of this volume in Chapter 25.

More specifically, undertaking a systematic review in the social sciences involves a number of challenges that are less of a concern in other science contexts. First, a large proportion of the work will be case study based. Second, a considerable amount of work will be in long form (books and edited collections) and will contain mostly narrative and theoretical content. Third, other work, mainly in journals, will be predominantly empirical. We therefore needed:
• An overall content analysis across theory, method, research topic, and context
• A predominantly narrative systematic review across the material to address descriptive, case study, theoretical, qualitative, and quantitative publications

Digital tools. As a first step, the literature was analyzed using linguistic, text-mining, and computational tools to identify predominant topics and concepts within each domain, involving three approaches: concept-modelling and two kinds of topic analysis. Then a traditional manual content analysis was applied to assess theory and methods.

Concept-modelling (Linguistic DNA). First, literature identified after round one of the Delphi process was subjected to a lengthy and detailed concept analysis. Concept-modelling procedures, developed at the Digital Humanities Institute at the University of Sheffield, in association with the University of Sheffield's School of English, analyzed patterns within the literature to identify recurrent associations and themes. The procedures output groups of words (or, more specifically, lemmas) representing dominant associations within each given dataset. For the current project, groups were limited to pairs accompanied by a non-ranked list of further associates, i.e., words repeatedly located alongside those pairs (Fitzmaurice et al., 2017a, 2017b; also http://linguisticdna.org). This process is underpinned by the notion of a discursive concept, as theorized by Fitzmaurice et al. (2017a, 2017b). Though sometimes referenced by a single word, the discursive concept cannot be reduced to that word but is a complex meaning with wider inference. This inference can be detected by the other language that surrounds the word. For example, when an author uses the word "society" (or "societies") we can determine the inferred conceptual characteristics by identifying other words found repeatedly in proximity (e.g., within the same paragraph), and modelling such patterns of proximate words in the given text and in other texts. Importantly, concept-modelling enables us to detect how ideas, theories, and methods emerge and evolve within discourse, by detecting changes in proximate words across texts and across time. In the initial sample of documents supplied to the DHI team, "business" was discovered to be strongly linked to "competence" (and vice versa), "consumer" with "self-service," and "knowledge" with "seeker." Table 2.3 shows these concept-pairs with the first 20 associated words arranged alphabetically (ranking of the associated words represents a further analytical step). This type of concept-modelling is distinct from topic modelling, in that it focuses on sections of discourse that are shorter than a text, with a goal of extracting conceptual structure and tracing patterns and change in language and thought. In practice, the process begins by identifying parts of speech and lemmas (in this case we used Helmut Schmid's TreeTagger). The machine-readable texts then pass through a further algorithm pipeline that uses frequency and location to identify prominent lemma pairs, applying a refined statistical probability calculation based on Pointwise Mutual Information. The core process can be repeated to identify other lemmas co-occurring repeatedly with each pair, which have here been termed "associates." These lexical relationships can then be visualized via clustering algorithms and network diagrams. Concept-modelling is considered more nuanced than topic modelling, because it pays attention to the relative location of words. The end result is a "concept model" that enables users to explore how ideas, theories, and methods relating to ways of being in the digital age emerge and evolve across the literature. As a tool, it provides a data-driven map for identifying trends and anomalies that might warrant further study.
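The Linguistic DNA pipeline itself is not reproduced here, but the core statistic it relies on, scoring lemma pairs by pointwise mutual information over paragraph-level co-occurrence and then collecting "associates", can be sketched as follows. This is a simplified illustration, assuming the texts have already been lemmatized (e.g., with TreeTagger); it is not the project's or the DHI's code.

```python
# Sketch: score lemma pairs by pointwise mutual information (PMI) over
# paragraph co-occurrence, then list "associates" of a chosen pair.
# Quadratic in per-paragraph vocabulary; unoptimized, for illustration only.
import math
from collections import Counter
from itertools import combinations

def pmi_pairs(paragraphs, min_count=5):
    """paragraphs: list of lists of lemmas (one list per paragraph)."""
    n_para = len(paragraphs)
    word_df = Counter()   # number of paragraphs containing each lemma
    pair_df = Counter()   # number of paragraphs containing each lemma pair
    for para in paragraphs:
        lemmas = set(para)
        word_df.update(lemmas)
        pair_df.update(combinations(sorted(lemmas), 2))
    scores = {}
    for (w1, w2), c in pair_df.items():
        if c < min_count:
            continue
        p_pair = c / n_para
        p1, p2 = word_df[w1] / n_para, word_df[w2] / n_para
        scores[(w1, w2)] = math.log2(p_pair / (p1 * p2))
    return scores

def associates(paragraphs, pair, top_n=20):
    """Lemmas that most often occur in paragraphs containing both members of the pair."""
    w1, w2 = pair
    counts = Counter()
    for para in paragraphs:
        lemmas = set(para)
        if w1 in lemmas and w2 in lemmas:
            counts.update(lemmas - {w1, w2})
    return [w for w, _ in counts.most_common(top_n)]
```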
Concept-modelling outputs were presented via a range of visualizations, including bubble maps (Figure 2.2) and tree maps (Figure 2.3), with specific versions in each of the ESRC Review chapters.

Table 2.3  Example Concept-Mapping by the Digital Humanities Institute at the University of Sheffield

business, competence: administration, area, awareness, breadth, capability, category, client, collaboration, competency, component, concept, construct, contribution, core, creation, definition, deployment, depth, development, dimension . . .

consumer, self-service: academy, addition, adoption, amount, anxiety, attitude, attribute, banking, behavior, characteristic, checkout, comparison, control, customer, customization, delay, delivery, determinant, difference, ease . . .

knowledge, seeker: ability, action, ambiguity, anticipation, average, awareness, beginning, bit, capacity, caution, choice, colleague, complexity, condition, conjunction, correlation, cross, decision, delay, description . . .

Note: Only the first 20 associated alphabetic terms are listed under the three example concepts.

An interactive browser-friendly version and further documentation may be found online at https://dhi.ac.uk/waysofbeingdigital/.

Topic analysis. As a follow-up to this analysis, the team applied two further digital methods to the full data set of literature gathered after round two of the Delphi process. The first involved the application of project-specific tools developed using Python to extract topics through the statistical analysis of word frequency within individual documents. This work was undertaken by a team at the University of Liverpool following methods outlined by Sievert and Shirley (2014) and Chuang et al. (2012). This produced interactive maps of topics (Figure 2.4) and key words (Figure 2.5).
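The Liverpool tools themselves are not reproduced here. As an indication of the general approach, the sketch below (an illustration under stated assumptions, not the project's code) fits a standard LDA topic model with scikit-learn and re-ranks each topic's terms using the relevance measure of Sievert and Shirley (2014).

```python
# Sketch: frequency-based topic extraction over a document collection, with each
# topic's terms re-ranked by relevance(w|t) = lam*log p(w|t) + (1-lam)*log(p(w|t)/p(w)).
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def topic_terms(docs, n_topics=20, n_terms=30, lam=0.6):
    vec = CountVectorizer(stop_words="english", min_df=5)
    X = vec.fit_transform(docs)                                   # documents x terms
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0).fit(X)
    vocab = np.array(vec.get_feature_names_out())
    topic_word = lda.components_ / lda.components_.sum(axis=1, keepdims=True)  # p(w|t)
    p_w = np.asarray(X.sum(axis=0)).ravel() / X.sum()                          # p(w)
    topics = []
    for t in range(n_topics):
        relevance = lam * np.log(topic_word[t]) + (1 - lam) * np.log(topic_word[t] / p_w)
        topics.append(vocab[np.argsort(relevance)[::-1][:n_terms]].tolist())
    return topics
```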

[Figure 2.2  Bubble map of concept pairs identified in the literature, 2000–2017.]

Second, the same data were examined utilizing the commercial WordStat tool (https://provalisresearch.com). The data were examined in WordStat at the paragraph rather than the document level. WordStat split the papers into paragraph segments and then constructed a word-by-segment frequency matrix. This matrix was subjected to an exploratory factor analysis using a Varimax rotation, from which a set of "factors" was extracted; these should map onto consistent topics in the data. All words with a loading higher than a target criterion (we used 0.3) were then defined as belonging to an extracted topic (Figure 2.6). Using this tool produced similar results to those from the University of Liverpool topic analysis. Combining these three sets of results allowed the team to develop a thematic meta-analysis of the overall themes and issues in the literature (Petticrew & Roberts, 2008). The results from these three approaches are presented in the ESRC Review chapters for each of the domains (Chapters 3, 8, 11, 14, 16, 18, and 22). Though the underlying text-mining methods are relatively established, these are novel and experimental approaches within the context of a study review such as this. In using these tools, it was hoped that the team would gain an overall appreciation of their usefulness for future research. Importantly, they provided a route to understanding key concepts and topics within this very large literature set within a short time frame, and they allowed the team to compare the literature topics with the proposed future topics identified in the Delphi process.
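WordStat's own implementation is proprietary; the following is a rough open-source analogue of the paragraph-level procedure just described (a sketch under stated assumptions, not the tool itself): build a term-by-segment matrix, extract Varimax-rotated factors, and treat terms loading above 0.3 on a factor as that factor's topic words.

```python
# Sketch: term-by-segment frequency matrix -> factor analysis with varimax rotation
# -> terms with |loading| > 0.3 per factor. Dense and unoptimized, for illustration.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import FactorAnalysis

def factor_topics(paragraphs, n_factors=20, loading_cutoff=0.3):
    vec = CountVectorizer(stop_words="english", min_df=10)
    X = vec.fit_transform(paragraphs).toarray().astype(float)   # segments x terms
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)           # standardize each term
    fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(X)
    vocab = np.array(vec.get_feature_names_out())
    topics = []
    for loadings in fa.components_:                              # one row per factor
        topics.append(vocab[np.abs(loadings) > loading_cutoff].tolist())
    return topics
```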

[Figure 2.3  Tree map of concept pairs.]

Interactive visualizations of the topic-based data can be examined at https://waysofbeingdigital.com/literature-analysis-interactive-results/; these are also explained in a note in the relevant ESRC Review chapters.
Content analysis. The final stage of the literature review was a content analysis focusing on the methods, theory, and data in the collected papers. The initial plan was to code only a random sample of papers so as to manage both volume and timescale, as this work could not be done easily via digital tools. An initial test run found that nearly all of the required information could be found in the abstracts, methods, and conclusion sections of the papers, cutting down the time needed to code papers. This allowed a team of six researchers working in parallel to code the full corpus over a period of six weeks. The project took inspiration from a recent in-depth content analysis of the Communication Studies literature (Borah, 2017) on the social impact of the Internet, covering 56 journals over a 16-year period (1998 to 2014). Borah's analysis found that 70 percent of journal papers on this subject did not employ any core theoretical position, nor use theory to define a research question. Instead, papers predominantly reported on case studies or presented analyses of empirical data sets. Following a similar method to Borah, the content analysis systematically documented six aspects of the publications in each domain:
1) Main discipline: such as sociology, communication studies, or computer science, primarily determined by the discipline of the lead or main authors.

Figure 2.4  Interactive topic modelling graph–topic. (Intertopic distance map via multidimensional scaling, with the marginal topic distribution and the top-30 most relevant terms for a selected topic; saliency(term w) = frequency(w) * Σ_t p(t|w) log(p(t|w)/p(t)), see Chuang et al. (2012); relevance(term w | topic t) = λ*p(w|t) + (1−λ)*p(w|t)/p(w), see Sievert & Shirley (2014).)

Figure 2.5  Interactive topic modelling graph–keyword. (Intertopic distance map via multidimensional scaling, with the conditional topic distribution and the top-30 most salient terms for a selected term.)


Figure 2.6  WordStat topic modelling.

2) Theories used in empirical work: the project coded the actual use of theory to define hypotheses or to explore data. Very often theoretical positions were mentioned but not used to define hypotheses or explore data. For example, the works of Castells (2011) were often cited, as were those of van Dijk (2013). But such references were used as scene-setting or as justifications for why studies of digital media are important; they were rarely used to construct models or explanations of findings.
3) Theory development: either inductively from new data or deductively via empirical testing.
4) Empirical methods used: whether qualitative, quantitative, or mixed methods.
5) Type of population studied: from nationally representative surveys to target populations or case studies.
6) Data analysis methods: including whether methods were described as "big data."
The results of this content analysis are presented in each of the ESRC Review chapters (chapters 3, 8, 11, 14, 16, 18, and 22). The results show considerable variation across domains, with some favoring strongly quantitative work and others more varied approaches. As with Borah's (2017) work, the formal use of theory either to develop hypotheses or to explore data was fairly limited in most domains.

Conclusion

This chapter has presented the approach to the analysis of both the literature review and Delphi processes undertaken by the ESRC project "Ways of being in a digital age." The methods used addressed two challenges:

• Undertaking the analysis within a limited time frame of less than one calendar year
• Utilizing and evaluating digital tools as far as possible in the analysis
It is important to note that the digital processes utilized by the project can be coupled with tools to present the results in interactive, often visual, forms. Such processes thereby provide an opportunity to explore the data and literature in novel ways. This allows researchers to make use of the different views and representations as routes into the literature and data in ways not previously available. The use of tools and processes in this project was effectively experimental, exploring their potential to manage, review, and assess a large body of literature. One of the team's future research recommendations is to further assess such approaches to examining prior research publications. As we noted earlier in this chapter, the challenge of dealing with a large body of literature, or a considerable body of academic evidence or opinion, is not specific to this project or this research area. It is somewhat ironic that, while supporting an ever-growing range of academic publications and facilitating their ready access, digital media are also making it harder to overview and assess these bodies of knowledge. Again somewhat ironically, digital tools (including algorithmic solutions) provide a route to managing this volume. In reflecting on the process, the team would note that having multiple views, and importantly different "algorithmic solutions" underlying these views, provides routes to cross-reference and cross-validate results, as well as offering different insights. This digitally derived insight must also be combined with the insights gained from the researchers' extensive engagement with the source materials.

References

Blundell, M. (2013). Understanding and synthesising my numerical data. In A. Boland, M. G. Cherry, & R. Dickson (Eds.), Doing a systematic review: A student's guide (pp. 94–124). London: Sage.
Borah, P. (2017). Emerging communication technology research: Theoretical and methodological variables in the last 16 years and future directions. New Media & Society, 19(4), 616–636.
Castells, M. (2011). The rise of the network society (Vol. 12). New York: John Wiley & Sons.
Chuang, J., Manning, C. D., & Heer, J. (2012, May). Termite: Visualization techniques for assessing textual topic models. In Proceedings of the international working conference on advanced visual interfaces (pp. 74–77). New York: ACM.
Fitzmaurice, S., Robinson, J. A., Alexander, M., Hine, I. C., Mehl, S., & Dallachy, F. (2017a). Linguistic DNA: Investigating conceptual change in early Modern English discourse. Studia Neophilologica, 89(Suppl. 1): Interfacing individuality and collaboration: Patterns in English language research (Guest editors: I. Taavitsa, J. Smith, & M. Kytö). http://dx.doi.org/10.1080/00393274.2017.1333891
Fitzmaurice, S., Robinson, J. A., Alexander, M., Hine, I. C., Mehl, S., & Dallachy, F. (2017b). Reading into the past: Materials and methods in historical semantics research. In T. Säily, A. Nurmi, M. Palander-Collin, & A. Auer (Eds.), Exploring future paths for historical sociolinguistics. Amsterdam: John Benjamins.

Okoli, C., & Pawlowski, S. D. (2004). The Delphi method as a research tool: An example, design considerations and applications. Information & Management, 42(1), 15–29.
Petticrew, M., & Roberts, H. (2008). Systematic reviews in the social sciences: A practical guide. Hoboken, NJ: John Wiley & Sons.
Popay, J., Roberts, H., Sowden, A., Petticrew, M., Arai, L., Rodgers, M., . . . Duffy, S. (2006). Guidance on the conduct of narrative synthesis in systematic reviews. ESRC Methods Programme. https://www.researchgate.net/publication/233866356_Guidance_on_the_conduct_of_narrative_synthesis_in_systematic_reviews_A_product_from_the_ESRC_Methods_Programme/download doi:10.13140/2.1.1018.4643
Sievert, C., & Shirley, K. (2014). LDAvis: A method for visualizing and interpreting topics. In Proceedings of the workshop on interactive language learning, visualization, and interfaces (pp. 63–70). Baltimore, MD.
Van Dijk, J. A. (2013). A theory of the digital divide. In The digital divide (pp. 49–72). New York: Routledge.

Section 2

HEALTH, AGE, AND HOME

Chapter 3

ESRC Review: Health and Well-Being

Simeon J. Yates, Leanne Townsend, Monica Whitty, Ronald E. Rice, and Elinor Carmi

Introduction

This chapter provides an overview of the results from analyses of the literature, the Delphi process, and any relevant workshops for the Health and Well-Being domain. The initial ESRC scoping question for this area of work was "whether technology makes us healthier, better educated, and more productive?" We first explore the results of the various digital humanities analyses of the literature and the review of methods and theory, and then set out the results of the Delphi process. We compare these results, and we conclude with recommendations for areas of future study.

Initial Comments

This domain generated the largest set of literature of all. This appears to reflect disciplinary differences from the other domains. Much of the literature was within health studies and health research journals. There was a stronger tendency to report experimental and empirical findings, and there were far fewer general reviews. The responses to the Delphi process focused on health and mainly health-based well-being issues, but not much on the education element. We have also bracketed off the productivity issue in the health domain, as this was extensively addressed in the Automation Workshop and is therefore presented in chapter 24. Workshops we ran with stakeholders via the UK Digital Leaders network focused on two main areas: health inequalities and access to digital technologies, and the privatization of health delivery through digitization. As a result, the one element of

the ESRC brief that is under-represented here is the question, "Does digital media make us better educated?" We would argue that in relation to formal education this area is well served by work on educational technology, so that issue is not analyzed here. In regard to informal learning, and also the specifics of both basic and complex digital skills (digital literacies), this issue clearly runs through many of the chapters and analyses in this volume.

Literature Analysis

Topics

As with the other literature analysis chapters in this volume, we aimed to identify two sets of data. The first was key concept pairs and topics within the existing literature. This allowed comparison with the areas of importance identified by the Delphi review. The second was a content analysis of the literature to explore the predominance of specific theories, methods, and approaches. The 11 most common concepts identified in the Round-1 literature are listed in Table 3.1; these represent the topics covering 2% or more of the identified cases. Table 3.2 lists the main and secondary (sub) concepts identified. Figures 3.1 and 3.2 display the changing nature and frequency of concept pairs for the periods 2000–2004 and 2012–2016.1 Clearly the focus in the early period was on the technology (computers, system, information, Internet, data, navigation, space, robot, phone), with some relationships with people (user, scientist, and group), and only an emerging focus explicitly on the health context (care, health, support, intervention,

Table 3.1  Analysis Concepts Ranked
Disease  7.3%
Body  4.6%
Care  4.0%
Health  3.8%
Behavior  3.7%
Loss  3.3%
Activity  3.2%
Network  2.6%
Communication  2.4%
Child  2.2%
Intervention  2.1%
Note: Topics occurring in at least 2% of the cases.

Table 3.2  Concept Pairings—Main and Secondary Concepts
disease (18.60): outbreak 6.26; prevention 4.59; sufferer 1.07; surveillance 6.68
health (9.66): promotion 9.66
loss (8.47): weight 8.47
body (11.69): device 2.44; embodiment 2.15; mass 3.22; mother .95; object 1.91; self 1.01
activity (8.17): conduct 2.09; isolation 1.25; leisure 1.13; pedometer 1.31; sport 2.39
care (10.26): caregiver 3.22; clinic 2.74; follow-up 4.29
network (6.56): outbreak 1.43; rice .89; stress 2.92; vaccination 1.31
communication (6.14): conflict 1.91; mail .95; stress 3.28
behaviour (9.36): counselling 3.10; recycling 2.03; smoking 3.58; taxonomy .66
child (5.66): donation 1.13; mother 4.53
intervention (5.43): mobile 1.91; vegetable 3.52
Note: The first term in each row is the main concept (with its overall percentage); the terms that follow are the related subconcepts (with their percentages).

effects, weight). By the later period, the most frequent concepts involve health (health, care, intervention, participant, patient, group, support), with the most frequent concept pairs involving those items, and then some emphasis on research (study, intervention, analysis, data, control, outcome, effect, trial). All the literature collected from both rounds was analyzed using WordStat, which identified 18 topics, presented in Table 3.3. As we can see in Table 3.4, there is a good overlap between the two analyses. We would argue that the analyses point to literature that is focused on the use of digital technologies and social media in three main areas: first, monitoring and supporting individuals in changing health behaviors (such as weight loss or stopping smoking); second, using digital technologies to monitor and support patients with chronic illness (e.g., hypertension); and third, using digital technologies to support health communication or as part of health support communities. Separate from this, the literature is focused on the measurement and evaluation of the efficacy of such interventions. This evaluation fits with the content analysis on methods and theory that follows. A section of the literature also included work on educational technology with some crossover to technologies to support health education. Six key areas stand out from the analysis (Table 3.3): educational technology, health care, measures and measurement, mobile and smartphone devices, social support, and weight loss. As noted earlier, we put the "educational technology" issue to one

Figure 3.1  Health and Well-Being 2000–2004: Most frequent concept pairs.

side, except where it overlaps with issues of health and well-being. In the following sections we consider some examples of how these issues have been examined in the recent literature. Health care. Work on the interaction in health care provision between social, occupational, and organizational roles and digital media has a long history. For example, Aydin and Rice (1991) argued that membership of specific occupational and departmental social worlds can help to explain attitudes toward medical information systems within health care organizations. They noted that Physicians, for example, expected involvement in decision-making and felt the system had become primarily an administrative system, while other medical employees were more concerned with computer use as infringement on their patient care activities. (p. 132)

More recently the focus has moved to the role of digital systems in the range of health services including public health and personalized health. The analysis by the University

Figure 3.2  Health and Well-Being 2012–2016: Most frequent concept pairs.

of Sheffield shows that literature before 2004 had a stronger emphasis on information systems and users, whereas more recent work has focused on care, intervention, and health information for patients. For example, Bennett and Glasgow (2009) discuss advantages of public health interventions conducted via the Internet and Web 2.0. Within this domain they point out that there are also challenges, such as reach (access), sustainability of effects, reporting in standardized measures, and attrition. Bennett and Glasgow argue that these challenges could be overcome with more tailored messages and greater use of social networking functions. This shift from a system focus to a user or "person" focus can be found in a lot of the literature on digital media use. This represents a shift from the novelty and specifics of technologies to the integration of these into everyday practice. Bennett and Glasgow see advantages for digital media in reach and efficacy in health-related interventions, as "Internet-based implementation allows participants to access intervention content at their convenience, in a manner that can feel largely anonymous" (p. 276). Combined with available data, "Internet interventions can be structured to provide highly personalized messages" (p. 276). As digital technologies

Table 3.3  WordStat Analysis of Topics
Topic | Keywords | Eigenvalue | Freq | Cases | Cases (%)
Educational technology | LEARN; STUDENT; TEACHER; LEARNER; EDUC; COLLABOR; TECHNOLOGI | 9.38 | 21,504 | 752 | 92.7
Health care | CARE; HEALTH; PATIENT; MEDIC; INFORM; PRACTIC; PROFESSION | 2.97 | 54,753 | 775 | 95.6
Measures | ITEM; SCALE; MEASUR; SCORE; WA; QUESTIONNAIR; ASSESS | 2.35 | 25,758 | 759 | 93.6
Social support network analysis | WEAK; TIE; TI; NETWORK; SUPPORT | 2.26 | 13,485 | 739 | 91.1
Mobile devices | MOBIL; DEVIC; PHONE; APP; DIGIT; MONITOR; TRACK | 2.11 | 11,251 | 680 | 83.9
Weight loss | WEIGHT; LOSS; OBES | 2.00 | 4,616 | 419 | 51.7
Ethnicity and gender | ETHNIC; GENDER; AG; STATU; BLACK | 1.88 | 7,575 | 640 | 78.9
Disease outbreak surveillance | OUTBREAK; SURVEIL; DISEAS; INFECT; INFLUENZA; VACCIN; UENZA | 1.86 | 6,349 | 469 | 57.8
Stopping smoking | SMOKE; CESSAT; SMOKER | 1.71 | 2,363 | 183 | 22.6
Efficacy | EF; CACI; FECT | 1.68 | 2,798 | 304 | 37.5
Family | MOTHER; INFANT; PARENT; CHILDREN; BODI | 1.66 | 3,764 | 537 | 66.2
Product quality | HEDON; BEAUTI; USABL; PRODUCT; QUALITI | 1.61 | 5,776 | 634 | 78.3
Social media | FACEBOOK; MEDIA; TWITTER; SOCIAL; SITE; BLOG; POST; SHARE; CONTENT | 1.54 | 23,283 | 746 | 92.0
Hypertension | PRESSUR; BLOOD | 1.52 | 1,537 | 269 | 33.2
Chronic diseases | CHRONIC; PAIN; DISEAS; ILL | 1.48 | 3,190 | 452 | 55.7
Palliative care | PALLI; TELECONSULT | 1.46 | 510 | 25 | 3.1
Activity | ACTIV; TECHNIQU; AR | 1.45 | 22,405 | 764 | 94.2
Controlled trial | TRIAL; INTERVENT; RANDOM; CONTROL | 1.42 | 17,838 | 677 | 83.5

often have low marginal costs, specific services can be provided while lowering overall costs. As a result, they conclude that [g]iven their potential for low costs, scalability, adaptability, and effectiveness, Internet interventions may be appropriate for dissemination to a range of settings (e.g., health systems, health plans, employers, municipalities). However, each of these settings varies considerably with regard to their resources, expertise, interest, and ability to implement Internet interventions independently. (p. 279)

Table 3.4  Comparison between Concepts and WordStat Topics. (Cross-tabulation of the WordStat topics of Table 3.3 against the main concepts of Table 3.1: Disease, Body, Care, Health, Behavior, Loss, Activity, Network, Communication, Child, and Intervention.)

This shift also reflects the rise of new digital forms such as social media. For example, Chou et al. (2009) pointed out that US-based health-related communication programs, which seek to impact population health (such as smoking cessation and dietary interventions), should carefully consider key social factors when looking to communicate via social media. They argue that social networking sites by far attract the most users, making them an obvious target for maximizing the reach and impact of health communication and eHealth interventions. (p. 9)

In looking specifically at communication around cancer, they found that among family members who had cancer, there was a high prevalence of Internet and social media use. This therefore made social media a potentially fruitful route to "'secondary audiences,' that is, caregivers, family, and friends of cancer patients" (pp. 9–10). Thus they concluded that "social media promise to be a way to reach the target population regardless of socioeconomic and health-related characteristics" (p. 10). Househ et al. (2014) also explored the role of social media in community empowerment in US health care contexts. They argued that there is a promising future for social media in community engagement, information sharing, data collection functions, appointment setting, prescription notifications, providing health information, engagement of the elderly, improved participation, autonomy, motivation, trust, and perceived self-efficacy. (p. 56)

On the other hand, they point out key challenges related to the use of social media for health care, such as privacy, security, the usability of social media programs, the manipulation of identity, and misinformation. These factors can pose serious threats to patient safety if not addressed appropriately by those who wish to engage patients through social media.  (p. 56)

Measures and measurement. Another theme in the literature is the interplay between using data derived from digital media, using digital tools to collect data, and measuring the impacts of digital media use or digital media interventions in the health context. Very often these three elements are combined. In the early 2000s the focus around measurement appears to be on tools such as Internet surveys. For example, Eysenbach and Wyatt (2002) examined the use of the Internet to conduct research as well as other parts of the analysis. They provided recommendations for implementing Internet-based surveys as well as emphasizing ethical considerations. They focused on Internet survey methods and did not address the use of "big data" or data scraped from social media, issues that have become more prevalent in recent years. But two of their key warnings are still very relevant: In 'open' surveys conducted via the Internet where Web users, newsgroup readers, or mailing list subscribers are invited to participate by completing a questionnaire,

selection bias is a major factor limiting the generalizability (external validity) of results . . . The ethical issues involved in any type of online research should not be forgotten. These include informed consent as a basic ethical tenet of scientific research on human populations, protection of privacy, and avoiding psychological harm. (p. 4)

As noted earlier, the analysis of the literature shows a strong shift towards issues of digital media in health interventions. For example, Glasgow (2007) examines the measurement and assessment of eHealth intervention and behavior change programs and provides recommendations on design, measurement, and methods, concluding with four main recommendations. First, explore outside of research silos, meaning working across different illnesses and taking into account multiple variables. Second, explore the role of human support, which could be the most important contextual factor. Third, tailor experiment design and reporting criteria to eHealth questions, meaning that they have to be interactive, user-centered, dynamic, and evolving. Fourth, follow translation and diffusion theories of technology uptake and innovation. He points out that [t]he majority of evidence-based health care procedures fail to translate into practice. Part of the reason for this failure to translate is because of the research methods most often used to evaluate interventions. In particular, typical designs do not address external validity concerns or provide information relevant to policymakers or to those considering program adoption. (p. 120)

He concludes that "eHealth is complex, contextual, evolving, and has effects at multiple levels. The designs and measures for eHealth research need to have these same characteristics" (p. 125). Given the nature of this domain, the focus of many papers is on the empirical evaluation of specific digital interventions, from bespoke digital tools to general social media campaigns. Often these involve the application of a digital media format to a specific health intervention. One example is provided by Coyle and Doherty (2009), who examined the use of a 3D computer game developed to support adolescent mental health interventions. This was a goal-oriented computer game that adolescents and therapists could play together in sessions. In the evaluation, the shortcomings that therapists mentioned are applicable to many digital technology interventions. These included an overreliance on literacy skills, lack of engagement with the specific technology, and a need to adapt to clients' needs (for example, choosing more suitable characters). As we have noted elsewhere in this volume (especially chapters 18, 19, and 20), issues of digital literacy and digital efficacy underpin many aspects of digital media use. Having noted these shortcomings, Coyle and Doherty argue, "The initial clinical evaluation of [the game] has provided evidence that computer games have the potential to assist therapists working with adolescent clients" (p. 2058). But they add that [f]uture projects in the MHC [mental health care] domain may benefit from more rigorously applying traditional user-centered requirements gathering techniques. However, the problem of access to clients by HCI researchers still remains.

Techniques are required which help HCI researchers to gain access to the tacit knowledge of MHC professionals. (p. 2058)

Such work points out a challenge found in many other domains (e.g., social care, government policy interventions, etc.), where existing design and evaluation tools are designed around existing practice and need to take on methods from digital and computer science disciplines. Coyle and Doherty also note that development and evaluation of digital systems is time-consuming; therefore "systems should aim to be useful to a broad range of therapists, in a broad range of settings and with a broad range of clients" (p. 2059). Not only are digital media forms (e.g., games and social media) being applied in health settings, but also digital devices are now key to monitoring and evaluating health, both personal (e.g., wearables) and public (e.g., environmental monitoring and sensors). Again the number and range of papers in this area is vast. An example from our corpus is Pantelopoulos and Bourbakis (2008), who reviewed the research and development of wearable biosensor systems for health monitoring. These can provide low-cost, unobtrusive solutions for continuous all-day and any-place health, mental, and activity status monitoring. The article outlines the technical challenges of these technologies. As with many other evaluations in this area, they found that many of the systems in fact remain poorly designed for wearability, for many practical reasons to do with size, weight, and complexity. They propose that integration of these tools into clothing and textile modules is an efficient alternative approach, though it has the disadvantage of being less scalable. There is then a tension between scalability (e.g., mass availability) and wearability. As with other digital technologies in the medical domain, security is a key concern. They concluded that "integration of proper encryption and authentication mechanisms is required to ensure privacy and security of personal health data" (p. 4890).

ESRC Review: Health and Well-Being   67 and expected that incorrect and irritating suggestions would make them mistrust the app and cease using it” (p. 9). Importantly, this work identified concerns among users as to whether health apps were linked to digital media such as social networks. One of the areas of intervention in the literature is that of self-diagnosis. As an example, Lupton and Jutel (2015) analyzed the way lay people negotiated the use of self-diagnosis smartphone apps in mid-2014. Their main findings are that they represent a contested and ambiguous site for meaning and practice in relation to personal health. Importantly, they point out that many apps purport a level of medical authority that they may not possess, and that much of this is undertaken through the presentation of information and imagery related to broader societal discourse around “healthy living” (p. 131). As they note: Self-diagnosis apps (. . .) state and engage with the discourses of healthism and ­control that pervade contemporary medicine. They also participate in the quest for  patient ‘engagement’ and ‘empowerment’ that is a hallmark of digital health ­rhetoric  (p. 132)

Lupton and Jutel point out that the implied medical authority combined with the apparent accuracy of “algorithms” provides a basis for both their promotion and use. Yet the users themselves are well aware of their own status as “not medically qualified.” The combination of both user uncertainty and, in some cases, the lack of robust medical evaluation and transparent algorithms means that there remain many challenges to making such systems work. Such work highlights the challenge found elsewhere outside the health domain, that digital technologies disrupt (for good or ill) existing systems and in many cases both individual practice and necessary societal regulation may take time to catch up. One area that is strongly prevalent in the literature is that of using mobile and smartphone devices to support patients with long-term (chronic) conditions. For example, Gollamudi et al. (2016) find that smartphones data allow these patients to make informed health decisions, though they point out that this changes the dynamics of health care relationships: [O]ne of the more intriguing aspects of this technology as a tool to enhance ­individual health is that data is collected, stored, and presented digitally without the  need for direct interaction between the user and (as traditional) health ­professional.  (p. 12)

Another area of work we noted in the literature and which may need to be better developed and formalized within the medical domain is the systematic comparison of digital solutions. For example, in the case of enhanced self-management of the chronic arthritic-like condition of gout, Nguyen et al. (2016) reviewed 57 mobile health apps. Very few apps met the internationally accepted gout management guidelines, with only one meeting all requirements. As noted previously, it is clear that more systematic work

68   Simeon J. Yates et al. is needed to assess the viability of such apps. Nguyen et al. point out a range of limitations in the apps with regard to this specific condition, especially the lack of routes for accessing health care professionals, but still argue that [T]he use of mobile applications to support self-management of chronic conditions presents much potential. The extent to which such apps contain content consistent with treatment guidelines and are user-friendly is central to their likely adoption and effectiveness.  (p. 71)

Social support. With the rise of social media, we also see a range of literature concerned with social support in health contexts. This work goes back to some of the earliest work around online communities with a focus on Internet fora. For example, Richardson (2004) explored issues of Internet use and heath debates across Cancer, SARS, and the debate about the measles/mumps/rubella vaccine and Autism. Such work has taken on much greater importance in recent years as citizens and patients have become able to engage others, often of like minds, on such issues via social media. This range of work is very broad and overlaps with research around online communities, issues of identity, and political debate where health issues are tied to policy issues. We will focus here on the more clinical health and well-being issues. As with other material discussed in this chapter, many of the publications evaluate a specific intervention or compare across technology contexts, with foci ranging from perceptions to behavior change to the links between digital media use and health. An example of comparative work is Barrera et al. (2002), who examined if diabetes patients change their perception about support following their participation in Internetbased support groups. The study finds that after three months of intervention, patients who participated in Internet-based social support significantly changed their views compared to those patients who had only participated in computer access to information about diabetes. This was achieved with patients who did not have previous experience with the Internet. In another comparative study, Barak et al. (2009) review the literature about Internet-supported psychological therapeutic interventions, conceptualizing them into four categories: web-based interventions, online counselling and therapy, Internet operated therapeutic software, and other online activities (e.g., as supplements to face-to-face therapy). They concluded that [T]he ability to develop feasible and effective alternatives by exploiting the Internet for clinical work—alternatives that suit many people and distress areas—should be regarded as broadening and expanding the availability of professional help, especially for those who feel comfortable in the virtual environment.  (p. 14)

Such work highlights the conceptual challenge of tidying up the conceptualization of, and regulating and assessing different forms of, digital media-based interventions in the medical context. Overall, much of the work in this area is not about direct clinical support interventions but rather about fostering patient and citizen empowerment in online support

ESRC Review: Health and Well-Being   69 groups. As an example, earlier work by Barak et al. (2008) point out that online support groups encourage well-being, a sense of control, self-confidence, feeling of more independence, social interactions and self-image, loneliness, optimism, and mood state. Therefore, the authors argue that participation in online support groups can foster personal empowerment, which can help in dealing with feelings of distress, but do not necessarily help in producing therapeutic changes. These groups also have drawbacks, such as developing dependence, developing distance from interpersonal contacts, and experiencing uncomfortable situations which are part of online social interactions. Barak et al. argue that It seems that the basic factors identified by quantitative research, as well as by our qualitative study—impact of writing, expressing emotions, gathering information and improving knowledge, developing interpersonal relationships, and bettering decision-making skills—generate, each and all of them, a personal sense of empowerment.  (p. 1878)

They conclude, however, that such groups are not a substitute for professional treatment where such clinical intervention is needed but can offer a complementary component to such interventions. Yet, there are always challenges in regard to communication, digital skills, and competences in such circumstances. These may interact with and influence both outcomes and well-being in and of themselves. Wright et al. (2013) argue that interpersonal motives, increased face-to-face communication, communication competence, and computer competence can predict whether college students are feeling more depressed. One of the most important skills to reduce depression was found to be communication competence, which is a set of skills that enable college students to mobilize social support in a better way. Weight loss. One area that brings all these issues of health care, social support, device use, monitoring, measurement, and personal digital technology use is that of weight loss. This is a domain where online groups, digital media, and apps have all been both promoted and critiqued as routes to intervention (or not). It is not unsurprising then that this has been highlighted as one of the few specific health topics in the analysis of the literature. One immediate question is the extent of the link between digital media use (or at least data on digital media use) and the prevalence of obesity. For example, Chunara et al. (2013) examined the relationship between online social environments via web-based social networks and population obesity prevalence. Their main finding is that activity-related interests (such as television watching as opposed to sports) across the United States and neighborhoods in New York City were significantly linked with obesity. They argue that their study corroborates the association of social environments and obesity, and also begins to uncover aspects of the environment, such as interests in the online medium, and how they are positively or negatively related to this outcome. Sharing of these norms through Facebook may also be magnified because network connections are ‘friends’; people who likely share demographic profiles, meaning there messages are better focused.

70   Simeon J. Yates et al. Issues of digital self-monitoring are also found in the literature. Steinberg and others (2013) examine the impact of weight loss interventions that focus on self-monitoring digital techniques such as “smart scales” (which displayed current weight and sent it directly to a website), a web-based weight loss graph, and weekly tailored feedback via emails. These interventions have proved to be successful when combined with other intervention elements. They found that a lower intensity weight loss intervention that focused on daily self-weighing as the main self-monitoring strategy and also included emailed tailored feedback and skills training with no regular face-to face-contact or focus on self-monitoring of diet and physical activity behaviors produced clinically significant weight losses.  (p. 8)

With regard to mobile and smartphone interventions, Svetkey et al. (2015) examined the efficacy of mobile health weight loss intervention apps in young adults: smartphone selfmonitoring, or personal coaching enhanced by smartphone self-monitoring (PC), compared with a control group. They concluded that digital interventions were not successful. This lead them to the conclusion that a combination of methods, both digital and social support of human interaction which are adaptive can be more beneficial. The researchers found that relative to the control group: “neither a mobile app alone nor personal coaching with mobile self-monitoring resulted in statistically significant weight loss after 24 months” (p. 2139). Like many other studies, they concluded that iterative and rapid development and testing of health interventions in context are needed to ensure the best outcomes. Summary. We would argue that there has been a shift in focus from health care technologies, to interaction with health care technologies, to a greater focus on the role of digital technologies in intervention, especially in regard to health behaviors and perceptions. Where the focus is on non-clinical and community interventions, there is notable overlap with the literature around digital communities. In regard to digital clinical interventions from this selection of literature, it is clear that much more work is needed on the veracity, development, and regulation of such tools.

Theory, Method, and Approach

As with the other review chapters, this analysis builds on Borah (2017). A slight majority of the analyzed papers (52%) were deductive, applying existing theory (Table 3.5).

Table 3.5  Epistemological Approach
Deductive (testing of existing theory)  51.5%
Inductive (conclusions driven by data)  48.5%

Nearly half of the papers utilized primary collected data (48%), with 43% of the papers using secondary data (Table 3.6). In line with the focus on health interventions and health behavior, the main disciplines from which theory was used or for which theory was developed were psychology (50%), sociology (19%), health studies (8%), communication and media (8%), and information studies (5%). There was considerable variety in the specific theories applied from these disciplines. Theories of behavior change, social cognition, and planned behavior (each 8% of the total) were the main theories in psychology studies, while social network analysis was the most frequent theory (2% of the total) in sociology articles. There was a fairly even split between statistical and qualitative approaches (Table 3.7). For those items that undertook empirical research, the main research methods were predominantly quantitative: experiments or comparisons (19%), surveys (11%), social network analysis (3%), and meta-analysis (4%) (Table 3.8). The majority of the empirical work focused on specific groups, but with a larger proportion of general population studies (31.5%) than in the other domains (Table 3.9). Less than 2% of the work described itself as using a "big data" approach. This domain is notably different from the others in two clear respects. First, the number of published papers by identified authors was much higher, and second, the majority of these reported quantitative empirical studies. Much of the work was broadly psychological and focused on the role of digital technologies in supporting or driving health behavior changes. This is reflected in the main theories identified in the literature. Unlike the other domains, there is a limited amount of reflection on the broader social or health impacts of digital media.

Table 3.6  Empirical Approach
Primary empirical (data collected and analyzed)  48.0%
Secondary empirical (analysis of existing data)  43.4%
Discursive/descriptive (no new data or theory)  8.2%
Theoretical (synthesis of current or prior work)  .5%

Table 3.7  Analytic Approach
Qualitative (textual—non-discourse)  48.4%
Statistical (numerical)  42.6%
Not applicable  8.3%
Discourse (textual—linguistic-discourse)  .7%

Table 3.8  Research Method
Literature review (general or narrative)  28.6%
Other  22.0%
Experiment  18.8%
Survey  10.8%
Interview(s)  6.6%
Content analysis  4.5%
Meta-analysis or systematic review  3.3%
Social network analysis  2.6%
Focus groups  2.0%
Textual (linguistic-discourse analysis)  .4%
Ethnography  .4%

Table 3.9  Study Population
Specific group  53.8%
General population  31.5%
Not applicable  12.8%
Case study (studies)  1.9%

Delphi Review

This section provides details of the results of the Delphi process for the Health and Well-Being domain, covering suggested scoping or research questions, key topics to address within these questions, and key challenges to researching these questions.

Future Research and Scoping Questions

The Delphi review identified a set of scoping questions for the domain, which were coded into four categories: design for positive health impacts of digital technology use; health behavior and using digital technologies; health user needs; and negative health impacts of digital technology use (Table 3.10). Their ranked importance from the confirmatory survey is given in Table 3.11. It is important to note that ranked importance is almost the inverse of the number of questions allocated to the category.

Table 3.10  Delphi Review Scoping Questions
Question category: Design for positive health impacts of digital technology use
Example questions: What types and amounts of technology make us healthier, better educated and more secure? How can we design technology assist in making us healthier, better educated and more secure? How can we design technology to support us being healthier and thrive psychologically? What are the best practices/processes in the design of technology that will make us healthier, better educated and more secure?
Question category: Health behavior and using digital technologies
Example questions: How do people engage with technology to improve health and well-being? You could extend well-being to personal and social well-being. What motivates people to be healthier, better educated and more secure, and how can these motivational drivers be incorporated into technology?
Question category: Health user needs
Example questions: What are the factors that lead to development of health information technology programs that meet the needs and capacities of different users? How can research be used to guide the strategic development of health information technology programs that meet the needs of different users? How can we engage different technology users in developing and implementing strategic health information systems that will meet their health information and support needs?
Question category: Negative health impacts of digital technology use
Example questions: What isn't asked here though is if technology is also hurting health. I.e., is it replacing going to the doctor, moving around (not just sitting in front of a computer all the time), too much sitting, lack of social ties, etc.? Does the use of digital technology contribute positively to our health and well-being?

Table 3.11  Delphi Review Scoping Questions Ranked by Importance
Design for positive health impacts of digital technology use  30.8%
Health behavior and using digital technologies  30.8%
Negative health impacts of digital technology use  20.5%
Health user needs  17.9%

The consultation workshop found these scoping areas too broad and noted that the issue of "design" created a focus on devices and away from a more holistic view of societal health and well-being. The workshop suggested other scoping areas or questions. These included doing more to understand the role of digital technologies in health inequalities (do they help to alleviate, reproduce, or deepen these inequalities?) and linking educational technology and health (for example, thinking about learning

about well-being and the role of digital technology in this). The workshop also suggested addressing the governance of digital health technologies and the need for detailed systematic evidence of the impact and lived experience of everyday health technologies (e.g., fitbits). Finally, they recommended looking at the broader socio-economic and technical challenges of "joining up" health providers and services through digital technologies, and examining more questions of health and well-being in the digital workplace.
The topics identified in the Delphi review were then coded into 11 categories, as detailed in Table 3.12, with their ranked importance from the confirmatory survey presented in Table 3.13. As with the scoping questions, those topics that were most commonly cited in the Delphi workshop were not those deemed most important in the review.

Table 3.12  Key Topics Ranked by Percentage of Delphi Survey Responses
Device, environment and service design  31%
Benefits and harm from digital technology use  15%
Health communication  15%
Education  10%
Device and service design  5%
Digital literacy  5%
Other  5%
Preventative and long-term condition support  5%
Digital divide  3%
Organizational change  3%
Privacy  3%

Table 3.13  Key Topics Ranked by Importance from Delphi Survey
Topic | Very important | Important | Neutral | Unimportant | Very unimportant
Benefits and harm from digital technology use | 76.9% | 23.1% | 0.0% | 0.0% | 0.0%
Health communication | 46.2% | 46.2% | 7.7% | 0.0% | 0.0%
Privacy | 46.2% | 38.5% | 7.7% | 7.7% | 0.0%
Device, environment, and service design | 38.5% | 53.8% | 7.7% | 0.0% | 0.0%
Preventative and long-term condition support | 38.5% | 46.2% | 15.4% | 0.0% | 0.0%
Digital divide | 38.5% | 30.8% | 15.4% | 15.4% | 0.0%
Digital literacy | 30.8% | 38.5% | 23.1% | 7.7% | 0.0%
Organizational change | 7.7% | 76.9% | 15.4% | 0.0% | 0.0%

The four most frequent were device, environment, and service design; benefits and harm from digital technology use; health communication; and education. Benefits and harm from digital technology use received by far the highest importance ratings, followed by health communication and privacy. The consultation workshop identified a set of additional potential topics within the health care domain. These were: what are "healthy" environments or "life worlds," and what role can digital technologies have in these; how do or can digital technologies help people to generate their own definition of a healthy "lifeworld"; and finally, understanding the impact of major digital platforms on behavior, perceptions of health and well-being, and routes to health information.

Research Challenges

The challenges in undertaking research in this area identified by the Delphi panel were placed into seven categories. These categories are detailed in Table 3.14, ranked by the percentage of coded items. Their ranking by the confirmation survey is presented in Table 3.15. The methods category was twice as frequent as the next category, processes of co-design, followed by collecting and accessing data. Methods were also rated as the most important challenge, followed by rapid changes, big data for health, and interdisciplinarity. The consultation workshop agreed with the challenges identified by the Delphi process, in particular focusing on "big" health data, personal and commercial uses of health data, linking personal and clinical health data with well-being outcomes, governance in digital health care, and digital technologies' role in the rich pathways of health and social care. Combining this broad range of ideas with the material in the literature provides a clearer picture. The next section undertakes this reflection.

Table 3.14  Challenges Ranked by Percent of Cases
Methods to analyze digital health  46%
Processes of co-design  21%
Collecting and accessing data on digital health  14%
Rapid change in digital and health technology  7%
Big data for health  4%
Education  4%
Interdisciplinarity  4%

Table 3.15  Challenges Ranked by Importance from Delphi Survey
Challenge | Very important | Important | Neutral | Unimportant | Very unimportant
Methods to analyze digital health | 61.5% | 30.8% | 7.7% | 0.0% | 0.0%
Rapid change in digital and health technology | 38.5% | 61.5% | 0.0% | 0.0% | 0.0%
Big data for health | 38.5% | 46.2% | 15.4% | 0.0% | 0.0%
Interdisciplinarity | 38.5% | 46.2% | 15.4% | 0.0% | 0.0%
Collecting and accessing data on digital health | 30.8% | 61.5% | 7.7% | 0.0% | 0.0%
Processes of co-design | 30.8% | 46.2% | 15.4% | 7.7% | 0.0%

Conclusion

As in the Communication and Relationships (chapter 8) and the Communities and Identities (chapter 14) domains, much of the work in the Health and Well-Being domain appears to be focused on specific technologies, in this case the use of bespoke or platform technologies to impact health behavior. There are few if any examples of cross-platform or holistic assessments examining the effects of broad, everyday digital technology use on health and well-being. There were also clear crossovers with the Communication and Relationships (see chapter 8) and the Communities and Identities (see chapter 14) domains. Much of the work involved aspects of health communication supported by digital technologies, or at least interaction with digital technologies that afforded aspects of patient-carer-doctor-service interactions. There were also a good number of cases focused on the role of online health support communities. Health and well-being may therefore be a context for applied communications and community research.
To summarize, the majority of the literature in the Health and Well-Being domain is focused on the evaluation of digital health technologies. There appears to be a limited literature on the broader question of the impacts of digital lifestyles on health and well-being, and limited work on the negative impacts of digital technologies. Moreover, the broader social questions identified in the Delphi work and consultation workshops that appear to go beyond the literature include the following:
• Understanding and addressing the governance of digital health technologies
• The need for detailed systematic evidence of the impact and lived experience of everyday health technologies (e.g., fitbits)
• Questions of health and well-being in the digital workplace

• Digital technologies and health communication and health behavior change
• Broader socio-economic challenges and issues in "joining up" health providers and services through digital technologies

Note

1. As part of the review, The Digital Humanities Institute at the University of Sheffield applied concept modelling techniques to a curated corpus of 1,900 journal articles from the period 1968 to 2017. Concept modelling is a computational linguistic process that involves identifying the emergence of concepts, or key ideas, via lexical relationships. For the purposes of the review, lexical relationships were limited to high-frequency co-occurrences of terms as pairs and trios. The process is entirely data-driven and resulted in 2 million rows of data. The website https://www.dhi.ac.uk/waysofbeingdigital/ provides access to the top 50 most frequently occurring pairs and trios through a series of data visualizations. Click on View Data Visualizations at the top, select which of the seven ESRC domains you are interested in (including all), and then choose the visualization. The visualizations show configurations across selected time frames: choose a bubble chart, tree map, zoomable pack layout, or network diagram, by individual subject or by all seven subjects combined, by document or concept frequency. You can similarly search the analyzed documents (all, by subject, author, concept, concept trio, and year) by clicking on Browse Articles at the top. Also see https://waysofbeingdigital.com/literature-analysis-interactive-results/ for interactive visualizations with mouse-overs of the main clusters of concepts within each domain, and the relative frequency of concepts associated with each cluster.
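To make the pair-counting idea concrete, the following minimal R sketch counts how often two terms co-occur within the same document; it is purely illustrative and not the DHI's actual pipeline, and the object abstracts, the tokenization, and the token-length threshold are all assumptions. Trios would be counted analogously by taking triples instead of pairs.

# Illustrative only: count co-occurring term pairs within documents, the basic
# building block of the concept-modelling approach described in the note above.
# `abstracts` is a hypothetical character vector of article texts.
library(dplyr)

pair_counts <- lapply(abstracts, function(txt) {
  words <- unique(strsplit(tolower(txt), "[^a-z]+")[[1]])  # unique terms per document
  words <- words[nchar(words) > 3]                         # drop very short tokens
  if (length(words) < 2) return(NULL)
  as.data.frame(t(combn(sort(words), 2)), stringsAsFactors = FALSE)
}) %>%
  bind_rows() %>%
  count(V1, V2, sort = TRUE)        # frequency of each term pair across documents

head(pair_counts, 50)               # e.g., the 50 most frequent pairs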


Chapter 4

Computer-Mediated Communication and Mental Health

A Computational Scoping Review of an Interdisciplinary Field

Adrian Meier, Emese Domahidi, and Elisabeth Günther

Introduction

Since the earliest days of Internet, mobile phone, and social media use, researchers and the general public have debated how computer-mediated communication (CMC) is related to mental health (e.g., Kraut et al., 1998; Kross et al., 2013; Turkle, 2011; Twenge, Martin, & Campbell, 2018). Today, various disciplines (e.g., communication, psychology, sociology, medicine) investigate a smorgasbord of CMC variables in relation to a great variety of negative (i.e., psychopathology) and positive (i.e., psychological well-being) markers of mental health. Research in this field asks questions as diverse as: Is loneliness a driver or outcome of Facebook use (Song et al., 2014)? Does passively browsing through Instagram increase depression levels by eliciting upward social comparison and envy (Verduyn, Ybarra, Résibois, Jonides, & Kross, 2017)? Does mobile voice communication increase social capital and, hence, affective well-being (Chan, 2015)? This diverse interdisciplinary field has become seemingly impossible to overview, with both primary research studies and reviews being published at what appears to be a rapidly increasing rate. Currently, various reviews exist, each synthesizing only a fraction of the available evidence on the relation between CMC and mental health (e.g., Domahidi, 2018; Huang, 2010; Huang, 2017; Liu, Ainsworth, & Baumeister, 2016). This fragmented state of the research landscape calls for a higher-level integration.

We answer this call in the form of a scoping review (Colquhoun et al., 2014; Pham et al., 2014), using both computational and qualitative methods to chart the boundaries of this emerging research field and to identify its core topics. In defining and mapping the field of CMC and mental health research, we integrate this fast-growing and interdisciplinary literature in the hopes of assisting researchers in navigating it. Therefore, this review has three main goals:

1. To assess the scope, growth, and current state of the field by tracing the development of core topics in research on CMC and mental health for the last 20 years.
2. To characterize the publication behavior in the field, specifically by illuminating who contributes to it (i.e., journals and disciplines).
3. To identify patterns of how the key construct of mental health has been studied in relation to CMC.

We first define our key constructs, CMC and mental health, and provide a brief overview of the state of the field. Guided by five research questions and three hypotheses, our scoping review then addresses the goals outlined. Results of this comprehensive assessment of the literature are discussed with regard to implications for a future research agenda.

Computer-Mediated Communication and Mental Health

Defining Key Constructs

As a first step towards an overview of the research field, the key constructs—CMC and mental health—require thorough definition. We understand both terms as umbrella constructs for a variety of technological (i.e., CMC) and psychological (i.e., mental health) phenomena and consequently define them broadly. We reviewed classical and more contemporary uses of the term (e.g., Hiltz & Turoff, 1978; Lee & Oh, 2015; Walther, 1992), and arrived at a broad definition of computer-mediated communication (CMC) as multimodal human-to-human social interaction mediated by information and communication technologies (ICTs). Social interaction here encompasses all forms of interpersonal behavior, including everything from mere social attention (e.g., browsing through the Facebook News Feed) to deep communication (e.g., a conversation via voice call; cf. Hall, 2018). We also limit our definition to those ICTs whose primary and original—though not exclusive—function is the facilitation of CMC as social interaction (e.g., email, chat, mobile text messaging, instant messenger, social network sites, but not, e.g., games). Turning to our second umbrella construct, mental health is commonly understood from two distinct perspectives: mental illness (psychopathology) and mental

thriving (psychological well-being). Psychopathology (PTH) refers to "any pattern of behavior—broadly defined to include actions, emotions, motivations, and cognitive and regulatory processes—that causes personal distress or impairs significant life functions, such as social relationships, education, work, and health maintenance" (Lahey, Krueger, Rathouz, Waldman, & Zald, 2017, p. 143). Psychological well-being (PWB), in contrast, is understood as a positive condition characterized by "optimal psychological functioning and experience" (Ryan & Deci, 2001, p. 142). Note that we exclude physical health, as well as socio-economic well-being, with these definitions. This review is based on the extended two-continua model of mental health developed by Meier and Reinecke (2020). Based on previous two-continua models (Greenspoon & Saklofske, 2001; Keyes, 2007), they integrate the PTH and PWB perspectives into a coherent framework and argue for a simultaneous assessment of both the negative (PTH) and positive (PWB) sides of mental health in relation to CMC. Based on a review of recent PTH and PWB literature (e.g., Huta & Waterman, 2014; Lahey et al., 2017), Meier and Reinecke (2020) further differentiate indicators of PTH (into externalizing and internalizing disorders and symptoms) and PWB (into hedonic and eudaimonic dimensions) and complement the two mental health continua with risk factors (e.g., loneliness, stress, poor sleep) and resilience factors (e.g., social resources, self-esteem), which are frequently studied in relation to CMC. Note that these risk and resilience factors are understood as important predictors of PTH and PWB, but not indicators of mental health in a strict sense.

Next, we specify how CMC and mental health can relate to each other and which of these relationships falls within our proposed definition of the field. This review is motivated by the long-standing public and research debate on the key question of whether the availability and usage of CMC hampers or contributes to "the good life" (e.g., Kraut et al., 1998). Accordingly, we limit our review to two perspectives that address how the usage of CMC relates to indicators of mental health. In the first perspective, CMC is understood as a causal factor contributing to declines or improvements in mental health (i.e., the technology effects perspective), while in the second, mental health is understood as a causal factor explaining the amount or types of CMC usage (i.e., the technology selection perspective). Other approaches to CMC and mental health (e.g., inferring mental health from CMC data traces or delivering mental health treatments via CMC) go beyond the focus of this review.

The State of the Research Field

This section provides a brief narrative overview of the development and current state of research on CMC and mental health in order to illustrate why we believe the field can benefit from a higher-level scoping review. Following a long-standing research debate about the quality of social interaction via CMC vs. face-to-face communication (e.g., Rice, 1980; Walther, 1992), the first study to explicitly investigate CMC in relation to mental health and hence virtually constitute

this field was Kraut et al.'s (1998) HomeNet study. In a longitudinal panel, 73 households were surveyed during their first two years of Internet use. Although the authors found that participants had frequently used the Internet for social interaction, higher levels of Internet use were negatively related to indicators of social involvement and mental health. The authors explained these negative effects through the displacement of offline social activity and strong-tie communication. The study received various critical responses (e.g., Walther & Parks, 2002) and follow-up studies both succeeded (e.g., Nie, Hillygus, & Erbring, 2002) and failed (e.g., Kraut et al., 2002) to replicate its findings. Specifically, the same authors in a later wave of the HomeNet panel found no evidence for social displacement (Kraut et al., 2002). Instead, in a second sample, they reported evidence for positive effects of Internet use on mental health, at least for users high in extraversion and social support, labeling this a "rich get richer" or social enhancement effect (as compared to a "poor get richer" or social compensation effect). For reviews of this early research, see, for example, Bargh and McKenna (2004), Huang (2010), Katz and Rice (2002), or Valkenburg and Peter (2009). Since then, numerous studies have addressed the core question of whether Internet use and CMC impact social resources, and, hence, mental health (for recent reviews, see, e.g., Domahidi, 2018; Forsman & Nordmyr, 2015; Liu et al., 2016; Mikal, Rice, Abeyta, & DeVilbiss, 2013). However, beyond the focus on social resources, the field has markedly branched out in recent years, specifically since CMC via social network sites (SNS) and mobile (smart)phones has permeated much of daily life. While researchers continue to address social resources in relation to these newer ICTs (e.g., Chan, 2015; Ellison, Steinfield, & Lampe, 2007), numerous other lines of inquiry have emerged. Researchers have, for instance, started to address how passively consuming others' positively biased self-presentations in SNS is linked to mental health, specifically through the lenses of social comparison and envy (Verduyn et al., 2017). The authenticity of SNS self-presentations has also been linked to the mental health of the presenters themselves (Twomey & O'Reilly, 2017). Furthermore, studies have repeatedly linked "screen time" as a global indicator of ICT usage to the mental health of adolescents (Przybylski & Weinstein, 2017; Twenge, Martin et al., 2018). Various theoretical links between CMC and decreased or increased mental health have also been confirmed, including self-affirmation (Toma & Hancock, 2013), social sharing of emotions (Choi & Toma, 2014), extended self theory (Clayton, Leshner, & Almond, 2015), multitasking (van der Schuur, Baumgartner, Sumter, & Valkenburg, 2015), or deficient self-regulation (Meier, Reinecke, & Meltzer, 2016), among many others. Simultaneously, a more clinical "addiction" or "problematic usage" approach to CMC and mental health has gained traction (for reviews, see, e.g., Carli et al., 2013; Tokunaga & Rains, 2010). Overall, the field appears to have grown considerably in recent years and has become increasingly difficult to overview for individual researchers.
While several reviews of specific relationships between forms of CMC (e.g., SNS use) and single indicators of mental health (e.g., depression) exist (e.g., Baker & Algorta, 2016), there is little awareness of the research field as a whole. This is problematic for at least two reasons: First, researchers may simply not be aware of similar or related work being done outside of their “disciplinary bubbles” or “invisible colleges” (Zuccala, 2006). Without awareness

of a field as a whole, researchers may unnecessarily "reinvent the wheel," especially every time a new ICT captures (younger) users' attention. Second, while the diversity of research questions and theoretical concepts in this field seems staggering, several of the topics outlined earlier also show considerable conceptual overlap. Integrating this literature to achieve consensus about its basic concepts (i.e., CMC and mental health), their relation, as well as its core underlying themes and topics, thus appears paramount.

The Present Study: Foci, Hypotheses, and Research Questions

Based on the available literature on CMC and mental health and its deficient higher-level integration as a larger research field, we arrive at three distinct foci for our review: (1) core topics, (2) publication behavior in the field, and (3) mental health concepts.

1. While our brief narrative review highlights some of the issues that research on CMC and mental health has addressed, it does so in an inherently selective manner. In contrast, here we aim to systematically identify a variety of core topics that have received considerable research attention in the field. Moreover, the development of these topics over time and their relative impact on the field as a whole remain unclear: While some topics may continuously dominate the field, others may have fallen or risen in research attention. Accordingly, we ask the following research questions:

RQ1: What are the core topics of research on CMC and mental health?
RQ2: How are the core topics distributed over time?

2. Beyond identifying core topics, we also aim to characterize the publication behavior in the research field, that is, the publication rate, publication outlets, and contribution of different disciplines. These criteria allow an assessment of the trajectory of research on CMC and mental health (publication rate) and a critical examination of who (journals and disciplines) contributes to this field. The latter, in particular, may have implications for the kind of research questions asked, the concepts studied (see focus 3: mental health concepts), and the representativeness of research findings. First, based on the increasing public debate on the relationship between CMC and mental health (e.g., Turkle, 2011; Twenge, Martin et al., 2018), a rise in systematic review articles on this issue in recent years (Meier & Reinecke, 2020), and a general upward trend in overall publication output (Günther & Domahidi, 2017), we assume that this research field is growing:

H1: The number of publications in CMC and mental health research has increased over time.

Second, we aim to identify key publication outlets that particularly contribute to research on CMC and mental health. This is important for two reasons: First, researchers previously unfamiliar with this field can benefit from knowing which key outlets to turn to for both

targeted literature searches and to submit publications that reach an audience likely to be interested in their work. Second, publication outlets can serve as proxies to identify the contributions of different disciplines to this field (see H2 and RQ4). Accordingly, we ask:

RQ3: What are the key outlets that publish research on CMC and mental health?

Third, beyond assessing the publication outlets, another approach to mapping a research field lies in assessing its disciplinary boundaries (e.g., de Chavez, Backett-Milburn, Parry, & Platt, 2005). Interdisciplinary research creates a rich and multifaceted literature, but may come at the price of insufficient research integration, for example, because researchers are not aware of relevant work being published outside of their discipline. Our review attempts to further integration by making visible who contributes to the field of CMC and mental health. However, based on our narrative review and the nature of the subject matter studied, we feel safe to assume that the field has been particularly driven by researchers with a background in psychology (e.g., Twenge, Martin et al., 2018) and by papers published in psychological journals (e.g., Kraut et al., 1998; Meier & Reinecke, 2020). Accordingly, we expect:

H2: The relative majority of research on CMC and mental health is published in outlets from psychology.

Beyond psychology, however, there may be various other disciplines contributing to this research field due to the increasing concern over and recognition of technology's impact on society and the individual (see this Handbook). We thus ask:

RQ4: Based on the publication outlets, which other disciplines contribute to research on CMC and mental health?

3. Our definition of mental health encompasses two distinct perspectives or meta-concepts, that is, psychopathology (PTH) and psychological well-being (PWB). However, past reviews have made little attempt to reflect upon these two mental health concepts that have been studied in relation to CMC (Meier & Reinecke, 2020). For instance, Huang (2017) reviewed the literature on time spent on SNS in relation to "psychological well-being," operationalized via self-esteem, life satisfaction, depression, and loneliness (i.e., lumping together PTH and PWB indicators under the PWB label). Researchers in this field often fail to address whether the mental health indicators they empirically assessed allow statements about PTH or PWB or both (e.g., Kraut et al., 1998). Moreover, at least from a media effects perspective, the choice of mental health indicators in empirical studies should depend on whether one expects CMC to impair mental health, which should favor PTH indicators, or contribute to mental health, which should favor PWB. The choice of mental health concepts—that is, whether researchers measure indicators of PTH or PWB—thus at least partly reflects researchers' (implicit)

assumptions about whether CMC is more relevant for mental illness (PTH) or mental thriving (PWB). These assumptions may vary over time (e.g., due to researchers shifting their focus from negative to positive technology effects, or because some mental health indicators become more relevant with the emergence of new ICTs) as well as by discipline and topic (e.g., some disciplines focus on topics that relate CMC only to PTH while others focus on CMC in relation to PWB; cf. de Chavez et al., 2005). This leads us to ask:

RQ5: How are the mental health concepts PTH and PWB distributed (a) over time, (b) within and between disciplines, and (c) within and between topics?

While we are interested in the general trajectory of both concepts, there is some reason to believe that, overall, studies will address PTH more often than PWB. Typically, new media and communication technologies are met with cultural critique, skepticism, or even moral panic (e.g., Jensen, 1990; Buckingham & Strandgaard Jensen, 2012). This is certainly the case with CMC, as illustrated by fierce public and research debates about the impact of each new and popular ICT, especially on younger users (e.g., Kraut et al., 1998; Turkle, 2011; Twenge, Joiner, Rogers, & Martin, 2018; Walther & Parks, 2002). With regard to mental health, researchers are then more likely to address CMC in relation to impairments of mental health (i.e., PTH) rather than to contributions to optimal psychological functioning (i.e., PWB). Accordingly, we expect:

H3a: Overall, there will be more research investigating PTH than PWB.

However, over time, researchers typically go beyond a "negative effects" paradigm and start investigating the positive potentials of media and communication technologies. Television research, for instance, started by addressing the potentially negative impact on aggression, people's tendency to seek escape, or cultivation effects, but later turned to positive potentials for mood management and meaningful entertainment (for an overview, see Reinecke & Oliver, 2017). We expect a similar shift with regard to CMC and mental health. Specifically, we expect to see a larger increase in research studying CMC in relation to the positive (PWB) than the negative side of mental health (PTH).

H3b: Over time, the rate of research investigating PWB will increase more than the rate of research investigating PTH.

Method

Scoping Review Methodology

To chart the vast, highly fragmented, and fast-growing literature on CMC and mental health we make use of scoping review methodology (Colquhoun et al., 2014; Pham

et al., 2014). In reaction to the exponential rise in research output (Günther & Domahidi, 2017), scoping reviews have become a popular approach for research synthesis in many disciplines. The main function of this type of review is "to map a vast body of research literature in a field of interest in terms of the volume, nature, and characteristics of the primary research" (Pham et al., 2014, p. 371). Scoping reviews are particularly relevant when a field of interest is highly heterogeneous in nature (Mays, Roberts, & Popay, 2001) and helpful in tracing the emergence of new (sub-)fields. Moreover, they can illuminate a field's main lines of inquiry (i.e., its topics) as well as its disciplinary boundaries; uncover gaps and trends in the literature; and, most importantly, point to future directions for further research integration. Scoping reviews have rarely been applied in the field of communication (e.g., Peng, Zhang, Zhong, & Zhu, 2013), although research on mass media, ICTs, and CMC is inherently interdisciplinary and growing fast (Günther & Domahidi, 2017). We believe that this type of review represents a useful method to assess the state of research on the relationship between CMC and mental health. As research in this area is currently particularly fragmented and likely growing exponentially, a "classical" review approach that involves hand searches and manual coding would be highly resource-intensive. We therefore make use of economical and effective computational methods that have recently gained popularity in communication (e.g., Günther & Domahidi, 2017; Peng et al., 2013; Rauchfleisch, 2017). Besides allowing us to conduct a scoping review based on a great amount of information (Tsafnat et al., 2014), computational methods facilitate thematically comprehensive reviews, as they are typically based on large-scale systematic sampling and quantitative analysis (Borenstein, Hedges, Higgins, & Rothstein, 2009). To also provide more in-depth and detailed accounts of the reviewed body of literature, we combine these computational methods with qualitative manual coding wherever necessary (e.g., to select the topics relevant for our research question, or to further illustrate the meaning of topics identified by our quantitative analysis).

Sample

To compile our sample of relevant studies, we systematically developed a search string covering both CMC and mental health concepts (see Table 4.1). An original string was developed and then iteratively refined and validated manually in multiple steps (e.g., by assessing the number of false-positive search results). During this procedure, it became apparent that certain highly prevalent terms (e.g., sex, suicide, therapy, or cancer) as well as technology and media terms not meeting our definition of CMC (e.g., games, robots, porn) needed to be explicitly excluded in order to reduce the high number of false positives. The final search string was then applied to search articles' abstracts via the meta-database EBSCO Host from 1998 (i.e., the year of publication of Kraut et al.'s pivotal article) until April 4, 2018.

Table 4.1  Search Terms, Databases, and Concept Operationalization

Search String of the Systematic Database Search
AB(((Internet or cyber* or "online media" or "online communication" or "online social network*" or "online communit*" or chat* or email or "computer-mediated" or "mobile phone" or smartphone or "instant mess*" or "mobile mess*" or "social media" or "social network* site*" or "information and communication technolog*" or facebook or instagram or snapchat or twitter or wechat or weibo or texting) not gam* not robot* not porn*) and (("well-being" or "psych* functioning" or "life satisfaction" or "satisfaction with life" or "positive affect" or "negative affect" or psychopatholog* or "mental health" or anxiety or loneliness or "self W1 esteem" or depression) not sex* not suicid* not disabil* not trauma* not patient* not emergency not therap* not training not protocol not intervent* not prevent* not care not program not cancer)).

EBSCO Host Databases Searched
Academic Search Ultimate; Business Source Premier; Communication Source; EconLit; ERIC; Library, Information Science & Technology Abstracts; MEDLINE; PsycARTICLES; PsycINFO; SocINDEX with Full Text.

Operationalization of Mental Health Concepts for Concept Analysis
Concept 1, psychological well-being (very broadly defined, incl. resilience factors): "well-being" or "psych* functioning" or "life satisfaction" or "satisfaction with life" or "positive affect" or "negative affect" or happiness or "social support" or "social capital" or "self-esteem"
Concept 2, psychopathology (very broadly defined, incl. risk factors): psychopatholog* or anxiety or loneliness or depress* or stress or phobia or fear or disorder or "substance abuse" or "attention-deficit" or "hyperactivity" or "ADHD" or "AD/HD" or aggress*

EBSCO Host offers access to a wide range of databases and journals and allows downloading abstracts and metadata (though not full texts) for all search results as an .xml file. We searched 10 databases within EBSCO Host representing a broad variety of disciplines that may conduct research on CMC and mental health (see Table 4.1). Automated content analysis is highly dependent on language; thus, only publications in English were included. Included publication formats were journal articles, dissertations, and conference proceedings. After downloading, we excluded all duplicates and entries with missing abstract and/or title data, resulting in our final sample of 4,118 potentially relevant documents. All our analyses are based on articles' abstracts and metadata (i.e., title, year of publication, journal title).
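As a minimal sketch of the deduplication and filtering step described above (not the authors' actual script; the data frame records and its column names are assumptions about the downloaded export, not the real EBSCO schema):

# Hypothetical sketch of the sample-cleaning step. `records` is assumed to hold the
# downloaded metadata with columns `title`, `abstract`, `year`, and `journal`
# (illustrative names only).
library(dplyr)

sample_df <- records %>%
  filter(!is.na(abstract), abstract != "",       # drop entries with missing abstracts
         !is.na(title), title != "") %>%         # ... or missing titles
  distinct(title, abstract, .keep_all = TRUE)    # remove duplicate records

nrow(sample_df)   # in the review, this step yielded 4,118 potentially relevant documents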

Analytical Approach

Topic modeling. To process the large sample, we opted for an automated content analysis, specifically topic modeling. Topic modeling is an unsupervised machine learning approach "inferring topics from recurring patterns of word occurrence in documents" (Maier et al., 2018, p. 94). Note that topic models are mixed-membership approaches, meaning that documents can be associated with multiple topics (Maier et al., 2018). Given the characteristics of our sample (i.e., topics are likely to be correlated), we estimated a

Correlated Topic Model (CTM) based on the text in articles' abstracts (Blei & Lafferty, 2009). Common preprocessing steps such as word stemming and TF-IDF weighting were implemented (Manning, Raghavan, & Schütze, 2008) with the R package tm_0.7-3 (Feinerer, Hornik, & Meyer, 2008). Following a common approach, we estimated 90 topic models from k = 20 to k = 200 in order to find the number of topics k that delivers the best model fit for our data. We then estimated our CTM with the resulting parameter value of k = 110 topics using the R topicmodels_0.2-7 package (Grün & Hornik, 2011). Based on the estimated hyperparameter values, we observed a few dominant topics per document (instead of a high number of equally distributed topics). To avoid skewed results, we selected the two topics with the highest probability, meaning the likelihood that a topic k occurs in a given abstract. Additionally, we only considered topics with a minimum probability of 0.1 in a given abstract.

Manual topic selection and merger. After the automated analysis, we manually checked abstracts and titles for all topics that appeared as first or second most probable in more than 50 documents (Maier et al., 2018). Out of the identified 110 topics, we manually selected only those topics that fit our specific research focus for further analysis (see the section on Defining Key Constructs). This manual topic selection resulted in a reduction of our sample to 15 topics (N = 1780 abstracts). All analyses are based on this reduced sample. We then merged the 15 topics into nine based on our qualitative assessment that they showed high thematic overlap. That is, we retained the algorithmically identified 15 topics, but manually grouped them together on a more general level, based on common research themes investigated in these topics. This merger was done to ensure a parsimonious yet still exhaustive analysis. For each of the nine merged topics, we chose a label that best represents its content on a conceptual level above and beyond the words associated with each topic (see Table 4.2).

Publication behavior in the field. Journal names were determined from article metadata. Additionally, we manually coded journals' disciplinary affiliations for all journals that had published three or more articles in our sample. Coding was based on a journal's Social Science Citation Index (SSCI) categorization (see end of chapter for URL). In a few cases, journals were not listed in the SSCI; disciplinary affiliations were then coded based on the journal's self-description taken from its web presence. In a next step, SSCI sub-discipline categories (e.g., developmental psychology, social psychology, clinical psychology) were grouped into broad disciplines (e.g., psychology) to ensure comparability of results across a reasonable number of disciplines. Note that journals can be listed for multiple disciplines in the SSCI and were coded and analyzed accordingly (71 journals belonged to only one discipline, 30 belonged to two disciplines, five belonged to three disciplines, and two belonged to four disciplines). Publication outlets other than journals (e.g., Dissertation Abstracts databases) were not coded for disciplinary affiliation.

Concepts. The two key mental health concepts, psychopathology (PTH) and psychological well-being (PWB), were identified via a keyword search based on lists of relevant expressions which we applied to the downloaded abstracts, titles, and keywords of all

1780 documents (see Table 4.1). The keyword search included terms from our literature search string as well as additional terms originally not included in the literature search string due to high rates of false-positive search results (e.g., "social capital" or "disorder"). We ensured that both concepts were operationalized with roughly the same number of keywords to avoid bias.

Software. All data management, cleaning, and analysis was performed using R (R Core Team, 2018). All visualization was done with the R package ggplot2_2.2.1 (Wickham, 2009).
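The following R sketch illustrates, under simplifying assumptions, the kind of workflow described in this section: tm preprocessing, a CTM with k = 110, and selection of at most two topics per document with a probability of at least 0.1. It is not the authors' code; sample_df is the hypothetical data frame from the sample step, and TF-IDF is used here only to prune uninformative terms, one common reading of the weighting step mentioned above, since the CTM itself is fit on raw counts.

# Illustrative sketch of the topic-modeling workflow (assumed object names).
library(tm)
library(slam)
library(topicmodels)

# 1. Preprocess abstracts into a document-term matrix of word-stem counts.
corpus <- VCorpus(VectorSource(sample_df$abstract))
corpus <- tm_map(corpus, content_transformer(tolower))
corpus <- tm_map(corpus, removePunctuation)
corpus <- tm_map(corpus, removeNumbers)
corpus <- tm_map(corpus, removeWords, stopwords("en"))
corpus <- tm_map(corpus, stemDocument)             # stemming uses the SnowballC package
dtm <- DocumentTermMatrix(corpus)

# 2. Prune low-information terms via mean TF-IDF, then drop documents left empty.
mean_tfidf <- col_means(weightTfIdf(dtm))
dtm <- dtm[, mean_tfidf >= median(mean_tfidf)]
dtm <- dtm[row_sums(dtm) > 0, ]                    # track dropped rows if indices are reused later

# 3. Estimate a Correlated Topic Model; k = 110 follows the model selection reported above.
ctm_fit <- CTM(dtm, k = 110)

# 4. Per-document topic probabilities; keep at most the two most probable topics
#    per document, and only if their probability is at least 0.1.
theta <- posterior(ctm_fit)$topics                 # documents x topics matrix
top_two <- t(apply(theta, 1, function(p) {
  keep <- order(p, decreasing = TRUE)[1:2]
  keep[p[keep] < 0.1] <- NA                        # discard low-probability assignments
  keep
}))

terms(ctm_fit, 10)                                 # most relevant terms per topic (cf. Table 4.2)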

Results

Core Topics

To answer RQ1, we illustrate each topic based on example research themes derived from documents associated with each of the 15 topics grouped into nine core topics (see Table 4.2). Due to the high number of documents, we can only include exemplary citations for each topic, taken from the complete topic modeling dataset (N = 1780). These only serve to illustrate a topic and do not represent the definitive state of the respective sub-field. However, we generally chose the most recent and most thematically fitting publications in order to provide an accurate description of the current state of the topic. The complete references for these citations can be found in the appendix. Topics are sorted in descending order based on the (aggregated) number of documents associated with each topic.

Internet addiction & problematic Internet use. Internet addiction (IA; topic 1a) is a controversially debated, but dominant, approach to the study of CMC and mental health since the earliest days of commercial Internet use (e.g., Young & Rodgers, 1998). It postulates that excessive Internet use—for some—can result from or reflect impulse control, substance abuse, or compulsive disorders (Widyanto & Griffiths, 2006). While some use the terminology interchangeably, problematic or pathological Internet use (PIU; 1b) often represents one of several alternative approaches to this issue (e.g., Caplan, 2003; Kardefelt-Winther, 2014; Tokunaga, 2014). PIU recognizes problematic behaviors surrounding (excessive) Internet use, but does not necessarily see these behaviors as indicative of behavioral addiction. Numerous studies have linked IA/PIU to various psychopathological symptoms and disorders (e.g., Floros, Siomos, Stogiannidou, Giouzepas, & Garyfallos, 2014) or lacking social resources (e.g., Wu et al., 2016). However, beyond providing evidence of construct validity and assessing comorbidity, studies finding negative relationships between IA/PIU and mental health appear somewhat tautological, as their measures often include impaired mental health as a diagnostic criterion. Furthermore, a much-contended issue pertains to the causal direction between comorbid psychopathology and IA/PIU (e.g., Carli et al., 2013).

Table 4.2  CTM with 15 Manually Selected Topics Merged into Nine Thematically Overlapping Topic Clusters, Sorted by Aggregated Frequencies (k = 110, N = 1780, Max. 2 Topics/Document, Prob ≥ 0.1)

Topic  Label                                            Most relevant preprocessed words                                                               Freq
1a     Internet addiction & problematic Internet use    addict, selfesteem, impuls, iad, iat, turkey, comorbid, sclr, excess, nonaddict                 265
1b     Internet addiction & problematic Internet use    problemat, alcohol, piu, drink, addict, fomo, cellphon, excess, impuls, heavi                   167
2a     Facebook & SNS use                                facebook, selfesteem, updat, lone, extravers, passiv, gratif, impress, upward, selfpresent      322
2b     Facebook & SNS use                                sns, selfpresent, authent, uncertainti, reconnect, passiv, snapchat, offlin, acquaint, tie       70
3a     Mobile & smartphone use                           mobil, phone, spiritu, nurs, leisur, app, send, nomophobia, recreat, smart, dses                233
3b     Mobile & smartphone use                           smartphon, selfefficaci, exercis, app, taiwan, disposit, gratif, tablet, locus, multitask       153
4a     Relationships & CMC                               attach, style, gay, men, stressor, homoneg, secur, bisexu, romant, insecur                      105
4b     Relationships & CMC                               selfdisclosur, intimaci, romant, computermedi, disclosur, weibo, avatar, cue, partner, wechat    89
4c     Relationships & CMC                               friendship, girl, violenc, date, boy, sibl, domest, violent, grade, parentchild                  84
5a     Chatting & texting                                loneli, chat, room, selfesteem, selfconcept, ciu, compuls, instant, clariti, teenag             141
5b     Chatting & texting                                text, partner, talk, instant, voic, channel, afford, retic, sms, textmessag                      63
6      Cyberbullying                                     victim, cyberbulli, bulli, cyber, aggress, peer, cybervictim, perpetr, selfesteem, harass       185
7      ICT adoption                                      ict, incom, rural, swb, household, inequ, farm, agricultur, urban, broadband                    136
8      Work-related CMC                                  job, employe, workplac, worker, workrel, worklif, intrus, burnout, supervisor, turnov           115
9      ICT use & sleep                                   sleep, insomnia, disturb, hygien, sensor, pittsburgh, perfectionist, telepressur, perfection, baselin   72

Facebook & SNS use. With the rise of SNS, specifically Facebook, a second large research topic has emerged. Often, studies extend classic research on CMC and mental health (e.g., on the displacement of face-to-face contact) to the SNS context (e.g., Dienlin, Masur, & Trepte, 2017). While some studies appear largely atheoretical and "effects-driven" (e.g., Huang, 2017; Shakya & Christakis, 2017), research increasingly

illuminates how the passive consumption of others' (often positively biased) self-presentations on SNS elicits upward social comparison and emotional reactions such as envy (e.g., Blease, 2015; Kross et al., 2013; Tromholt, 2016; Verduyn et al., 2015). Some of this research gathers under the label of "Facebook depression" (Steers, 2016). In contrast to this negative perspective on SNS and mental health, much work has assessed positive mental health effects of social support and social capital derived from SNS use (e.g., Nabi, Prestin, & So, 2013), albeit also finding mixed results (e.g., Utz & Breuer, 2017). Another theme within this broader topic has been the relationship between individuals' loneliness and their SNS use (i.e., social compensation vs. social enhancement; e.g., Song et al., 2014). It should be noted that the Facebook sub-topic (2a) was considerably more frequent than the general SNS topic (2b), reflecting a "Facebook bias" in this research (see Table 4.2).

Mobile & smartphone use. Research on the role of mobile phones (3a) and smartphones (3b) in mental health shows a variety of themes: Epidemiological and medical work, for example, is linking mobile screen time to impaired mental health, especially among adolescent users (e.g., Babic et al., 2017). Early mobile research, in contrast, has identified the emotional attachment that users have to their mobile devices as a double-edged sword (Vincent, 2006), providing positive feelings of connectedness to social ties, but eliciting anxiety once the mobile is absent (sometimes termed "nomophobia"; e.g., Hoffner, Lee, & Park, 2016). Another recent line of research investigates "phubbing," the snubbing of conversation partners by using mobile phones during face-to-face talk, which has been linked to reduced quality of social interactions and relationship satisfaction (e.g., Rotondi, Stanca, & Tomasuolo, 2017). Studies have also investigated the positive (Pearson, Mack, & Namanya, 2017) and negative (Xie, Zhao, Xie, & Lei, 2016) sides of mobile phone usage in rural and developing areas.

Relationships & CMC. This topic encompasses three—partly overlapping—research foci that are tied together by their common theme of how people use CMC to develop and maintain relationships: attachment theory, self-disclosure, and friendship and dating via CMC. Research on attachment theory (4a) investigates how different attachment styles (e.g., anxious vs. avoidant) impact CMC usage in (romantic) relationships (Oldmeadow, Quinn, & Kowert, 2013) and its effects on relationship well-being (e.g., intimacy or satisfaction). Morey, Gentzler, Creasy, Oberhauser, and Westerman (2013), for instance, find that attachment style moderated most of the effects between CMC and relationship quality. Other work has studied how attachment style is related to using CMC for breakups or monitoring of ex-partners via SNS (Weisskirch & Delevi, 2012). Self-disclosure (4b) crucially contributes to maintaining interpersonal relationships and receiving social support, and is hence beneficial for mental health. Often coming from a hyperpersonal perspective (e.g., High & Caplan, 2009), this line of research investigates how social anxiety relates to more disinhibited disclosures in CMC (Schouten, Valkenburg, & Peter, 2007) and provides evidence for social compensation ("poor get richer") effects (Weidman et al., 2012).

Finally, research on friendship and dating via CMC (4c) finds further evidence for such compensation effects (e.g., Desjarlais & Willoughby, 2010; Selfhout, Branje, Delsing, ter Bogt, & Meeus, 2009), but also for social enhancement ("rich get richer") effects in online dating (Valkenburg & Peter, 2007). In dating relationships among adolescents, the role of social anxiety has further been studied with regard to "electronic intrusion," that is, overcontrolling behavior towards one's partner via CMC (Reed, Tolman, Ward, & Safyer, 2016).

Chatting & texting. The research clustered around chatting (5a) and texting (5b) investigates several of the issues outlined previously with a specific focus on text-based CMC. Partly in reaction to Kraut et al.'s (1998) survey, researchers have, for instance, assessed the relation between chatting and mental health experimentally and found positive effects—specifically, reductions in depression and loneliness as well as increases in social support and self-esteem (e.g., Shaw & Gant, 2002). However, researchers have also studied chatting and texting from an addiction perspective, sometimes under the label of compulsive Internet use (CIU; van den Eijnden, Meerkerk, Vermulst, Spijkerman, & Engels, 2008), and found CIU to be linked longitudinally to higher depression levels. Text messaging has also been differentially associated with relationship well-being (Park, Lee, & Chung, 2016) and affective well-being (Hall, 2017). Notably, many studies in this topic assess and confirm mental health variables as predictors (e.g., loneliness, social anxiety, or depression symptoms), rather than outcomes, of texting behavior (e.g., Coyne, Padilla-Walker, & Holmgren, 2018; Reid & Reid, 2010).

Cyberbullying. The phenomenon of cyberbullying (6) has received much research attention over the past 20 years (Chen, Ho, & Lwin, 2017). Researchers have, for instance, studied whether cyberbullying extends or replaces offline bullying (Kubiszewski, Fontaine, Potard, & Auzoult, 2015) and tested whether cyberbullying uniquely contributes to victims' mental health impairments when controlling for offline bullying (Hase, Goldberg, Smith, Stuck, & Campain, 2015; Sjursø, Fandrem, & Roland, 2014). Typical outcomes of cyber victimization include internalizing and externalizing psychopathology (Schultze-Krumbholz, Jäkel, Schultze, & Scheithauer, 2012), and stress (Wright, 2015). Researchers have also found psychopathology to predict whether adolescents become cyberbullies (Chen et al., 2017). Research on this topic shows a unique focus on children and adolescents, but has also recognized cyberbullying as a prevalent phenomenon in the workplace (Vranjes, Baillien, Vandebosch, Erreygers, & de Witte, 2018) and in the form of online trolling behavior (Hong & Cheng, 2018).

ICT adoption. This topic (7) focuses on how the adoption of or access to ICTs affects the well-being of different (marginalized) populations (e.g., elderly, rural inhabitants, low-income individuals, inhabitants of developing regions; Greyling, 2018; Ihm & Hsieh, 2015; Tseng & Hsieh, 2015). Some of this research comes from a digital divide or digital inequality perspective (e.g., Nie, Sousa-Poza, & Nimrod, 2017), studying differences in access, adoption, and effects depending on various sociodemographic factors. In contrast to emphasizing the desirability of equal access to ICTs, researchers have also proposed that increased ICT access can negatively affect well-being, for instance, by

raising material aspirations (Lohmann, 2015). Overall, research on this topic takes a more sociological and economical macro perspective.

Work-related CMC. With the radical shift in work environments towards email and mobile communication, research has been paying attention to the role that CMC plays for workers' well-being (8). For instance, research has explored how email usage contributes to "technostress" and burnout (Carmago, 2008; Ninaus, Diehl, Terlutter, Chan, & Huang, 2015) or the well-being tradeoffs made when resisting interruptions from emails during work tasks (Russell, Woods, & Banks, 2017). Research by Sonnentag, Reinecke, Mata, and Vorderer (2018), however, shows that the effects of interruptions through messages at work are not uniformly negative and depend on the user's responsiveness. The impact of incivility in email communication on well-being within organizations is another theme within this topic (e.g., Giumetti, McKibben, Hatfield, Schroeder, & Kowalski, 2012). Concerning positive effects, researchers also recognize the potential of CMC technologies to allow for micro-breaks at work, facilitating recovery and, hence, well-being (Ivarsson & Larsson, 2011).

ICT use & sleep. Finally, emerging research increasingly links ICT use to poor sleep quality (9), for example, through sleep displacement (e.g., Rosen, Carrier, Miller, Rokkum, & Ruiz, 2016; Thomée, Eklöf, Gustafsson, Nilsson, & Hagberg, 2007). Poor sleep is a crucial risk factor for various psychopathologies. Research on this topic has found increased social media use, both overall and nighttime-specific, to be linked to decreased sleep quality among adolescents (Woods & Scott, 2016). A study also found that the collapse of social contexts resulting from constant connectivity via ubiquitously used ICTs created "telepressure" among college students (Barber & Santuzzi, 2017), which contributed to poorer sleep hygiene (e.g., not going to bed at a regular time).

Changes over Time Concerning the development of these nine core topics over time (RQ2), some topics did increase particularly sharply in the last decade (see Figure 4.1). While topic 2 “Facebook & SNS use” is only represented with one publication in 1998 and only two in 2008, in the year 2017 we already find 62 publications on the topic. The same is true for topic 1 “Internet addiction & problematic Internet use,” a topic only represented with four publications in 1998 and eleven in 2008, but 74 in 2017. Other topics such as topic 6 “cyberbullying” (2002: n = 1, 2008: n = 1, 2017: n = 28) increased less in the last decade. Overall, we observe a sharp increase in research on CMC and mental health since 1998. Accordingly, H1 was supported.

Publication Behavior in the Field

Concerning RQ3, the number of outlets for research on CMC and mental health is very high, with 715 different publication outlets in our final sample.

Figure 4.1  Distribution of nine core topics over time.

Figure 4.2  Top 20 journals.

However, the number of documents per outlet is often low (i.e., the distribution is highly skewed towards a few outlets that publish most of the research in the field). Figure 4.2 displays the output of the 20 outlets with the highest number of documents in our sample. When interpreting these results, it is important to keep in mind that outlets differ both in terms of how far their archives date back, and in their yearly output (affected by number of issues per year and articles per issue). We clearly find two psychological journals, Computers in Human Behavior (n = 174) and Cyberpsychology, Behavior and Social Networking (n = 110), to have published the largest relative share of research on CMC and mental health. Interestingly, genuine communication journals do not play an important role, with only the Journal of Computer-Mediated Communication (n = 14) among the top 20 publication outlets. Note that while journals dominate the publication outlets, there is also a high number of dissertations on CMC and mental health both in the "Sciences and Engineering" (Diss. Abstracts B, n = 133) and "Humanities and Social Sciences" (Diss. Abstracts A, n = 51).

We also analyzed the relative importance of disciplines for this research field. Clearly (see Figure 4.3), psychology is the discipline publishing most research on CMC and mental health (n = 510). We thus find support for H2. To answer RQ4, we look at the other disciplines and find psychiatry (117), other social sciences (101), health and medicine (74), and other (the residual category; 70), followed by communication (69), computer and information science (60), and education (55) to considerably contribute to this field.

Figure 4.3  Distribution of articles per discipline over time.

The remaining publications (n = 832) were either scattered over other outlets (e.g., Diss. Abstracts), which we could not clearly classify by discipline, or they were published in journals with fewer than three articles on CMC and mental health and hence not included in this analysis. Disciplines vary with regard to their growth rates of publications on CMC and mental health. Communication is not represented in the year 1998 and has three articles in 2008, while the rate increased fourfold to 13 articles in 2017. Psychology has a similar output of only three articles in 1998 and 11 in 2008, but an almost sevenfold increase to 73 articles in 2017. The interest of psychology in CMC and mental health research thus seems to have increased particularly sharply.
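The outlet and discipline tallies reported in this section amount to simple frequency counts over the article metadata; a minimal sketch, assuming a hypothetical data frame sample_df with a journal column and a (manually coded) discipline column:

# Illustrative sketch: tallying publication outlets and disciplines.
# `sample_df`, `journal`, and `discipline` are hypothetical names; in the review,
# disciplines were coded per journal (journals with three or more articles only).
library(dplyr)

sample_df %>%
  count(journal, sort = TRUE) %>%
  slice_head(n = 20)                  # the 20 most frequent outlets (cf. Figure 4.2)

sample_df %>%
  count(discipline, sort = TRUE)      # articles per coded discipline (cf. Figure 4.3)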

Mental Health Concepts

In order to answer RQ5a and test H3, we searched for terms representing each of the two concepts, psychopathology (PTH) and psychological well-being (PWB), in our sample. In general (see Figure 4.4), we found that research on PTH (n = 1205) is more prevalent than on PWB (n = 808). We thus find support for H3a. However, a number of publications addressed both concepts simultaneously (n = 368)—that is, they included both terms associated with PTH and terms associated with PWB. Note that 135 abstracts could not be classified in our sample as they included neither a term indicative of PTH nor one indicative of PWB; specifically, the search term "mental health" was considered to capture both concepts and thus not included in the concept analysis (see Table 4.1 for details on the terms used).

In order to test H3b we look at the increase over time in the number of publications containing each concept. While PTH is represented with seven publications in the year 1998, and 32 in 2008, in 2017 we find 176 publications. PWB is represented with five publications in 1998, 19 in 2008, and 128 in 2017.
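A keyword-based tagging of documents into PTH, PWB, both, or not classified, as described in the Method section, could be sketched roughly as follows; the term lists are abbreviated (see Table 4.1 for the full operationalization) and the object names are hypothetical.

# Illustrative sketch of the keyword-based concept classification (abbreviated term
# lists). `sample_df$text` is assumed to hold the combined title, abstract, and
# keywords of each document.
pth_pattern <- "psychopatholog|anxiety|loneliness|depress|stress|phobia|disorder|aggress"
pwb_pattern <- "well-being|life satisfaction|positive affect|negative affect|happiness|social support|social capital|self-esteem"

has_pth <- grepl(pth_pattern, sample_df$text, ignore.case = TRUE)
has_pwb <- grepl(pwb_pattern, sample_df$text, ignore.case = TRUE)

concept <- ifelse(has_pth & has_pwb, "both",
           ifelse(has_pth, "psychopathology",
           ifelse(has_pwb, "well-being", "not classified")))

c(PTH = sum(has_pth), PWB = sum(has_pwb))   # overlapping totals (a document may count in both)
table(concept)                              # mutually exclusive breakdown, incl. "both"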

Figure 4.4  Distribution of mental health concepts over time.

While the increase rate is the same for both concepts for the last two decades (factor 25), we find a slightly higher increase for PWB in the last decade (factor 6.7) than for PTH (factor 5.5). Hence, we find weak support for H3b.

With regard to RQ5b, we find differences concerning the disciplines that publish research on both concepts (see Table 4.3). While in communication journals the concept PTH (n = 49) is only 1.1 times as common as the concept PWB (n = 37), in psychology the rate is 1.8 (PTH: n = 397; PWB: n = 222), and in psychiatry studies on PTH (n = 90) were three times more common than on PWB (n = 31).

Table 4.3  Mental Health Concepts Distributed over Disciplines

Discipline                          PTH    PWB    Both   Not classified
Communication                        49     37     21      4
Psychology                          397    222    121     12
Psychiatry                           90     31     16     12
Education                            40     27     14      2
Health & Medical Sciences            43     30     13     14
Computer & Information Sciences      37     32     14      5
Social Sciences (Other)              65     51     19      4
Other disciplines                    45     32     11      4

Note: PTH = psychopathology; PWB = psychological well-being. Based on the data from our CTM with manual selection of 15 relevant topics, merged into nine topics based on high thematic overlap, k = 110, N = 1780, max. 2 topics/document with topic probability ≥ 0.1.

Regarding the question of how the two mental health concepts are distributed over topics (RQ5c), we find several noteworthy differences (see Table 4.4).

Table 4.4  Mental Health Concepts Distributed over Topics

Topic                        PTH    PWB    Both   Not classified
Internet addiction & PIU     355    138     88     27
Facebook & SNS use           254    231    105     12
Mobile & smartphone use      231    159     52     48
Relationships & CMC          194    113     50     21
Chatting & texting           180     82     63      5
Cyberbullying                151     66     43     11
ICT adoption                  47     96     17     10
Work-related CMC              67     81     41      8
ICT use & sleep               52     25     13      8

Note: PTH = psychopathology; PWB = psychological well-being; based on the data from our CTM with manual selection of 15 relevant topics, merged into nine topics based on high thematic overlap, k = 110, N = 1780, max. 2 topics/document with topic probability ≥ 0.1.

In most topics such as "Internet addiction & PIU," "mobile & smartphone use," "relationships & CMC,"

98   Adrian Meier ET AL. “chatting & texting,” “cyberbullying,” and “ICT use & sleep,” we find a clear focus on PTH. On the contrary, research on “Facebook & SNS use” shows an almost balanced distribution of the concepts, while in the topics “ICT adoption” and “work-related ICT use” we find more publications on PWB.
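The concept analysis described above is essentially dictionary-based tagging of abstracts followed by cross-tabulation. The following sketch illustrates the general approach only; the term lists are hypothetical abbreviations, not the full search terms reported in Table 4.1, and the example records are made up.

```r
# Hypothetical, abbreviated term lists (the actual terms are listed in Table 4.1)
pth_terms <- c("depress", "anxiety", "addict", "loneli")
pwb_terms <- c("well-being", "wellbeing", "life satisfaction", "happiness")

has_terms <- function(text, terms) {
  any(sapply(terms, function(t) grepl(t, text, ignore.case = TRUE)))
}

classify_abstract <- function(text) {
  pth <- has_terms(text, pth_terms)
  pwb <- has_terms(text, pwb_terms)
  if (pth && pwb) "Both" else if (pth) "PTH" else if (pwb) "PWB" else "Not classified"
}

# Two made-up abstracts to show the classification and a discipline cross-tab
abstracts  <- c("Smartphone use and depressive symptoms in adolescents",
                "Online social support, life satisfaction, and subjective well-being")
discipline <- c("Psychology", "Communication")
concept    <- vapply(abstracts, classify_abstract, character(1), USE.NAMES = FALSE)

table(discipline, concept)   # analogous in structure to Tables 4.3 and 4.4
```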

Discussion

Summary and Contribution

Since the mid-1990s, Internet and ICT use has firmly established itself in the everyday lives of billions of people around the globe (International Telecommunications Union, 2018). As this Handbook summarizes, a large part of our daily social behavior is now mediated by technology. The question of whether and how such computer-mediated communication is related to the mental health of users has been the center of much public debate and research attention. With the emergence of new, heterogeneous, and interdisciplinary lines of research on CMC and mental health (Meier & Reinecke, 2020), the challenge of defining and navigating this field has arisen. We address this challenge by presenting this scoping review, which identifies core topics as well as structural properties of the field.

Our results underline that research interest in CMC and mental health has increased dramatically in the last 10 years. Beyond the general increase in publication output across disciplines (e.g., Günther & Domahidi, 2017), a likely explanation is the radical establishment of social media (e.g., SNS) and smartphones in daily life, and an increase in societies' and researchers' concerns about their potential impact (e.g., Twenge, Martin et al., 2018).

With regard to the core topics, research on Internet addiction and problematic Internet use clearly dominates the field. However, a variety of topics offer alternative approaches to the study of CMC and mental health. Specifically, there is a strong and fast-growing research focus on Facebook and SNS use, on mobile (smart)phone use, and on the role that CMC plays in close interpersonal relationships. Our qualitatively derived topic descriptions imply that, instead of the largely atheoretical overpathologizing of everyday behavior (Billieux, Schimmenti, Khazaal, Maurage, & Heeren, 2015), researchers now also apply more fine-grained theoretical approaches that specify both how CMC can impact mental health (e.g., social comparison or attachment theory) and how mental health can be a predictor rather than an outcome of CMC (e.g., social compensation vs. social enhancement). The topics also highlight that while the focus of the first research decade (i.e., how CMC impacts social resources) is still very much present in the literature (Domahidi, 2018), numerous other important connections between CMC and users' mental health now receive considerable research attention.

Our qualitative topic descriptions also further specify the fragmentation and lack of theoretical integration of the research field. While research on some topics, such as "ICT adoption" or "ICT use & sleep," focuses more on global usage indicators such as access to ICTs or time spent in front of screens, research on other topics investigates the interpersonal communication unfolding within a specific ICT channel (see, e.g., "relationships & CMC"). That is, research on different topics focuses on different aspects and indicators of CMC, with potentially unique implications for their relation to mental health. A much-needed systematization and integration of these different operationalizations of CMC would go beyond the focus of this scoping review but represents an important impulse for future research.

Concerning mental health, both topics and disciplines differed in how much they addressed the two key concepts, psychopathology and psychological well-being. Overall, research in this field emphasizes PTH, which may indicate a dominant presumption that CMC is more related to mental illness than to mental thriving. While this could also reflect studies investigating how CMC reduces risks of mental illness (hence also focusing on PTH rather than PWB), the topics identified here suggest that this was not a common approach in our sample. Some topics, however, showed a focus on PWB, and only research on the topic "Facebook & SNS use" addressed PTH and PWB concepts in roughly equal measure. We also found research on PWB to have increased at a slightly higher rate over the last decade, potentially indicating researchers' increasing, or reemerging (cf. Hiltz & Turoff, 1978), recognition of the positive potential of CMC for PWB. We encourage future researchers to reflect on whether a sole focus on negative (PTH) or positive (PWB) markers of mental health is justified for their specific research question and whether their investigation could benefit from a more comprehensive appreciation of the full mental health spectrum (Meier & Reinecke, 2020).

Concerning the structural properties of the field, we find clear evidence for a dominance of psychological research. This appears understandable, given that any study of human behavior in relation to mental health requires a thorough understanding of the human psyche. However, it may also affect the kinds of research questions that are (not) studied with regard to CMC and mental health. While psychological research typically focuses on an individual's cognition, affect, and behavior vis-à-vis CMC, sociological research, for instance, may seek to explain relationships between CMC and mental health by investigating differences in users' network structures (e.g., Haythornthwaite, 2005). Similarly, communication research may add a more detailed conceptualization of the different aspects of communication unfolding within CMC channels (e.g., Walther & Parks, 2002), instead of conflating them in global "time spent" or "screen time" indicators of technology use. More research from perspectives beyond psychology may thus help to fully understand how CMC affects and is affected by ICT users' mental health.

Limitations

The results of this scoping review need to be interpreted in light of several limitations concerning both sampling and analysis. First, our review only includes publications that explicitly mentioned both CMC and mental health concepts (as operationalized by our search string) in their abstracts. Many more empirical and theoretical articles may inform research on how CMC relates to mental health and vice versa, but fail to mention this in their abstracts. Second, our review relies on a systematically drawn sample of publications on CMC and mental health. Systematic reviews, in general, can hardly include all available literature or even draw a representative sample, as the population of relevant documents is typically unknown. Instead, we aimed to balance the precision of our search (e.g., by excluding certain terms that resulted in high numbers of false positives) with an adequately comprehensive recall of eligible articles (e.g., by relying on a broad interdisciplinary database search). Still, some research that may be relevant to this field could not be included here. For instance, we explicitly excluded the search term "suicide" due to very high numbers of false positives in our initial literature searches. Research on how some forms of CMC (e.g., SNS use) may be related to suicide is thus not represented here (e.g., Twenge, Joiner et al., 2018). Third, the accessibility of the databases used for sampling is contingent on a researcher's institutional access to EBSCOhost, which in our case was provided by the Free University of Berlin, Germany. Fourth, each journal's terms of publication (frequency of issues and number of articles per year), the year in which a journal was launched, and the extent to which older issues are digitized determine the availability and total number of abstracts and metadata online. Not all journal archives are fully digitized; thus, some relevant abstracts may be missing from our dataset. Fifth, we only included research published in English. Research from some parts of the world is therefore likely underrepresented in this review.

Concerning our topic modeling analysis, a number of characteristics of this approach need to be taken into account when interpreting the results. First, given the generally increasing rate of scientific output (Günther & Domahidi, 2017), a characteristic of our sample is that the number of journals and abstracts has increased over time. As such, the choice of the most relevant words per topic (see Table 4.2) is likely to be skewed towards recently published research. It should also be emphasized that a traditional manual coding of research topics may have resulted in a different set of topics. Topic modeling represents a large-scale, data-driven, bottom-up approach to the identification of research topics and does not require an a priori coding scheme that predefines what constitutes a research topic. As the two approaches (computational vs. manual) are analytically (bottom-up vs. top-down) and pragmatically (feasibility for large vs. small samples) different, they are likely to arrive at different results. None of these points represents a limitation in a strict sense, but they should be kept in mind when evaluating the topic modeling results.

Concerning the impact of different disciplines on the research field, we only assessed disciplines based on journals' SSCI categories and our manual aggregation of these categories into broad groups of disciplines. However, researchers from various disciplines may publish in journals relevant to their research topics, not just those of their home discipline; for example, communication researchers also publish in psychological journals (e.g., Meier et al., 2016). Accordingly, our statements about disciplinary impact are based on journals', not researchers', disciplinary affiliations. Moreover, 34% of coded journals belonged to more than one discipline, indicating that many journals are themselves somewhat interdisciplinary. Also, while our sample includes over 700 publication outlets, some of these may be duplicates due to slightly different spellings in the different databases searched via EBSCOhost. We only deduplicated outlets with three or more documents in our sample; accordingly, the actual number of unique outlets may be slightly lower. Finally, with regard to our two umbrella constructs, CMC and mental health, we only analyzed the mental health concepts PTH and PWB in detail. A similar analysis of CMC concepts could not be realized, as a consistent typology of CMC concepts is currently missing from the literature and would go beyond the scope of this review.
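The outlet deduplication mentioned above can be approximated by normalizing outlet names before counting. This is a minimal sketch of that idea, using hypothetical spelling variants rather than the review's actual outlet list.

```r
# Hypothetical spelling variants of the same outlet, as returned by different databases
outlets <- c("Cyberpsychology, Behavior and Social Networking",
             "CyberPsychology, Behavior & Social Networking",
             "Computers in Human Behavior")

# Normalize case, punctuation, "&" vs. "and", and whitespace before counting
normalize_outlet <- function(x) {
  x <- tolower(x)
  x <- gsub("&", "and", x, fixed = TRUE)
  x <- gsub("[[:punct:]]", "", x)
  gsub("\\s+", " ", trimws(x))
}

table(normalize_outlet(outlets))   # the two spelling variants collapse into one outlet
```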

Future Research Agenda

Based on this review, we suggest several directions for future research. First, researchers should reflect more on whether their research question implies a relation between CMC and PTH, PWB, or both. Addressing both PWB and PTH appears preferable, as it avoids overlooking potential positive associations between CMC and mental health that are difficult to capture with PTH indicators alone (and vice versa). Second, future research syntheses in this field should treat PTH and PWB in a more detailed manner than was possible here. For instance, research on CMC and mental health could be further differentiated by whether externalizing versus internalizing PTH or hedonic versus eudaimonic PWB is addressed (Meier & Reinecke, 2020). Third, theory-driven research beyond a psychological and clinical (e.g., addiction) paradigm is much needed to achieve a fuller understanding of the complex relationships between CMC and mental health. Fourth, a more in-depth and systematic synthesis of research on some of the broader topics (e.g., "Facebook & SNS use," "relationships & CMC," or "mobile & smartphone use") appears warranted in order to assess how relationships between specific aspects of CMC (e.g., active vs. passive SNS use) and mental health differ. Finally, while our concept analysis focused only on mental health, we encourage researchers to develop analytical frameworks for the various concepts and indicators of CMC that have been studied in relation to mental health. Without a more systematic approach to both umbrella constructs, CMC and mental health, further integration of the fast-growing literature is hampered. We believe that our review represents one step in this direction by providing a first higher-level overview of the emerging research field and by charting its development over the last 20 years.

References

Baker, D. A., & Algorta, G. P. (2016). The relationship between online social networking and depression: A systematic review of quantitative studies. Cyberpsychology, Behavior, and Social Networking, 19(11), 638–648. https://doi.org/10.1089/cyber.2016.0206

102   Adrian Meier ET AL. Bargh, J.  A., & McKenna, K.  Y.  A. (2004). The internet and social life. Annual Review of Psychology, 55, 573–590. https://doi.org/10.1146/annurev.psych.55.090902.141922 Billieux, J., Schimmenti, A., Khazaal, Y., Maurage, P., & Heeren, A. (2015). Are we overpathologizing everyday life? A tenable blueprint for behavioral addiction research. Journal of Behavioral Addictions, 4(3), 119–123. https://doi.org/10.1556/2006.4.2015.009 Blei, D. M., & Lafferty, J. D. (2009). Topic models. In A. N. Srivastava & M. Sahami (Eds.), Text mining: Classification, clustering, and applications (pp. 71–94). Boca Raton, FL: CRC Press. Borenstein, M., Hedges, L. V., Higgins, J. P.T., & Rothstein, H. R. (2009). Introduction to metaanalysis. Chichester, UK: Wiley. Buckingham, D., & Strandgaard Jensen, H. (2012). Beyond “media panics”: Reconceptualising public debates about children and media. Journal of Children and Media, 6(4), 413–429. https://doi.org/10.1080/17482798.2012.740415 Carli, V., Durkee, T., Wasserman, D., Hadlaczky, G., Despalins, R., Kramarz, E., . . . Kaess, M. (2013). The association between pathological internet use and comorbid psychopathology: A systematic review. Psychopathology, 46(1), 1–13. https://doi.org/10.1159/000337971 Chan, M. (2015). Mobile phones and the good life: Examining the relationships among mobile use, social capital and subjective well-being. New Media & Society, 17(1), 96–113. https://doi. org/10.1177/1461444813516836 Choi, M., & Toma, C.  L. (2014). Social sharing through interpersonal media: Patterns and effects on emotional well-being. Computers in Human Behavior, 36, 530–541. https://doi. org/10.1016/j.chb.2014.04.026 Clayton, R. B., Leshner, G., & Almond, A. (2015). The extended iSelf: The impact of iPhone separation on cognition, emotion, and physiology. Journal of Computer-Mediated Communication, 20(2), 119–135. https://doi.org/10.1111/jcc4.12109 Colquhoun, H. L., Levac, D., O’Brien, K. K., Straus, S., Tricco, A. C., Perrier, L., . . . Moher, D. (2014). Scoping reviews: Time for clarity in definition, methods, and reporting. Journal of Clinical Epidemiology, 67(12), 1291–1294. https://doi.org/10.1016/j.jclinepi.2014.03.013 De Chavez, A.  C., Backett-Milburn, K., Parry, O., & Platt, S. (2005). Understanding and researching wellbeing: Its usage in different disciplines and potential for health research and health promotion. Health Education Journal, 64(1), 70–87. https://doi.org/10.1177/ 001789690506400108 Domahidi, E. (2018). The associations between online media use and users’ perceived social resources: A meta-analysis. Journal of Computer-Mediated Communication, 23, 181–200. https://doi.org/10.1093/jcmc/zmy007 Ellison, N. B., Steinfield, C., & Lampe, C. (2007). The benefits of Facebook “friends:” Social capital and college students’ use of online social network sites. Journal of ComputerMediated Communication, 12(4), 1143–1168. Feinerer, I., Hornik, K., & Meyer, D. (2008). Text mining infrastructure in R.  Journal of Statistical Software, 25(5). https://doi.org/10.18637/jss.v025.i05 Forsman, A. K., & Nordmyr, J. (2015). Psychosocial links between Internet use and mental health in later life: A systematic review of quantitative and qualitative evidence. Journal of Applied Gerontology, 36(12), 1471–1518. https://doi.org/10.1177/0733464815595509 Greenspoon, P. J., & Saklofske, D. H. (2001). Toward an integration of subjective well-being and psychopathology. Social Indicators Research, 54(1), 81–108. 
https://doi.org/10.1023/A:1007219227883
Grün, B., & Hornik, K. (2011). Topicmodels: An R package for fitting topic models. Journal of Statistical Software, 40(13). https://doi.org/10.18637/jss.v040.i13

Computer-Mediated Communication and Mental Health   103 Günther, E., & Domahidi, E. (2017). What communication scholars write about: An analysis of 80 years of research in high-impact journals. International Journal of Communication, 11, 3051–3071. Hall, J. A. (2018). When is social media use social interaction? Defining mediated social interaction. New Media & Society, 20(1), 162–179. https://doi.org/10.1177/1461444816660782 Haythornthwaite, C. (2005). Social networks and Internet connectivity effects. Information, Communication & Society, 8(2), 125–147. https://doi.org/10.1080/13691180500146185 Hiltz, S. R., & Turoff, M. (1978). The network nation: Human communication via computer. London: Addison-Wesley. Huang, C. (2010). Internet use and psychological well-being: a meta-analysis. Cyberpsychology, Behavior, and Social Networking, 13(3), 241–249. https://doi.org/10.1089/cyber.2009.0217 Huang, C. (2017). Time spent on social network sites and psychological well-being: A metaanalysis. Cyberpsychology, Behavior, and Social Networking, 20(6), 346–354. Huta, V., & Waterman, A. S. (2014). Eudaimonia and its distinction from hedonia: Developing a classification and terminology for understanding conceptual and operational definitions. Journal of Happiness Studies, 15(6), 1425–1456. https://doi.org/10.1007/s10902-013-9485-0 International Telecommunications Union. (2018). Statistics. Retrieved from https://www.itu. int/en/ITU-D/Statistics/Pages/stat/default.aspx Jensen, J. (1990). Redeeming modernity: Contradictions in media criticism. Newbury Park, CA: Sage. Katz, J. E., & Rice, R. E. (2002). Social consequences of Internet use: Access, involvement, and interaction. Cambridge, MA: MIT Press. Keyes, C. L. M. (2007). Promoting and protecting mental health as flourishing: A complementary strategy for improving national mental health. The American Psychologist, 62(2), 95–108. https://doi.org/10.1037/0003-066X.62.2.95 Kraut, R., Kiesler, S., Boneva, B., Cummings, J., Helgeson, V., & Crawford, A. (2002). Internet paradox revisited. Journal of Social Issues, 58(1), 49–74. https://doi.org/10.1111/ 1540-4560.00248 Kraut, R., Patterson, M., Lundmark, V., Kiesler, S., Mukophadhyay, T., & Scherlis, W. (1998). Internet paradox: A social technology that reduces social involvement and psychological well-being? American Psychologist, 53(9), 1017–1031. https://doi.org/10.1037/ 0003-066X.53.9.1017 Kross, E., Verduyn, P., Demiralp, E., Park, J., Lee, D. S., Lin, N., . . . Sueur, C. (2013). Facebook use predicts declines in subjective well-being in young adults. PLOS One, 8(8), e69841. https://doi.org/10.1371/journal.pone.0069841 Lahey, B. B., Krueger, R. F., Rathouz, P. J., Waldman, I. D., & Zald, D. H. (2017). A hierarchical causal taxonomy of psychopathology across the life span. Psychological Bulletin, 143(2), 142–186. https://doi.org/10.1037/bul0000069 Lee, E.-J., & Oh, S.  Y. (2015). Computer-mediated communication. Retrieved from http:// www.oxfordbibliographies.com/view/document/obo-9780199756841/obo-97801997568410160.xml Liu, D., Ainsworth, S. E., & Baumeister, R. F. (2016). A meta-analysis of social networking online and social capital. Review of General Psychology, 20(4), 369–391. https://doi. org/10.1037/gpr0000091 Maier, D., Waldherr, A., Miltner, P., Wiedemann, G., Niekler, A., Keinert, A., . . . Adam, S. (2018). Applying LDA topic modeling in communication research: Toward a valid and reliable methodology. Communication Methods and Measures, 12(2–3), 93–118. 
https://doi.org/10.1080/19312458.2018.1430754

104   Adrian Meier ET AL. Manning, C.  D., Raghavan, P., & Schütze, H. (2008). Introduction to information retrieval. Cambridge: Cambridge University Press. Mays, N., Roberts, E., & Popay, J. (2001). Synthesising research evidence. In N. Fulop, P. Allen, A.  Clarke, & N.  Black (Eds.), Studying the organisation and delivery of health services: Research methods. London: Routledge. Meier, A., & Reinecke, L. (2020, April 6). Computer-mediated communication, social media, and mental health: A conceptual and empirical meta-review. https://doi.org/10.31234/ osf.io/573ph. Meier, A., Reinecke, L., & Meltzer, C.  E. (2016). “Facebocrastination”? Predictors of using Facebook for procrastination and its effects on students’ well-being. Computers in Human Behavior, 64, 65–76. https://doi.org/10.1016/j.chb.2016.06.011 Mikal, J. P., Rice, R. E., Abeyta, A., & DeVilbiss, J. (2013). Transition, stress and computermediated social support. Computers in Human Behavior, 29(5), A40–A53. https://doi. org/10.1016/j.chb.2012.12.012 Nie, N.  H., Hillygus, D.  S., & Erbring, L. (2002). Internet use, interpersonal relations and sociability: Findings from a detailed time diary study. In B. Wellman & C. Haythornthwaite (Eds.), The Internet in everyday life (pp. 215–243). Malden, MA: Blackwell. Peng, T.-Q., Zhang, L., Zhong, Z.-J., & Zhu, J. J. H. (2013). Mapping the landscape of internet studies: Text mining of social science journal articles 2000–2009. New Media & Society, 15(5), 644–664. https://doi.org/10.1177/1461444812462846 Pham, M. T., Rajić, A., Greig, J. D., Sargeant, J. M., Papadopoulos, A., & McEwen, S. A. (2014). A scoping review of scoping reviews: Advancing the approach and enhancing the consistency. Research Synthesis Methods, 5(4), 371–385. https://doi.org/10.1002/jrsm.1123 Przybylski, A.  K., & Weinstein, N. (2017). A large-scale test of the goldilocks hypothesis: Quantifying the relations between digital-screen use and the mental well-being of adolescents. Psychological Science, 28(2), 204–215. https://doi.org/10.1177/0956797616678438 R Core Team. (2018). R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. Retrieved from https://www.R-project.org/ Rauchfleisch, A. (2017). The public sphere as an essentially contested concept: A co-citation analysis of the last 20 years of public sphere research. Communication and the Public, 2(1), 3–18. https://doi.org/10.1177/2057047317691054 Reinecke, L., & Oliver, M. B. (2017). Media use and well-being: Status quo and open questions. In L. Reinecke & M. B. Oliver (Eds.), The Routledge handbook of media use and well-being: International perspectives on theory and research on positive media effects (pp. 3–13). New York, NY: Routledge. Rice, R. E. (1980). The impacts of computer-mediated organizational and interpersonal communication. In M.  Williams (Ed.), Annual review of information science and technology (Vol. 15). Whiteplains, NY: Knowledge Industry Publications. Ryan, R. M., & Deci, E. L. (2001). On happiness and human potentials: A review of research on hedonic and eudaimonic well-being. Annual Review of Psychology, 52(1), 141–166. Song, H., Zmyslinski-Seelig, A., Kim, J., Drent, A., Victor, A., Omori, K., & Allen, M. (2014). Does Facebook make you lonely? A meta analysis. Computers in Human Behavior, 36, 446–452. https://doi.org/10.1016/j.chb.2014.04.011 Tokunaga, R. S., & Rains, S. A. (2010). 
An evaluation of two characterizations of the relationships between problematic Internet use, time spent using the Internet, and psychosocial problems. Human Communication Research, 36(4), 512–545. https://doi.org/10.1111/j.1468-2958.2010.01386.x

Computer-Mediated Communication and Mental Health   105 Toma, C. L., & Hancock, J. T. (2013). Self-affirmation underlies Facebook use. Personality & Social Psychology Bulletin, 39(3), 321–331. https://doi.org/10.1177/0146167212474694 Tsafnat, G., Glasziou, P., Choong, M. K., Dunn, A., Galgani, F., & Coiera, E. (2014). Systematic review automation technologies. Systematic Reviews, 3, 74. https://doi.org/10.1186/ 2046-4053-3-74 Turkle, S. (2011). Alone together: Why we expect more from technology and less from each other. New York, NY: Basic Books. Twenge, J.  M., Joiner, T.  E., Rogers, M.  L., & Martin, G.  N. (2018). Increases in depressive symptoms, suicide-related outcomes, and suicide rates among U.S. adolescents after 2010 and links to increased new media screen time. Clinical Psychological Science, 6(1), 3–17. https://doi.org/10.1177/2167702617723376 Twenge, J. M., Martin, G. N., & Campbell, W. K. (2018). Decreases in psychological well-being among American adolescents after 2012 and links to screen time during the rise of smartphone technology. Emotion. Advance online publication. https://doi.org/10.1037/emo0000403 Twomey, C., & O’Reilly, G. (2017). Associations of self-presentation on Facebook with mental health and personality variables: A systematic review. Cyberpsychology, Behavior and Social Networking, 20(10), 587–595. https://doi.org/10.1089/cyber.2017.0247 Valkenburg, P.  M., & Peter, J. (2009). Social consequences of the Internet for adolescents. Current Directions in Psychological Science, 18(1), 1–5. https://doi.org/10.1111/j.14678721.2009.01595.x Van der Schuur, W. A., Baumgartner, S. E., Sumter, S. R., & Valkenburg, P. M. (2015). The consequences of media multitasking for youth: A review. Computers in Human Behavior, 53, 204–215. https://doi.org/10.1016/j.chb.2015.06.035 Verduyn, P., Ybarra, O., Résibois, M., Jonides, J., & Kross, E. (2017). Do social network sites enhance or undermine subjective well-being? A critical review. Social Issues and Policy Review, 11(1), 274–302. https://doi.org/10.1111/sipr.12033 Walther, J. B. (1992). Interpersonal effects in computer-mediated interaction. Communication Research, 19(1), 52–90. https://doi.org/10.1177/009365092019001003 Walther, J. B., & Parks, M. R. (2002). Cues filtered out, cues filtered in: Computer-mediated communication and relationships. In M. L. Knapp & J. A. Daly (Eds.), Handbook of interpersonal communication (3rd ed., pp. 529–563). Thousand Oaks, CA: Sage. Wickham, H. (2009). ggplot2: Elegant graphics for data analysis. New York: Springer. Zuccala, A. (2006). Modeling the invisible college. Journal of the American Society for Information Science and Technology, 57(2), 152–168. https://doi.org/10.1002/asi.20256

Appendix: Publications Analyzed from the Topic Modeling Dataset (N = 1780) Used for Topic Description

Babic, M. J., Smith, J. J., Morgan, P. J., Eather, N., Plotnikoff, R. C., & Lubans, D. R. (2017). Longitudinal associations between changes in screen-time and mental health outcomes in adolescents. Mental Health and Physical Activity, 12, 124–131. https://doi.org/10.1016/j.mhpa.2017.04.001
Barber, L. K., & Santuzzi, A. M. (2017). Telepressure and college student employment: The costs of staying connected across social contexts. Stress and Health, 33(1), 14–23. https://doi.org/10.1002/smi.2668
Blease, C. R. (2015). Too many 'friends,' too few 'likes'? Evolutionary psychology and 'Facebook depression'. Review of General Psychology, 19(1), 1–13. https://doi.org/10.1037/gpr0000030

106   Adrian Meier ET AL. Caplan, S. E. (2003). Preference for online social interaction: A theory of problematic Internet use and psychosocial well-being. Communication Research, 30(6), 625–648. https://doi. org/10.1177/0093650203257842 Carli, V., Durkee, T., Wasserman, D., Hadlaczky, G., Despalins, R., Kramarz, E., . . . Kaess, M. (2013). The association between pathological internet use and comorbid psychopathology: A systematic review. Psychopathology, 46(1), 1–13. https://doi.org/10.1159/000337971 Carmago, M.  R. (2008). A grounded theory study of the relationship between e-mail and burnout. Information Research, 13(4), paper 383. Chen, L., Ho, S. S., & Lwin, M. O. (2017). A meta-analysis of factors predicting cyberbullying perpetration and victimization: From the social cognitive and media effects approach. New Media & Society, 19(8), 1194–1213. https://doi.org/10.1177/1461444816634037 Coyne, S. M., Padilla-Walker, L. M., & Holmgren, H. G. (2018). A six-year longitudinal study of texting trajectories during adolescence. Child Development, 89(1), 58–65. https://doi. org/10.1111/cdev.12823 Desjarlais, M., & Willoughby, T. (2010). A longitudinal study of the relation between adolescent boys’ and girls’ computer use with friends and friendship quality: Support for the social compensation or the rich-get-richer hypothesis? Computers in Human Behavior, 26(5), 896–905. https://doi.org/10.1016/j.chb.2010.02.004 Dienlin, T., Masur, P. K., & Trepte, S. (2017). Reinforcement or displacement? The reciprocity of FtF, IM, and SNS communication and their effects on loneliness and life satisfaction. Journal of Computer-Mediated Communication, 22(2), 71–87. https://doi.org/10.1111/ jcc4.12183 Floros, G., Siomos, K., Stogiannidou, A., Giouzepas, I., & Garyfallos, G. (2014). The relationship between personality, defense styles, Internet addiction disorder, and psychopathology in college students. Cyberpsychology, Behavior and Social Networking, 17(10), 672–676. https://doi.org/10.1089/cyber.2014.0182 Giumetti, G. W., McKibben, E. S., Hatfield, A. L., Schroeder, A. N., & Kowalski, R. M. (2012). Cyber incivility @ work: The new age of interpersonal deviance. Cyberpsychology, Behavior and Social Networking, 15(3), 148–154. https://doi.org/10.1089/cyber.2011.0336 Greyling, T. (2018). Internet access and its relationship to subjective well-being in a developing region. South African Journal of Economic and Management Sciences, 21(1), a1841. https://doi.org/10.4102/sajems.v21i1.1841 Hall, J. (2017). The experience of mobile entrapment in daily life. Journal of Media Psychology, 29(3), 148–158. https://doi.org/10.1027/1864-1105/a000228 Hase, C. N., Goldberg, S. B., Smith, D., Stuck, A., & Campain, J. (2015). Impacts of traditional bullying and cyberbullying on the mental health of middle school and high school students. Psychology in the Schools, 52(6), 607–617. https://doi.org/10.1002/pits.21841 High, A. C., & Caplan, S. E. (2009). Social anxiety and computer-mediated communication during initial interactions: Implications for the hyperpersonal perspective. Computers in Human Behavior, 25(2), 475–482. https://doi.org/10.1016/j.chb.2008.10.011 Hoffner, C.  A., Lee, S., & Park, S.  J. (2016). “I miss my mobile phone!”: Self-expansion via mobile phone and responses to phone loss. New Media & Society, 18(11), 2452–2468. https://doi.org/10.1177/1461444815592665 Hong, F.-Y., & Cheng, K.-T. (2018). 
Correlation between university students’ online trolling behavior and online trolling victimization forms, current conditions, and personality traits. Telematics and Informatics, 35(2), 397–405. https://doi.org/10.1016/j.tele.2017.12.016

Computer-Mediated Communication and Mental Health   107 Huang, C. (2017). Time spent on social network sites and psychological well-being: A metaanalysis. Cyberpsychology, Behavior, and Social Networking, 20(6), 346–354. Ihm, J., & Hsieh, Y. P. (2015). The implications of information and communication technology use for the social well-being of older adults. Information, Communication & Society, 18(10), 1123–1138. https://doi.org/10.1080/1369118X.2015.1019912 Ivarsson, L., & Larsson, P. (2011). Personal Internet usage at work: A source of recovery. Journal of Workplace Rights, 16(1), 63–81. https://doi.org/10.2190/WR.16.1.e Kardefelt-Winther, D. (2014). A conceptual and methodological critique of internet addiction research: Towards a model of compensatory internet use. Computers in Human Behavior, 31, 351–354. https://doi.org/10.1016/j.chb.2013.10.059 Kraut, R., Patterson, M., Lundmark, V., Kiesler, S., Mukophadhyay, T., & Scherlis, W. (1998). Internet paradox: A social technology that reduces social involvement and psychological wellbeing? American Psychologist, 53(9), 1017–1031. https://doi.org/10.1037/0003-066X.53.9.1017 Kross, E., Verduyn, P., Demiralp, E., Park, J., Lee, D. S., Lin, N., . . . Sueur, C. (2013). Facebook use predicts declines in subjective well-being in young adults. PLOS ONE, 8(8), e69841. https://doi.org/10.1371/journal.pone.0069841 Kubiszewski, V., Fontaine, R., Potard, C., & Auzoult, L. (2015). Does cyberbullying overlap with school bullying when taking modality of involvement into account? Computers in Human Behavior, 43, 49–57. https://doi.org/10.1016/j.chb.2014.10.049 Lohmann, S. (2015). Information technologies and subjective well-being: Does the Internet raise material aspirations? Oxford Economic Papers, 67(3), 740–759. https://doi.org/10.1093/ oep/gpv032 Morey, J. N., Gentzler, A. L., Creasy, B., Oberhauser, A. M., & Westerman, D. (2013). Young adults’ use of communication technology within their romantic relationships and associations with attachment style. Computers in Human Behavior, 29(4), 1771–1778. https://doi. org/10.1016/j.chb.2013.02.019 Nabi, R.  L., Prestin, A., & So, J. (2013). Facebook friends with (health) benefits? Exploring social network site use and perceptions of social support, stress, and well-being. Cyberpsychology, Behavior, and Social Networking, 16(10), 721–727. https://doi.org/10.1089/ cyber.2012.0521 Nie, P., Sousa-Poza, A., & Nimrod, G. (2017). Internet use and subjective well-being in China. Social Indicators Research, 132(1), 489–516. https://doi.org/10.1007/s11205-015-1227-8 Ninaus, K., Diehl, S., Terlutter, R., Chan, K., & Huang, A. (2015). Benefits and stressors— Perceived effects of ICT use on employee health and work stress: An exploratory study from Austria and Hong Kong. International Journal of Qualitative Studies on Health and WellBeing, 10, 28838. https://doi.org/10.3402/qhw.v10.28838 Oldmeadow, J. A., Quinn, S., & Kowert, R. (2013). Attachment style, social skills, and Facebook use amongst adults. Computers in Human Behavior, 29(3), 1142–1149. https://doi. org/10.1016/j.chb.2012.10.006 Park, N., Lee, S., & Chung, J. E. (2016). Uses of cellphone texting: An integration of motivations, usage patterns, and psychological outcomes. Computers in Human Behavior, 62, 712–719. https://doi.org/10.1016/j.chb.2016.04.041 Pearson, A.  L., Mack, E., & Namanya, J. (2017). Mobile phones and mental well-being: Initial evidence suggesting the importance of staying connected to family in rural, remote communities in Uganda. 
PLOS One, 12(1), e0169819. https://doi.org/10.1371/journal. pone.0169819

108   Adrian Meier ET AL. Reed, L. A., Tolman, R. M., Ward, L. M., & Safyer, P. (2016). Keeping tabs: Attachment anxiety and electronic intrusion in high school dating relationships. Computers in Human Behavior, 58, 259–268. https://doi.org/10.1016/j.chb.2015.12.019 Reid, F. J. M., & Reid, D. J. (2010). The expressive and conversational affordances of mobile messaging. Behaviour & Information Technology, 29(1), 3–22. https://doi.org/10.1080/ 01449290701497079 Rosen, L., Carrier, L. M., Miller, A., Rokkum, J., & Ruiz, A. (2016). Sleeping with technology: Cognitive, affective, and technology usage predictors of sleep problems among college students. Sleep Health, 2(1), 49–56. https://doi.org/10.1016/j.sleh.2015.11.003 Rotondi, V., Stanca, L., & Tomasuolo, M. (2017). Connecting alone: Smartphone use, quality of social interactions and well-being. Journal of Economic Psychology, 63, 17–26. https://doi. org/10.1016/j.joep.2017.09.001 Russell, E., Woods, S. A., & Banks, A. P. (2017). Examining conscientiousness as a key resource in resisting email interruptions: Implications for volatile resources and goal achievement. Journal of Occupational and Organizational Psychology, 90(3), 407–435. https://doi. org/10.1111/joop.12177 Schouten, A. P., Valkenburg, P. M., & Peter, J. (2007). Precursors and underlying processes of adolescents’ online self-disclosure: Developing and testing an “Internet-attribute-perception” model. Media Psychology, 10(2), 292–315. https://doi.org/10.1080/15213260701375686 Schultze-Krumbholz, A., Jäkel, A., Schultze, M., & Scheithauer, H. (2012). Emotional and behavioural problems in the context of cyberbullying: A longitudinal study among German adolescents. Emotional and Behavioural Difficulties, 17(3-4), 329–345. https://doi.org/10.108 0/13632752.2012.704317 Selfhout, M. H. W., Branje, S. J. T., Delsing, M., ter Bogt, T. F. M., & Meeus, W. H. J. (2009). Different types of Internet use, depression, and social anxiety: The role of perceived friendship quality. Journal of Adolescence, 32(4), 819–833. https://doi.org/10.1016/j. adolescence.2008.10.011 Shakya, H.  B., & Christakis, N.  A. (2017). Association of Facebook use with compromised well-being: A longitudinal study. American Journal of Epidemiology, 185(3), 203–211. https:// doi.org/10.1093/aje/kww189 Shaw, L. H., & Gant, L. M. (2002). In defense of the Internet: The relationship between Internet communication and depression, loneliness, self-esteem, and perceived social support. CyberPsychology & Behavior, 5(2), 157–171. https://doi.org/10.1089/109493102753770552 Sjursø, I. R., Fandrem, H., & Roland, E. (2014). Emotional problems in traditional and cyber victimization. Journal of School Violence, 15(1), 114–131. https://doi.org/10.1080/15388220.20 14.996718 Song, H., Zmyslinski-Seelig, A., Kim, J., Drent, A., Victor, A., Omori, K., & Allen, M. (2014). Does Facebook make you lonely? A meta analysis. Computers in Human Behavior, 36, 446–452. https://doi.org/10.1016/j.chb.2014.04.011 Sonnentag, S., Reinecke, L., Mata, J., & Vorderer, P. (2018). Feeling interrupted–being responsive: How online messages relate to affect at work. Journal of Organizational Behavior, 39(3), 369–383. https://doi.org/10.1002/job.2239 Steers, M.-L.  N. (2016). ‘It’s complicated’: Facebook’s relationship with the need to belong and  depression. Current Opinion in Psychology, 9, 22–26. https://doi.org/10.1016/j. copsyc.2015.10.007 Thomée, S., Eklöf, M., Gustafsson, E., Nilsson, R., & Hagberg, M. (2007). 
Prevalence of perceived stress, symptoms of depression and sleep disturbances in relation to information and communication technology (ICT) use among young adults: An explorative prospective

Computer-Mediated Communication and Mental Health   109 study. Computers in Human Behavior, 23(3), 1300–1321. https://doi.org/10.1016/j. chb.2004.12.007 Tokunaga, R. S. (2014). A unique problem or the manifestation of a preexisting disorder? The mediating role of problematic Internet use in the relationships between psychosocial ­problems and functional impairment. Communication Research, 41(4), 531–560. https://doi. org/10.1177/0093650212450910 Tromholt, M. (2016). The Facebook experiment: Quitting Facebook leads to higher levels of well-being. Cyberpsychology, Behavior, and Social Networking, 19(11), 661–666. https://doi. org/10.1089/cyber.2016.0259 Tseng, S.-F., & Hsieh, Y. P. (2015). The implications of networked individualism for social participation. American Behavioral Scientist, 59(9), 1157–1172. https://doi.org/10.1177/ 0002764215580620 Utz, S., & Breuer, J. (2017). The relationship between use of social network sites, online social support, and well-being: Results from a six-wave longitudinal study. Journal of Media Psychology, 29(3), 115–125. https://doi.org/10.1027/1864-1105/a000222 Valkenburg, P. M., & Peter, J. (2007). Who visits online dating sites? Exploring some characteristics of online daters. CyberPsychology & Behavior, 10(6), 849–852. https://doi. org/10.1089/cpb.2007.9941 Van den Eijnden, R.  J.  J.  M., Meerkerk, G.-J., Vermulst, A.  A., Spijkerman, R., & Engels, R.  C.  M.  E. (2008). Online communication, compulsive Internet use, and psychosocial well-being among adolescents: A longitudinal study. Developmental Psychology, 44(3), 655–665. https://doi.org/10.1037/0012-1649.44.3.655 Verduyn, P., Lee, D. S., Park, J., Shablack, H., Orvell, A., Bayer, J., . . . Kross, E. (2015). Passive Facebook usage undermines affective well-being: Experimental and longitudinal evidence. Journal of Experimental Psychology: General, 144(2), 480–488. https://doi.org/10.1037/ xge0000057 Vincent, J. (2006). Emotional attachment and mobile phones. Knowledge, Technology & Policy, 19(1), 39–44. https://doi.org/10.1007/s12130-006-1013-7 Vranjes, I., Baillien, E., Vandebosch, H., Erreygers, S., & de Witte, H. (2018). When workplace bullying goes online: Construction and validation of the inventory of cyberbullying acts at work (ICA-W). European Journal of Work and Organizational Psychology, 27, 28–39. https:// doi.org/10.1080/1359432X.2017.1363185 Weidman, A.  C., Fernandez, K.  C., Levinson, C.  A., Augustine, A.  A., Larsen, R.  J., & Rodebaugh, T.  L. (2012). Compensatory internet use among individuals higher in social anxiety and its implications for well-being. Personality and Individual Differences, 53(3), 191–195. https://doi.org/10.1016/j.paid.2012.03.003. Weisskirch, R. S., & Delevi, R. (2012). Its ovr b/n u n me: Technology use, attachment styles, and gender roles in relationship dissolution. Cyberpsychology, Behavior and Social Networking, 15(9), 486–490. https://doi.org/10.1089/cyber.2012.0169 Widyanto, L., & Griffiths, M. (2006). ‘Internet addiction’: A critical review. International Journal of Mental Health and Addiction, 4(1), 31–51. https://doi.org/10.1007/s11469-006-9009-9 Woods, H. C., & Scott, H. (2016). #Sleepyteens: Social media use in adolescence is associated with poor sleep quality, anxiety, depression and low self-esteem. Journal of Adolescence, 51, 41–49. https://doi.org/10.1016/j.adolescence.2016.05.008 Wright, M.  F. (2015). Cyber victimization and perceived stress. Youth & Society, 47(6), 789–810. 
https://doi.org/10.1177/0044118X14537088
Wu, X.-S., Zhang, Z.-H., Zhao, F., Wang, W.-J., Li, Y.-F., Bi, L., . . . Sun, Y.-H. (2016). Prevalence of Internet addiction and its association with social support and other related factors among adolescents in China. Journal of Adolescence, 52, 103–111. https://doi.org/10.1016/j.adolescence.2016.07.012
Xie, X., Zhao, F., Xie, J., & Lei, L. (2016). Symbolization of mobile phone and life satisfaction among adolescents in rural areas of China: Mediating of school-related relationships. Computers in Human Behavior, 64, 694–702. https://doi.org/10.1016/j.chb.2016.07.053
Young, K. S., & Rodgers, R. C. (1998). The relationship between depression and Internet addiction. CyberPsychology & Behavior, 1(1), 25–28. https://doi.org/10.1089/cpb.1998.1.25

Chapter 5

Digital Inclusion and Women's Health and Well-Being in Rural Communities

Sharon Wagg, Louise Cooke, and Boyka Simeonova

Introduction

Digital inclusion is of global importance, as government digital-by-default agendas increasingly recognize the need for society to possess strong digital skills and capabilities to benefit fully from living in a digital world. Yet a global gender digital divide exists in which women lack access to information and digital skills, particularly in rural areas (IFLA & TASCHA, 2017). Women are 14% less likely to own a mobile phone than men in low- and middle-income countries (GSMA, 2015); globally, the proportion of women using the Internet is 12% lower than that of men (ITU, 2017a); and while the gender gap in Internet access has narrowed in most regions since 2013, it has widened in Africa, where the proportion of women using the Internet is 25% lower than the proportion of men (ITU, 2017a, p. 3). Digital inclusion initiatives around the world, designed to provide access and support the development of digital skills, are critical to bridging the digital divide in local communities (Mervyn et al., 2014). However, the multiple factors that contribute to digital exclusion are complex and make the task of implementing workable digital inclusion solutions particularly challenging for policymakers (Bach et al., 2013).

Information literacy helps people make informed choices and decisions about their lives, including the health and well-being of individuals and their families (CILIP, 2018, p. 5). However, as argued by Dunn (2013), "insufficient attention is being paid to the urgency of information literacy as a key component to any strategy to redress the digital divide" (p. 326), potentially leaving those newly connected to the Internet, or with low information literacy, vulnerable to poor information content and choices. Anderson and Johnston (2016) argue that without the development of information literacy, "the benefits of digital participation will be significantly diminished" (p. 8). Challenges to access and meaningful use of online information underline the necessity of increased levels of information literacy. "While this may affect both men and women, the challenges are often greater for women (particularly in developing countries) because past information isolation leaves them less equipped to deal with these challenges" (IFLA & TASCHA, 2017, p. 80).

What Is Digital Inclusion?

Broadly, digital inclusion refers to the activities necessary to ensure that all individuals and communities, including the most disadvantaged, have access to and can make meaningful use of information and communication technologies (ICTs). Digital inclusion activities essentially include five key elements: (1) affordable broadband Internet service, (2) Internet-enabled devices, (3) quality technical support, (4) applications and online content designed to enable and encourage self-sufficiency, participation, and collaboration, and (5) access to digital skills training (NDIA, 2017, n.p.). Such activities are driven by governments to address the digital divide (those without the access, skills, or motivation to use ICTs) and to implement the digital-by-default agenda (the drive to replace services delivered through face-to-face, telephone, and paper-based interactions with web-based services), and they are delivered by a plethora of organizations and community partners (ITU, 2017b; Rhinesmith, 2016).

Digital inclusion research emerged from research on the digital divide, a topic widely accepted as a complex and dynamic issue that continues to evolve, particularly as ICTs evolve and diffuse (Jaeger et al., 2012; Van Dijk, 2005). Digital inclusion is addressed by researchers across various disciplines, but compared with the established area of research on the digital divide, digital inclusion research is relatively new (Jaeger et al., 2012). Indeed, the Rapid review of evidence for basic digital skills (McGillivray et al., 2017) concluded that there is a notable dearth of academic research on digital inclusion solutions and initiatives, particularly in relation to the roles and responsibilities that digital inclusion intermediaries and actors play. As with research on the digital divide, digital inclusion is a complex area of enquiry and suffers from conceptual inconsistencies and dichotomies that lead to ambiguities in understanding why and what it takes to be included in the information society (Nemer, 2015).

What Is Information Literacy?

CILIP, the Library and Information Association, defines information literacy as "the ability to think critically and make balanced judgements about any information we find and use. It empowers us as citizens to develop informed views and to engage fully with society" (CILIP, 2018, p. 3). This definition relates to information in all its forms, including digital and online, reinforcing the relevance of, and need to consider, information literacy when using and accessing the Internet for online information (Anderson & Johnston, 2016; CILIP, 2018; Dunn, 2013). While some scholars advocate information literacy as a set of skills (Andretta, 2005; Burke, 2010), others advocate information literacy as a way of learning (Kuhlthau, 1993) or as an appreciation of the complex ways of interacting with information (Bruce, 2000, p. 97).

Yet information literacy as a research concept has traditionally been siloed within the library and information science sector. While there is a significant amount of information literacy research within educational (Corrall, 2008; Secker & Coonan, 2013) and workplace (Lloyd, 2010) settings, and an emerging body of research on information literacy in everyday life contexts (Martzoukou & Abdi, 2017), information literacy research within community settings (relevant to digital inclusion) is barely recognized as a research area (Hepworth & Walton, 2013). However, the CILIP definition emphasizes how information literacy is relevant to everyone in a wide variety of contexts, specifically the contexts of everyday life, health, citizenship, education, and the workplace (Secker, 2018), and as such makes information literacy relevant to digital inclusion and an essential part of this review.

Women's Health and Well-Being in Rural Communities

The importance of digital inclusion and information literacy has been emphasized in a few areas, including health and well-being (Ferreira et al., 2016; Park, 2017). It has further been emphasized that access to online services could lead to improved health and well-being, particularly in rural areas (Freeman et al., 2016; Hart et al., 2004). However, the specific benefits of digital inclusion and information literacy for women's health and well-being in rural communities have not been explicated. Therefore, this review examines the literature to outline the specific benefits of digital inclusion initiatives for women's health and well-being in rural communities.

Rationale for Review

This systematic literature review considers research that focuses on the information experiences of women, specifically those who were previously digitally excluded or limited users of the Internet, especially in rural communities, and who have benefitted from the support of digital inclusion initiatives and technology. The review provides an opportunity to unpack the complexity of this area of enquiry by problematizing the concept of digital inclusion; exploring if and how digital inclusion has been linked with the concepts of information literacy and digital skills training; providing insight into the role of digital inclusion in women's health and well-being in rural communities; and revealing tensions and contradictions within digital inclusion practice.

To guide this systematic literature review, the following two questions are addressed: (1) What role do digital inclusion initiatives play with regard to women's health and well-being in rural communities? (2) How have the concepts of digital inclusion and information literacy been linked with regard to digital inclusion skills training? The chapter concludes with an agenda for future research within the realms of digital inclusion and information literacy. The chapter includes the following sections: an outline of the methodology of the systematic literature review; a description of the reviewed literature; the findings from the selected papers (with respect to theory and methodologies, terminology, approaches to digital inclusion initiatives, digital inclusion training, and digital inclusion, information literacy, health, and well-being); a brief discussion; and a conclusion.

Methods

The review was conducted on journal articles (excluding conference proceedings, PhD theses, and book chapters) reporting primary research, published worldwide in English, and sourced from Web of Science and Scopus. Search terms included the phrases "information literacy" and "digital inclusion," combined with the terms rural, gender, health, and well-being appearing in the topic. The search yielded 194 results which, following the exclusion of conference proceedings, duplicates, irrelevant articles, and non-English-language items, were refined to a final set of 66 journal articles. Articles were identified and selected on the basis of their relevance to digital inclusion and women's health and well-being in rural communities and their links to the concept of information literacy within that context. Due to the multidisciplinary nature of the topic, articles were identified across different research domains such as information science, educational research, computer science, and the broader field of social science research.

Drawing on the researcher's previous experience in digital inclusion and librarianship, a small collection of relevant grey literature (16 items) was also selected to provide richness, context, and currency to the review. These items were predominantly reports published by third-sector, corporate, and public policy organizations, such as Development and Access to Information (IFLA & TASCHA, 2017), the Lloyds Bank consumer digital index (Lloyds, 2017), and "Smartphone by default" internet users (Ofcom, 2016). Grey literature is cited hereafter with an asterisk (*) to differentiate it from journal articles.

The final set of materials (n = 82), comprising 66 journal articles and 16 grey literature items, was coded using thematic analysis. This first involved a general categorization of the articles into a number of foci relevant to digital inclusion and information literacy, such as Internet access, digital skills, social inclusion, and learning. The second level of analysis involved the meticulous reading of the texts in order to identify and refine themes and subthemes. Through this process the following themes and subthemes emerged: Theory and Methodologies; Terminology (including subthemes on Digital Inclusion, Information Literacy, and Rurality); Approaches to Digital Inclusion (including subthemes on Differentiation of Digital Inclusion Initiatives, Examples of Digital Inclusion Initiatives Intended for Women, the Use of Mobile Technology in Digital Inclusion, and Digital Inclusion Frameworks, Measurements, and Evaluations); Digital Inclusion Training; and Digital Inclusion, Information Literacy, Health, and Well-Being. Although all the papers were coded, for conciseness not every paper is referred to in the text of the analysis; a supplementary reference list provides the complete set of analyzed journal articles and grey literature.
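As a rough illustration of the search and screening steps described above, the sketch below encodes the inclusion criteria against a hypothetical record set; the query string and the records are illustrative assumptions, not the exact syntax or data used in Web of Science or Scopus.

```r
# Illustrative Boolean query combining the concept and context terms described above
query <- paste(
  '("information literacy" OR "digital inclusion")',
  'AND (rural OR gender OR health OR "well-being")'
)

# Hypothetical retrieved records; fields mirror the screening criteria in the text
records <- data.frame(
  title    = c("Digital inclusion and rural women's health",
               "Telecentre adoption in rural Malawi",
               "Conference paper on e-health literacy"),
  type     = c("journal article", "journal article", "conference proceedings"),
  language = c("English", "English", "English"),
  stringsAsFactors = FALSE
)

# Keep English-language journal articles only; duplicates and irrelevant items
# would be removed in a further manual screening step
included <- subset(records, type == "journal article" & language == "English")
nrow(included)   # in the review itself, 194 retrieved records were reduced to 66
```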

Description of the Reviewed Literature

The review identified a number of key themes and relationships that paint a complex landscape of enquiry, scope for critique, and opportunities for further research. The journal articles covered a range of demographic groups, with only a limited number focusing specifically on women. Indeed, only a fraction of the academic studies sourced, such as Freeman et al. (2016), Jiménez-Cortés et al. (2015), Martínez-Cantos (2017), Potnis (2015), Rashid (2016), and Rebollo and Vico (2014), specifically link digital inclusion and women's health and well-being in rural communities. The majority of the journal articles tended to focus on the digital divide (Adhikari et al., 2016) and digital inclusion initiatives across a range of sub-groups in developing countries (Correa & Pavez, 2016) and developed countries (Freeman & Park, 2015; Shade, 2014; Turkalj et al., 2013); the development of information literacy (Papen, 2013; Yu et al., 2017), health information literacy (Enwald et al., 2016; Niemelä et al., 2012), or digital literacy skills (Hughes et al., 2017); gender differences in attitudes towards and use of ICTs and the Internet (Singh, 2017); and the relationships between digital inclusion, digital inequalities, and social inclusion (Park, 2017).

Journal articles related to information literacy tended to come from the discipline of information science, although researchers in other fields used varying terminology, such as multiliteracy, transliteracy, or digital literacy, to describe aspects of information literacy (Aires, 2014). In comparison, journal articles related to digital inclusion came from a wider selection of disciplines, such as ICT for Development (ICT4D), Human Computer Interaction, Geography, Education, Health, Rural Studies, and Information Science. There was only limited crossover between the concepts of digital inclusion and information literacy. Journal articles related to information literacy tended to focus on effective use of the Internet (Berger & Croll, 2012) or Internet/technology adoption (Chiu & Liu, 2017; Yu et al., 2017). Journal articles on digital inclusion, by contrast, drew on a plethora of vocabulary related to digital skills and literacies, technology, and infrastructure, and their angle was often influenced by the research discipline of the journal. For example, journal articles from Computer Science and ICT4D tended to have more of a bias towards digital infrastructure, technology, and access (Ferreira et al., 2016; Whitney et al., 2011), whereas Geography focused more on rurality (Roberts et al., 2015) and Information Science on digital skills and motivation (Thompson & Paul, 2016). Journal articles referred to a plethora of organizations where people would go to access computers and the Internet, such as public libraries (Fourie & Meyer, 2016; Real et al., 2014); community centers, cybercafés, and local agencies (Berger & Croll, 2012); telecenters (Ferreira, 2016; Kapondera & Hart, 2016); and education centers and schools (Salinas & Sánchez, 2009; Wei et al., 2013). Bertot et al. (2014) state that public libraries were often the only providers of free broadband Internet service and computer terminals for communities.

Overall, the limited number of journal articles specifically on the review topic highlights that there is little academic research on the role of digital inclusion in women's health and well-being in rural communities. While the majority of the journal articles focused more broadly around the subject of the review, academic research on this topic appears fragmented: research is spread across a range of disciplines, and the focus of the articles, their theoretical stance, and the methods used vary, thus potentially hampering the development of a coherent body of work (Meijer & Bekkers, 2015). The inclusion of some grey literature alongside the academic literature was therefore essential in order to provide further understanding, richness, and currency. The review thus draws on interdisciplinary research in the area and on the grey literature, while highlighting gaps and setting an agenda for future research.

Theory and Methods

As Table 5.1 summarizes, the studies used a variety of qualitative, quantitative, and mixed methods. While the review highlights some use of theory, only a very small number of journal articles (8 of the 66) used any underpinning theory. Apart from Diffusion of Innovation theory (Rogers, 2003), which appears in two articles, each of the other theories was used in only one paper. Activity Theory is discussed by Aires (2014) to explore the opinions of parents and teachers on the Magellan (Magalhães) digital inclusion initiative in Portugal, and to investigate common understandings and contradictions in the dissemination of digital technologies and digital inclusion in families and schools in rural communities. Diffusion of Innovation (DOI) theory is used in two articles: Correa et al. (2017) use elements of DOI combined with Van Dijk's (2005) Relational/Network approach to understanding digital inclusion, where consideration of people's context, position in a community, resources, and social networks is necessary to understand their adoption of ICTs; and Kapondera and Hart (2016) invoke DOI as a theoretical framework to examine the factors influencing the use of telecentres in rural areas by means of a case study of Lupaso Community Telecentre, in a remote region of Malawi. Potnis (2015) employs the Global Model of Human Information Behavior as a conceptual model using three constituent constructs—(1) context of information needs, (2) information-seeking behavior, and (3) information processing and use—to examine the information use of poor female mobile phone users in rural India. Hughes et al. (2017) use Informed Learning theory to underpin the development of a new framework to support digital literacy learning through social living labs, examined through a voluntary community organization in North Queensland, Australia. Madon et al. (2009) apply Institutional theory to analyze three digital inclusion projects to identify processes of institutionalization crucial to the long-term value, sustainability, and scalability of digital inclusion projects. Yu et al. (2017) use Media Richness theory to discover the psychological factors that influence ICT adoption behavior of residents in a rural village in Taiwan. Finally, Structuration theory is used by Correa and Pavez (2016) to explore Internet adoption in isolated rural communities in remote villages in Chile, considering people's capabilities to choose what they value (i.e., psychological resources, attitudes toward technologies) and social structures (social institutions, cultural norms, and social context).

Table 5.1  Range of Theories and Methods Identified in Review

Theories:
• Activity theory (1)
• Diffusion of innovation theory (2)
• Global model of human information behaviour (1)
• Informed learning theory (1) *
• Institutional theory (1)
• Media richness theory (1)
• Relational/network approach (1) *
• Structuration theory (1)

Methods:
• Action research
• Case study
• Ethnography
• Interviews
• Literature reviews
• Observations
• Questionnaire surveys

Note: (#) = number of papers using that theory (n = 8 papers); * = same paper.

Terminology

Due to the interdisciplinary nature of the review topic, a theme emerged from the journal articles around the need for a shared vocabulary and the standardization of terminology, particularly in relation to the concepts of digital inclusion and information literacy.

Digital Inclusion

Very few journal articles defined or attempted to describe or explain the concept of digital inclusion. Indeed, not all journal articles specifically included the phrase "digital inclusion," but many were clearly focused on research in relation to digital inclusion activities, using alternative phrases such as adoption of the Internet and ICT access. Jaeger et al. (2012) define digital inclusion as "the policy developed to close the digital divide" and to "promote digital literacy through outreach to unserved and underserved populations" (p. 3). Thompson et al. (2016) state that digital inclusion is a key component of modern social justice, as "the ability of the individual to participate fully in society is increasingly tied to the ability to access and to use digital technologies in a meaningful way for social, political, and economic participation" (p. 93). Hashim et al. (2012) propose that digital inclusion encompasses three areas: access, technology literacy, and content services. According to Rashid (2016), digital inclusion focuses not just on levels of access to ICTs, but also on factors such as motivation, knowledge, and skills that enable individuals to engage meaningfully with technology and online information.

Information Literacy

Journal articles related to information literacy sometimes included a definition or clarification of the concept, such as the Association of College and Research Libraries' Information Literacy Competency Standards for Higher Education (Dorner & Gorman, 2011), the American Library Association and the Australian and New Zealand Information Literacy Framework (Williamson & Asla, 2009), or the 2005 Alexandria Proclamation on Information Literacy (Jacobs & Berg, 2011, p. 384). Further clarification of the concept of information literacy was provided by Martzoukou and Abdi (2017) within the context of everyday life, stating that information literacy "is regarded as an important condition for civic participation and engagement, informed citizenship, health and well-being" (p. 634). Drawing on theories from information science and new literacy studies, Papen (2013) presents a view of information literacy not primarily as a skill but as a social information practice. Papen argues that researchers studying information literacy need to look beyond people's abilities to search for and understand information; rather, they need to focus their attention on the contexts within which such information is used. As Yu et al. (2017) highlight, information literacy is about making sense of information found online that is relevant to an individual's circumstances and specific context, and they argue that "information literacy is an important factor in new ICT adoption and increased ICT usage" (p. 206). Information literacy is also clarified in relation to how it helps individuals and families make informed choices about their health and well-being, such as in articles referring to the concepts of health information literacy (HIL) (Martzoukou & Abdi, 2017) and everyday life health information literacy (EHIL) (Niemelä et al., 2012). HIL "is essential for making health decisions and is considered an important prerequisite for promoting and maintaining an individual's health" (Martzoukou & Abdi, 2017, p. 649) and for "engaging in an informed dialogue with healthcare professionals" (CILIP, 2018, p. 5).


Rurality

The issue of rurality was discussed within the journal articles, but with limited clarification of the actual meaning of the term. Despite the high levels of connectivity in developed countries and growing Internet access in developing countries, digital inclusion in rural areas remains a strong concern for policymakers (Correa & Pavez, 2016; Yueh et al., 2013). Indeed, despite many policymaking efforts that have promoted Internet connection in rural areas, the evidence suggests that digital inclusion is a multifaceted and complex phenomenon that is not "solved" after access is provided (Correa et al., 2017). Boulos et al. (2015) discuss how the higher costs associated with installing digital infrastructure for mobile and broadband in rural areas, compared to urban areas, can be overcome through the concept of "distributed cities," where small neighboring towns and villages (e.g., in the Scottish Highlands and Islands) unite and pool their resources to form a larger "distributed city" with improved economies of scale. Pavez et al. (2017) highlight the importance of understanding rurality and exploring how people from rural and geographically isolated contexts may experience digital connection differently from an urban perspective. This supports findings by Correa and Pavez (2016), which showed that remote rural communities face specific circumstances, such as a lack of economic resources, geographic isolation, an aging population, and the outmigration of young people, that need to be considered when thinking about digital inclusion initiatives for their particular context.

Approaches to Digital Inclusion Initiatives

The following section provides details of approaches to digital inclusion initiatives, following the subthemes of Differentiation of Digital Inclusion Initiatives; Examples of Digital Inclusion Initiatives Intended for Women; the Use of Mobile Technology in Digital Inclusion; and Digital Inclusion Frameworks, Measurements, and Evaluations.

Differentiation of Digital Inclusion Initiatives: Levels and Approaches

When describing approaches to digital inclusion initiatives, journal articles tend to use a macro- or micro-level perspective. Journal articles using a macro-level perspective take a top-down approach to describing digital inclusion and focus primarily on digital inclusion policy and agenda-setting issues on a national or international scale (Hughes et al., 2017; Shade, 2014; Martínez-Cantos, 2017). This compares with journal articles taking a micro-level perspective, which look at specific local or regional digital inclusion projects and case studies (Madon et al., 2009). Some journal articles initially provide a macro perspective and then provide an example of an initiative at the micro level (Berger & Croll, 2012; Broadbent & Papadopoulos, 2013).

Digital inclusion initiatives are also described in relation to their activities. For example, Armenta et al. (2012) differentiate between digital inclusion initiatives that take an access-driven/infrastructure approach and those that take a user-centric approach. Indeed, several journal articles acknowledge the debate that the provision of technology, infrastructure, and access alone is not enough to get people online (Correa et al., 2017; Freeman & Park, 2015; Haenssgen, 2018; Livingstone & Helsper, 2007). Haenssgen (2018) adds that the techno-centric focus in ICT4D has been criticized for neglecting the social embeddedness of technology, user behavior, and different forms of use, yet highlights that the discipline is gradually transitioning towards broader research on technological and social development that permits locally grounded conclusions. Armenta et al. (2012) provide an example of how a techno-centric approach in Mexico was not effective and lacked community participation. Correa and Pavez (2016) note similar findings when evaluating the experiences of individuals in rural communities in Chile that had benefitted from a public/private initiative called Todo Chile Comunicado (All Chile Connected), which provided subsidies for 3G wireless connections. They found a lack of motivation and a level of skepticism among the community participants in adopting the new mobile technologies, again confirming that physical access alone is not sufficient. Correa et al. (2017) highlight government top-down approaches to digital inclusion initiatives by discussing programs in Latin America targeting rural areas in Argentina, Bolivia, Brazil, Chile, and Colombia. Their research confirmed that most of these policymaking initiatives focused on the provision of infrastructure; yet while access to both devices and infrastructure connection cannot be dismissed as a logical first step, it does not necessarily entail Internet adoption, particularly in isolated, rural contexts. The researchers recommend that policymakers take into account the social, cultural, and economic context of the places where these initiatives are implemented.

In comparison, Madon et al. (2009) provide a micro-level analysis of three digital inclusion projects: the Akshaya e-literacy project in the state of Kerala in India, a community-based ICT project in South Africa, and a telecenter project in São Paulo in Brazil. The researchers describe how the projects changed significantly over time and demonstrate a complex mix of success and failure, and how, while the projects are unique in themselves, they also share common features:

• Enrolling government support
• Generating linkage to viable revenue streams
• Getting symbolic acceptance by the community
• Stimulating valuable social activity in relevant social groups

The Kerala project, for example, gained symbolic acceptance by the community by linking the e-literacy project to Kerala's development philosophy through grassroots campaigning, and stimulated valuable social activity in relevant social groups through the widespread participation of groups such as Muslim women, who are often among the socially excluded. Madon et al. (2009) argue that these successful common features are of relevance to digital inclusion projects, particularly in the developing world.

Examples of Digital Inclusion Initiatives Intended for Women

The main drivers behind most digital inclusion initiatives aimed at women are related to ensuring access, improving digital literacy, and working towards gender equality and the participation of women in the digital world (ITU, 2017a*). ITU's Gender Digital Inclusion Map (2017b*) provides a list of digital inclusion initiatives aimed at women, drawn from 97 countries around the world. In the grey literature, the report Development and access to information (IFLA & TASCHA, 2017*) has a specific focus on women and the need for meaningful access to information and information capabilities, and it provides examples of digital inclusion initiatives, mainly in public libraries. In Uganda, the National Library's digital skills training program is offered in local languages and is designed for female farmers. In Burkina Faso, the Girls' Mobile Health Clubs located in village libraries provide access to health information while providing information literacy and technology skills. In Chile, women, young adults, and low-income families receive preferential access to all BiblioRedes, Chile's national network of some 400 library-based infocentros, which offer free digital literacy classes. Additionally, governments have started to consolidate public-private collaboration with different organizations, driving initiatives that empower women through technology. Some examples are Intel's "She Will Connect" program in Kenya, Nigeria, and South Africa; Mexico's "Código X"; and India's "Internet Saathi" (IFLA & TASCHA, 2017*). In most cases these digital inclusion initiatives, through the use of technology, empowered women by ensuring that they have equal access to information and education, enabling them to gain knowledge and confidence and make informed decisions on issues such as family planning and health care. Chile's network of infocentros, designed to be women-friendly spaces, is an example of an initiative that has empowered women by combining a trusted, safe place with digital skills training, enabling them to develop knowledge and skills that they can use in their everyday lives. Importantly, this initiative has stepped away from the "macho culture found in Internet cafés," enabling women to talk and help each other and to get help from directors of the centers (often female), in a way not possible with men (IFLA & TASCHA, 2017*, p. 81). However, for any of this digital inclusion work to happen, social barriers such as cultural demands, illiteracy, and lack of access to education need to be overcome (IFLA & TASCHA, 2017*). The World Wide Web Foundation (2015*) supports this point, stating that "the Internet can support women in making informed choices about their bodies and health, but without adequate access to safe, legal and affordable sexual and reproductive health services and action against practices such as early marriage, these choices cannot be implemented" (p. 47).

As alluded to earlier, only a very small proportion of the journal articles sourced in the review—such as Freeman et al. (2016), Jiménez-Cortés et al. (2015), Martínez-Cantos (2017), Potnis (2015), Rashid (2016), and Rebollo and Vico (2014)—specifically related to digital inclusion initiatives aimed at women, with reference to health and well-being in rural communities. This highlights the limited amount of research on this topic and the potential for further research. The gender digital divide was clearly referenced in the literature and was particularly evidenced in case studies from the developing world and in rural areas (Ferreira et al., 2016; Rashid, 2016; Rebollo & Vico, 2014). These outlined the information experiences of women, particularly in relation to their access to and adoption of technology and the Internet and the barriers that they faced. A report by Intel (2013*) entitled Women and the Web reported that one in five women in India believes the Internet is "not appropriate" for them or useful, and that their families would disapprove. Yet positive aspects of being more connected included mothers noting that it supports their children with homework and education (Correa et al., 2016). Rashid (2016) states that research on gender and ICTs has for the most part been centered on the concept of the gender digital divide, particularly in relation to access to provision and the fact that proportionally more men than women use the Internet. However, other articles, such as Martínez-Cantos (2017), looked more towards gender differences in attitudes, self-efficacy, and the experiences of men and women in using computers and the Internet. Shade (2014) provides a critical overview of the changing digital inclusion agenda in Canada, describing how that country played an international role in promoting gender equity in access to the Internet. Yet in recent years, despite the persistent issue of digital exclusion, the government agenda of online gender equity has significantly diminished and there has been a gradual disinvestment in funding for programs for Internet access. As highlighted by Rashid (2016), to reduce the gender digital divide, digital inclusion policy interventions need to not only focus on the supply side of providing ICT equipment and connectivity infrastructure, but also include "a more nuanced understanding of the behavior and use of ICTs by women in meaningful ways to enable them to fulfil specific individual motivations and needs" (p. 327).

The Use of Mobile Technology in Digital Inclusion

The use of mobile technology was identified as a key element in digital inclusion activities in the review. In the grey literature, IFLA and TASCHA (2017*) confirm that "for the billions of people coming online for the first time, mobile phone and increasingly smartphones are their point of entry to the Internet" (p. 31). GSMA's report Bridging the gender gap: Mobile access and usage in low- and middle-income countries (2015*) and the report Development and access to information (IFLA & TASCHA, 2017*) both provide insights into the use of mobile technology by women and its impact on digital inclusion. Although not specifically focused on women, the UK Ofcom report "Smartphone by default" internet users (2016*) provides further insight into the use and behavior of individuals whose only access to the Internet is via a smartphone, and the implications this has for the user experience and digital inclusion. For example, completing online forms (for government services) and creating and editing a document (such as a CV) via a mobile phone were cited as being particularly challenging. In the Good Things Foundation's Library Digital Inclusion Fund Action Research project evaluation, the use of mobile technology was a key enabler for research participants in getting online through public library WiFi (Good Things Foundation, 2016c*).

The use of mobile technology was also referenced in the academic literature. Correa et al. (2016) found that despite not being able to get good service, many people from Chilean rural communities purchased mobile phones to use when they travelled outside their village. Haenssgen's (2018) study in rural India argued that households without mobile phones are increasingly disadvantaged in their health care access, stating that "phone diffusion leads [healthcare] providers to expect health-related phone use among the population" (p. 371). Rashid (2016), based on research in developing countries, found that although women rely less on computers and the Internet, they are more likely than men to use mobile phones. Yet Potnis's (2016) research on rural women in India highlighted that women often spoke about rumors and gossip about how mobile phones can cause health problems, thus deterring them from adopting and using mobile phones. Focus group discussions in research by Pavez et al. (2017) also revealed negative perceptions, with the Internet and mobile devices viewed as intrusive and disruptive to participants' way of life, and participants referring to the "adverse and harmful consequences attributed to the Internet, including addiction and isolation" (p. 17). Haenssgen (2018) also states that mobile technology has become so pervasive in some domains of Western urban life that everyone is simply expected to use it so as not to inconvenience others. Yet as stated by Freeman et al. (2016), not everyone has the access, motivation, or skills to use online services, and many rural regions struggle with slow or unreliable broadband and mobile phone connectivity.

Digital Inclusion Frameworks, Measurements, and Evaluations

Only a limited number of articles focused on the actual process of measuring or evaluating the success and outcomes of digital inclusion initiatives, highlighting a lack of both underpinning theory and evaluation procedures to guide digital inclusion research. Smith (2015) provides a conceptual framework for analyzing the success of digital inclusion projects, and Madon et al. (2009) identify three crucial factors that must be considered when planning digital inclusion initiatives: the value, sustainability, and scalability of the project. Armenta et al. (2012) provide a seven-stage framework for rural, underserved, and less-privileged populations: (1) identification and evaluation of the regional socioeconomic condition, (2) assessment of external factors that impact the region's sustainable development, (3) identification of the ICTs most favorable to supporting sustainable development, (4) analysis of the financial viability of ICT infrastructure and operations deployment, (5) development and implementation of a technology adoption and training program, (6) development and implementation of an ICT application focused on the regional sustainable development needs, and (7) evaluation of the project. The work of Boulos et al. (2015) related to digital inclusion provides well-being measures calculated through the Organization for Economic Co-operation and Development (OECD) Better Life Index for the 34 OECD member countries, and the related OECD Regional Well-Being "How's life where you are?" tool that covers 362 OECD regions. In addition, digital inclusion research by Jones et al. (2015) includes Tennant's Short Warwick-Edinburgh Mental Well-Being Scale (SWEMWBS) to measure well-being.

The grey literature contains examples of "outcomes-based evaluation" of digital inclusion initiatives, often in the form of a logic model used as an evaluation and communication tool. According to Rhinesmith and Siefer (2017*), this method is useful for communicating the goals and the "theory of change" underlying the work of digital inclusion initiatives to funders. The grey literature also included two frameworks to measure the level of people's digital skills. The UK Essential Digital Skills Framework (Tech Partnership, 2018*) includes five categories of essential digital skills for life and work: communicating, handling information and content, transacting, problem solving, and being safe and legal online. The European Commission's Digital Competence Framework for Citizens 2.1 (Carretero et al., 2017*) includes five competence areas: information and data literacy, communication and collaboration, digital content creation, safety, and problem solving. Both frameworks have been updated to remain relevant. Using the UK Essential Digital Skills Framework (Tech Partnership, 2018*) measures, the Lloyds Bank Digital Index reported that there is a small but increasing digital skills gap between men and women in the United Kingdom (Lloyds Bank, 2017*). Rashid's (2016) research on gender differences in ICT access and use in five developing countries also involves the development of a digital inclusion index. Based on five broad categories—skills, attitude, frequency of use, location of use, and breadth of use—Rashid developed the index specifically to challenge a commonly held assumption in the discourse on technology and gender that "compared to men women are more likely to be lacking in digital competencies" (p. 326).
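To make the idea of a composite digital inclusion index concrete, the minimal sketch below scores the five broad categories Rashid (2016) names. The equal weighting, the 0–1 scale, and the sample respondent scores are assumptions for illustration only; the chapter does not describe how Rashid's index is actually constructed or weighted.

```python
# Hypothetical illustration of a composite digital inclusion index built from the
# five broad categories named by Rashid (2016). Equal weights and the sample
# scores below are assumptions; the actual index construction is not given here.

CATEGORIES = ["skills", "attitude", "frequency_of_use",
              "location_of_use", "breadth_of_use"]

def inclusion_index(scores: dict[str, float]) -> float:
    """Average the five category scores (each assumed to lie on a 0-1 scale)."""
    return sum(scores[c] for c in CATEGORIES) / len(CATEGORIES)

# Invented example respondents, used only to show how such an index would let
# researchers compare groups empirically rather than rely on assumptions about
# who lacks digital competencies.
respondent_a = {"skills": 0.6, "attitude": 0.8, "frequency_of_use": 0.7,
                "location_of_use": 0.5, "breadth_of_use": 0.6}
respondent_b = {"skills": 0.7, "attitude": 0.6, "frequency_of_use": 0.5,
                "location_of_use": 0.4, "breadth_of_use": 0.5}

print(round(inclusion_index(respondent_a), 2))  # 0.64
print(round(inclusion_index(respondent_b), 2))  # 0.54
```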

Digital Inclusion Training

The review identified digital skills training as an important aspect of digital inclusion and the effective use of ICT (Hughes et al., 2017; Yueh et al., 2013). For example, Martínez-Cantos (2017) notes that the EU, in line with academic research and other political institutions around the world, "considers that digital literacy and associated competences play a key role in the development of the Information Society, and is becoming a priority in initiatives for social inclusion and human capital" (p. 420). As stated by Ferreira et al. (2016), "users need to feel capable of using ICT administered through training classes and peer support to overcome lack of experience and to encourage participation" (p. 39). References were made to training and interventions using varying terminology such as digital literacy or digital skills (Martínez-Cantos, 2017) or other inter-related terms such as digital competence (Hatlevik et al., 2015), digital capabilities (Britz et al., 2012), online skills (Zhou & Purushothaman, 2015), Internet literacy (Livingstone & Helsper, 2007), Internet skills (Van Deursen, 2012), computer literacy (Hart et al., 2004), and information literacy (Yu et al., 2017). However, in general few explanations are provided about each of these terms, leaving the reader unclear about the meaning of such terminology.

Only a small fraction of the studies linked the concepts of digital inclusion and information literacy. For example, the research of Yu et al. (2017) on understanding factors influencing ICT adoption behavior found that when a digital divide exists, it is important to keep investing in information literacy development activities for rural communities to help them develop their ICT competence. Wyatt et al. (2005) extend this point by clarifying that while there needs to be an ability to find and make sense of information found online, it is also important to have "the ability to make sense of generic information that is relevant to one's own circumstances" (p. 213).

Approaches to digital skills training for digital inclusion are also discussed. Pischetola (2011) emphasizes the need for investment in education and training in schools to use the ICT infrastructure and enhance learning. Berger and Croll (2012), in their work on training in basic Internet skills for special target groups in non-formal educational settings, discuss the trainer/learner relationship and the importance of trust. The researchers highlight a successful intervention in Germany where a female teacher was appointed for a group of female learners to prevent them from feeling intimidated and to help create an open learning atmosphere where any question could be raised without embarrassment. Madon et al.'s (2009) research confirms the importance of this approach, highlighting how a digital inclusion project in Mpumalanga, South Africa failed for a number of reasons, including that the trainers were outsiders whose motives were often suspected. While the review identified the importance of digital skills training and provided details of specific approaches, there appeared to be a lack of depth in relation to what was actually being taught and how, thus providing another opportunity for further research.

Digital Inclusion, Information Literacy, Health, and Well-Being

The health and well-being benefits of digital inclusion initiatives received few mentions in the literature (Ferreira et al., 2016; Park, 2017; Rashid, 2016), and the articles that did mention them did not always relate specifically to women in rural areas or provide specific examples of how health and well-being benefits are gained through digital inclusion initiatives. For example, Broadbent and Papadopoulos (2013) found that participants reported some improvement in their sense of well-being attributed to the provision of ICT, citing connecting with relatives, reading news in their own language, and getting access to online services as important conduits to improved health and well-being. Other journal articles referred specifically to health practices. For instance, Freeman et al. (2016) state how poor connectivity inhibits basic health practices, such as contact between patients, physicians, and colleagues, and how rural health services would benefit enormously from effective mobile and Internet services, particularly to communicate with their patients. Hart et al. (2004) highlight how the use of the Internet can increase patients' knowledge about their health conditions, although patients in their study were often too overwhelmed by the information available on the Internet to make an informed decision about their own care.

In the grey literature, as part of their evaluation of the NHS Widening Digital Participation programme, Good Things Foundation (2016a*) stated that there is "a huge crossover between those who are digitally excluded, and those at risk of poor health" (p. 8). Although not specifically aimed at women or rural areas, the project was set up to help improve the digital health skills of people in hard-to-reach communities. Similarly, the English My Way project, also evaluated by Good Things Foundation and designed to help people gain English language skills through a blend of digital tools and face-to-face training sessions, found that participants gained health and well-being benefits (Good Things Foundation, 2016b*). Both projects depended on a network of hyperlocal community organizations and agents who were able to reach out to hard-to-reach communities. Deloitte's (2014*) report highlights how an empirical study undertaken in rural villages in India to analyze the impact of Internet access on child mortality rates found that villages with Internet access that "provided specific online health information to women during and after pregnancy had 14% lower child mortality rates than villages without the Internet" (p. 19).

As referred to earlier, information literacy is important for health and well-being (Martzoukou & Abdi, 2017) and for people's adoption of the Internet (Yu et al., 2017). Williamson and Asla (2009) state that information literacy is crucial to the well-being of people in the "fourth age" (a stage of increasing dependence and disability, for those aged 85+). Martzoukou and Abdi's (2017) work on information literacy in everyday life makes specific reference to the significant role information literacy can play in both the physical and psychological well-being of women. This is particularly the case in critical life situations, for example during pregnancy and childbirth, where the way in which women evaluate different sources of information can have a significant impact. Adekannbi and Adeniran's (2017) work on the information literacy of women in rural communities in Nigeria discovered that women had limited, basic knowledge of family planning and that the acquisition of information on family planning was accidental, as a majority of research participants did not have access to health centers.


Discussion

The review highlights a number of specific tensions and contradictions in relation to digital inclusion initiatives, definitions, and the relationship with public policy.

Vague and Inconsistent Terminology

Very few journal articles defined or attempted to describe or explain the concept of digital inclusion, which, as evidenced by this review, has led to ambiguities in the understanding and meaning of digital inclusion in academic research. Further confirmation of this tension was revealed by practitioners, funders, policymakers, and other key digital inclusion stakeholders at the 2016 Net Inclusion Summit, who identified the lack of a shared vocabulary in defining digital inclusion (Rhinesmith & Siefer, 2017*). Jaeger et al. (2012) neatly sum up the consequence of this tension, stating that "it is a challenge to solve a problem you cannot define, and the inconsistency of definitions has affected policymaking processes that have attempted to address these issues" (p. 4).

Relations between Information Literacy and Digital Inclusion

Similarly, the review identifies tensions concerning information literacy and how it relates to digital inclusion. For example, information literacy, despite its association with critical thinking skills (Bingham et al., 2016) and its clear relevance to digital literacy and digital inclusion (Adhikari et al., 2016; Turkalj et al., 2015), continues to be overlooked in digital inclusion policy and practice. This is also confirmed by the lack of linkages found between digital inclusion and information literacy in the review (Wyatt et al., 2005; Yu et al., 2017), as highlighted earlier. The reason for this is partly explained in the work by Jaeger et al. (2012), which explores the inter-relationships between digital literacy, digital inclusion, and public policy, and the fragmented nature of research in this area. They highlight that "while the terms digital divide and digital literacy have entered into common usage, the term digital inclusion is still in its infancy" (p. 3). This suggests that the use of digital inclusion as a term may grow over the forthcoming years, thus providing future opportunities to reveal linkages with information literacy.

Another explanation for the lack of linkages found between digital inclusion and information literacy within the review, and a further tension, as alluded to earlier, is that researchers in other fields use varying terminology to describe information literacy and digital inclusion concepts. For example, a small selection of authors, including Britz et al. (2012), referred to the application of Amartya Sen's Capability Approach in relation to an information-based rights framework, in which an individual's ability to use information is influenced by their relative capabilities. While this approach displays similarities with the concepts of information literacy and digital inclusion, it also highlights the need for a shared vocabulary within digital inclusion research to reduce ambiguities and the fragmentation of the research landscape.

Differences between Developing and Developed Country Contexts

Contradictions were also revealed within the review. For example, a clear split was identified between digital inclusion initiatives in developing countries and those in developed countries, which were often discussed in contradictory terms. Research in developed countries tended to make a number of assumptions in relation to access, knowledge, and skills. For example, Whitney et al. (2011) ascertained through their research in five European countries that there is increasingly an assumption that people should be able to participate in a wide range of formal activities such as eGovernment, eHealth, and eEducation via their computers and mobile phones. Research in developing countries, however, tended to be more about access and infrastructure; how access does not necessarily entail Internet adoption, particularly in isolated contexts; and how digital inclusion needs the support of reliable broadband and electricity (Correa et al., 2017; Pavez et al., 2017; Potnis, 2015).

Contradictions were also highlighted in relation to digital inclusion in public libraries in developed countries. For example, Jaeger et al. (2012) state that libraries report across-the-board increases in the use of their public-access technologies, Wi-Fi, training classes, and online resources. Indeed, Real et al. (2014) state that public libraries—and rural public libraries in particular—are still the primary source of broadband access for many, highlighting the importance of public libraries for digital inclusion activities. Yet as highlighted by Fourie and Meyer (2016), Jaeger et al. (2012), and Real et al. (2014), this increase in use has occurred concurrently with dramatic decreases in library budgets, government support, and well-trained staff.

Complexity of and Theoretical Approaches toward Initiatives

Another major insight identified from the review is the tension regarding the need to better understand the complexity of digital inclusion initiatives (Madon et al., 2009). For example, only a small number of journal articles, as noted earlier, contain an underpinning theory to guide the research and attempt to unpick the complexity of digital inclusion projects. This in turn has led to clear gaps in digital inclusion research, such as the lack of insight into the content of digital skills training, leaving scope for criticism but also providing opportunities for future research in this area.

Conclusion

This review provides a number of contributions to the existing literature on digital inclusion and information literacy. First, while the review confirms that there is a global gender digital divide in which women lack access to information and digital skills, particularly in rural areas, there is limited research on the role of digital inclusion in women's health and well-being in rural communities. Second, the review identifies that digital inclusion initiatives are attempting to close the digital divide by providing infrastructure and access to digital technologies and by building capabilities and skills in how to use such technologies and online information, and that mobile technology is playing an increasing role in digital inclusion initiatives. Third, from the limited research that does exist, the review confirms that digital inclusion has the potential to contribute to the improvement of women's health and well-being in rural communities and that information literacy can play a key role in digital inclusion. Fourth, the review confirms that digital inclusion is a complex area of enquiry, and that digital inclusion research appears fragmented and requires more depth (particularly in relation to terminology, digital skills training, linkages with information literacy, and the use of theory). Indeed, the inclusion of some grey literature was essential in the review in order to provide further understanding, richness, and currency. Finally, the review reveals that significant tensions and contradictions exist within digital inclusion practice and policy.

The review does come with limitations. It was restricted to two databases and a selection of grey literature, and so is by no means exhaustive. The exclusion of books and conference papers rendered the search more manageable, as did the omission of the phrase "digital divide" from the search terms, which, if included, would have produced a far greater number of articles but perhaps with less specific relevance.

The identification of such issues in the literature, together with the limitations of this study, helps identify a future research agenda. First, there is a need for further systematic reviews of the research topic across more databases and grey literature, with the inclusion of a greater number of search terms and phrases. Second, there is opportunity for further research, particularly in relation to (1) the processes and mechanisms of digital inclusion initiatives, (2) digital skills training for digital inclusion in which the concepts of information literacy and digital inclusion are brought together, and (3) the experiences of women who have benefitted from digital inclusion initiatives. Finally, there is scope to incorporate more underpinning theory in digital inclusion research, to make sense of this complex area of enquiry and to provide a deeper foundation both for shaping research in this area and for understanding and evaluating its processes and results.


References Not in Review Database

Anderson, A., & Johnston, B. (2016). From information literacy to social epistemology: Insights from psychology. Cambridge, UK: Chandos Publishing.
Andretta, S. (2005). Information literacy: A practitioner's guide. Oxford, UK: Chandos Publishing.
Bach, A., Shaffer, G., & Wolfson, T. (2013). Digital human capital: Developing a framework for understanding the economic impact of digital exclusion in low-income communities. Journal of Information Policy, 3, 247–266.
Bruce, C. (2000). Information literacy research: Dimensions of the emerging collective consciousness. Australian Academic & Research Libraries, 31(2), 91–109.
Burke, M. (2010). Overcoming challenges of the technological age by teaching information literacy skills. Community & Junior College Libraries, 16(4), 247–254.
Corrall, S. (2008). Information literacy strategy development in higher education: An exploratory study. International Journal of Information Management, 28(1), 26–37.
Dunn, H. S. (2013). Information literacy and the digital divide: Challenging e-exclusion in the Global South. In Information Resources Management Association, Digital literacy: Concepts, methodologies, tools, and applications (pp. 20–38). Hershey, PA: IGI Global.
Hepworth, M., & Walton, G. (2013). Developing people's information capabilities: Fostering information literacy in educational, workplace and community contexts. Bingley, UK: Emerald Publishing.
Kuhlthau, C. (1993). A principle of uncertainty for information seeking. Journal of Documentation, 49, 339–355.
Lloyd, A. (2010). Framing information literacy as information practice: Site ontology and practice theory. Journal of Documentation, 66(2), 245–258.
McGillivray, D., Jenkins, N., & Mamattah, S. (2017). Rapid review of evidence for basic digital skills. School of Media, Culture & Society, University of the West of Scotland, Ayr, Scotland. https://digitalparticipation.storage.googleapis.com/reports/Tackling_Digital_Exclusion_Literature_Review.pdf
Meijer, A., & Bekkers, V. (2015). A metatheory of e-government: Creating some order in a fragmented research field. Government Information Quarterly, 32, 237–245.
Mervyn, K., Simon, A., & Allen, D. K. (2014). Digital inclusion and social inclusion: A tale of two cities. Information, Communication & Society, 17(9), 1086–1104.
NDIA (2017). Definitions. National Digital Inclusion Alliance. https://www.digitalinclusion.org/definitions/
Nemer, D. (2015). From digital divide to digital inclusion and beyond: A positional review. Journal of Community Informatics, 11(1). http://ci-journal.org/index.php/ciej/article/view/1030
Rhinesmith, C. (2016). Digital inclusion and meaningful broadband adoption initiatives. Evanston, IL: Benton Foundation. https://www.benton.org/publications/digital-inclusion-and-meaningful-broadband-adoption-initiatives
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York: Free Press.
Secker, J., & Coonan, E. (2013). Rethinking information literacy: A practical framework for supporting learning. London: Facet.
Secker, J. (2018). The revised CILIP definition of information literacy. Journal of Information Literacy, 12(1), 156–158.
Van Dijk, J. A. G. M. (2005). The deepening divide: Inequality in the information society. Thousand Oaks, CA: Sage Publications.

Appendix: Publications Analyzed: Journal Articles (N = 66) and Grey Literature (N = 16)

* Grey literature
** Journal articles not cited in the text

Adekannbi, J. O., & Adeniran, O. M. (2017). Information literacy of women on family planning in rural communities of Oyo State Nigeria. Information Development, 33, 351–360. doi:10.1177/0266666916661387
Adhikari, J., Mathrani, A., & Scogings, C. (2016). Bring your own devices classroom: Exploring the issue of digital divide in the teaching and learning contexts. Interactive Technology and Smart Education, 13(4), 323–343. doi:10.1108/ITSE-04-2016-0007
Aires, L. (2014). From dissemination to the domestication of digital technologies in rural communities: Narratives of parents and teachers. Mind, Culture, and Activity, 21(4), 337–352. doi:10.1080/10749039.2014.947654
Armenta, A., Serrano, A., Cabrera, M., & Conte, R. (2012). The new digital divide: The confluence of broadband penetration, sustainable development, technology adoption and community participation. Information Technology for Development, 18, 345–353. doi:10.1080/02681102.2011.625925
Berger, A., & Croll, J. (2012). Training in basic Internet skills for special target groups in non-formal educational settings: Conclusions from three pilot projects. Research in Learning Technology, 20(4). https://journal.alt.ac.uk/index.php/rlt/article/view/1330
Bingham, T. J., Wirjapranata, J., & Chinnery, S. (2016). Merging information literacy and evidence-based practice for social work students. New Library World, 117(3/4), 201–213.
Blackstock, O. J., Cunningham, C. O., Haughton, L. J., Garner, R. Y., Norwood, C., & Horvath, K. J. (2016). Higher eHealth literacy is associated with HIV risk behaviors among HIV-infected women who use the Internet. Journal of the Association of Nurses in AIDS Care, 27(1), 102–108. **
Boulos, K., Tsouros, A. D., & Holopainen, A. (2015). Social, innovative and smart cities are happy and resilient: Insights from the WHO EURO 2014 International Healthy Cities Conference. International Journal of Health Geographics, 14(3). https://ij-healthgeographics.biomedcentral.com/articles/10.1186/1476-072X-14-3
Britz, J., Hoffmann, A., Ponelis, S., Zimmer, M., & Lor, P. (2012). On considering the application of Amartya Sen's capability approach to an information-based rights framework. Information Development, 29(2), 106–113. doi:10.1177/0266666912454025
Broadbent, R., & Papadopoulos, T. (2013). Impact and benefits of digital inclusion for social housing residents. Community Development, 44(1), 55–67.
Carretero, S., Vuorikari, R., & Punie, Y. (2017). DigComp 2.1: The Digital Competence Framework for Citizens with eight proficiency levels and examples of use. EUR 28558 EN. doi:10.2760/38842. https://ec.europa.eu/jrc/en/publication/eur-scientific-and-technical-research-reports/digcomp-21-digital-competence-framework-citizens-eight-proficiency-levels-and-examples-use *
Chiu, C. J., & Liu, C. W. (2017). Understanding older adults' technology adoption and withdrawal for elderly care and education: Mixed method analysis from national survey. Journal of Medical Internet Research, 19(11), e374.
CILIP (2018). What is information literacy. Library and Information Professionals Association. http://www.cilip.org.uk/?page=informationliteracy *

Correa, T., & Pavez, I. (2016). Digital inclusion in rural areas: A qualitative exploration of challenges faced by people from isolated communities. Journal of Computer-Mediated Communication, 21, 247–263. doi:10.1111/jcc4.12154
Correa, T., Pavez, I., & Contreras, J. (2017). Beyond access: A relational and resource-based model of household Internet adoption in isolated communities. Telecommunications Policy, 41, 757–768.
Crawford, J., & Irving, C. (2007). Information literacy: The link between secondary and tertiary education project and its wider implications. Journal of Librarianship and Information Science, 39(1), 17–26. doi:10.1177/0961000607074812 **
Deloitte (2014). Value of connectivity: Economic and social benefits of expanding internet access. https://www2.deloitte.com/content/dam/Deloitte/ie/Documents/TechnologyMediaCommunications/2014_uk_tmt_value_of_connectivity_deloitte_ireland.pdf *
Dorner, D. G., & Gorman, G. E. (2011). Contextual factors affecting learning in Laos and the implications for information literacy education. Information Research, 16, 1–23.
Enwald, H., Hirvonen, N., Huotari, M. L., Korpelainen, R., Pyky, R., Savolainen, M., & Niemelä, R. (2016). Everyday health information literacy among young men compared with adults with high risk for metabolic syndrome: A cross-sectional population-based study. Journal of Information Science, 42(3), 344–355. https://doi.org/10.1177/0165551516628449
Eynon, R., & Helsper, E. (2015). Family dynamics and Internet use in Britain: What role do children play in adults' engagement with the Internet? Information, Communication & Society, 18(2), 156–171. doi:10.1080/1369118X.2014.942344 **
Ferreira, S. M., Sayago, S., & Blat, J. (2016). Going beyond telecenters to foster the digital inclusion of older people in Brazil: Lessons learned from a rapid ethnographical study. Information Technology for Development, 22(sup1), 26–46. doi:10.1080/02681102.2015.1091974
Fotopoulou, A. (2016). Digital and networked by default? Women's organisations and the social imaginary of networked feminism. New Media & Society, 18(6), 989–1005. **
Fourie, I., & Meyer, A. (2016). Role of libraries in developing an informed and educated nation. Library Hi Tech, 34, 422–432. doi:10.1108/LHT-01-2016-0009
Freeman, J., & Park, S. (2015). Rural realities: Digital communication challenges for rural Australian local governments. Transforming Government: People, Process and Policy, 9(4), 465–479.
Freeman, J., Park, S., Middleton, C., & Allen, M. (2016). The importance of broadband for socio-economic development: A perspective from rural Australia. Australasian Journal of Information Systems, 20, 1–18.
Gerli, P., Wainwright, D., & Whalley, J. (2017). Infrastructure investment on the margins of the market: The role of niche infrastructure providers in the UK. Telecommunications Policy, 41, 743–756. **
Good Things Foundation. (2016a). Health & digital: Reducing inequalities, improving society: An evaluation of the Widening Digital Participation programme. https://www.goodthingsfoundation.org/sites/default/files/research-publications/improving_digital_health_skills_report_2016_1.pdf *
Good Things Foundation. (2016b). English My Way Phase 2 Evaluation: Final evaluation report. https://www.goodthingsfoundation.org/sites/default/files/research-publications/emw-phase-2-evaluation-report_-_rev_a.pdf *
Good Things Foundation. (2016c). Library Digital Inclusion Fund Action Research Project evaluation final report. https://www.goodthingsfoundation.org/sites/default/files/research-publications/library_digital_inclusion_fund_action_research_project_final_report.pdf *

GSMA. (2015). Bridging the gender gap: Mobile access and usage in low- and middle-income countries. https://www.gsma.com/mobilefordevelopment/wp-content/uploads/2016/02/GSM0001_03232015_GSMAReport_NEWGRAYS-Web.pdf *
Haenssgen, M. J. (2018). The struggle for digital inclusion: Phones, healthcare, and marginalisation in rural India. World Development, 104, 358–374.
Hart, A., Henwood, F., & Wyatt, S. (2004). The role of the Internet in patient-practitioner relationships: Findings from a qualitative research study. Journal of Medical Internet Research, 6(3), 1.
Hashim, R., Kartika, S. I., Ustadi, Y. A., Merican, F. M., & Fuzi, S. F. S. M. (2012). Digital inclusion and lifestyle transformation among the Orang Asli: Sacrificing culture for modernity? Asian Social Science, 8(12), 80–87.
Hatlevik, O. E., Ottestad, G., & Throndsen, I. (2015). Predictors of digital competence in 7th grade: A multilevel analysis paper. Journal of Computer Assisted Learning, 31, 220–231. doi:10.1111/jcal.12065
Hughes, H., Wolf, R., & Foth, M. (2017). Informed digital learning through social living labs as participatory methodology: The case of Food Rescue Townsville. Information and Learning Science, 118(9/10), 518–534.
IFLA & TASCHA. (2017). Development and access to information. International Federation of Library Associations and Institutions, and Technology & Social Change Group, University of Washington. https://da2i.ifla.org/sites/da2i.ifla.org/files/uploads/docs/da2i2017-full-report.pdf *
Intel. (2013). Women and the Web. https://www.intel.com/content/dam/www/public/us/en/documents/pdf/women-and-the-web.pdf *
ITU. (2017a). ICT facts and figures. International Telecommunication Union. http://www.itu.int/en/ITU-D/Statistics/Documents/facts/ICTFactsFigures2017.pdf *
ITU. (2017b). Gender digital inclusion map. International Telecommunication Union. https://www.itu.int/en/action/gender-equality/Pages/equalsGDImap.aspx *
Jacobs, H. L. M., & Berg, S. (2011). Reconnecting information literacy policy with the core values of librarianship. Library Trends, 60(2), 383–394.
Jaeger, P. T., Bertot, J. C., Thompson, K. M., Katz, S. M., & DeCoster, E. J. (2012). The intersection of public policy and public access: Digital divides, digital literacy, digital inclusion, and public libraries. Public Library Quarterly, 31(1), 1–20. doi:10.1080/01616846.2012.654728
Jiménez-Cortés, M., Rebollo-Catalán, M. A., García-Pérez, R., & Buzón-García, O. (2015). Social network user motivation: An analysis of rural women's profiles. RELIEVE, 21(1), art. 2. doi:10.7203/relieve.21.1.5153
Jones, R. B., Ashurst, E. J., Atkey, J., & Duffy, B. (2015). Older people going online: Its value and before-after evaluation of volunteer support. Journal of Medical Internet Research, 17(5), 1. doi:10.2196/jmir.3943
Julien, H., & Breu, R. D. (2005). Instructional practices in Canadian public libraries. Library & Information Science Research, 27, 281–301. doi:10.1016/j.lisr.2005.04.002 **
Kapondera, S. K., & Hart, G. (2016). The use of multipurpose community telecentres and their services in Malawi: The case of Lupaso Community Telecentre. South African Journal of Libraries and Information Science, 82(1), 13–25. doi:10.7553/82-1-1589
Livingstone, S., & Helsper, E. (2007). Gradations in digital inclusion: Children, young people and the digital divide. New Media & Society, 9(4), 671–696.
Lloyds Bank (2017). Lloyds Bank consumer digital index. https://www.thetechpartnership.com/globalassets/pdfs/research-2017/consumerdigitalindex_mar17.pdf *

134   Sharon Wagg ET AL. Madon, S., Reinhard, N., Roode, D., & Walsham, G. (2009). Digital inclusion projects in developing countries: Processes of institutionalization. Information Technology for Development, 15, 95–107. doi:10.1002/itdj.20108 Martínez-Cantos, J. L. (2017). Digital skills gaps: A pending subject for gender digital inclusion in the European Union. European Journal of Communication, 32(5), 419–438. Martzoukou, K., & Abdi, E. S. (2017). Towards an everyday life information literacy mind-set: A review of literature. Journal of Documentation, 73(4), 634–665. Mehra, B. (2017). Mobilization of rural libraries toward political and economic change in the aftermath of the 2016 Presidential election. Library Quarterly: Information, Community, Policy, 87(4), 369–390. ** Niemelä, R., Ek, S., Eriksson- Backa, K., & Huotari, M. L. (2012). A screening tool for assessing everyday health information literacy. Libri, 62, 125–134. Ofcom (2016). “Smartphone by default” internet users.https://www.ofcom.org.uk/__data/ assets/pdf_file/0028/62929/smarphone_by_default_2016.pdf * Papen, U. (2013). Conceptualising information literacy as social practice: A study of pregnant women’s information practices. Information Research, 18(2), n2. Park, S. (2017). Digital inequalities in rural Australia: A double jeopardy of remoteness and social exclusion. Journal of Rural Studies, 54, 399–407. Pavez, I., Correa, T., & Contreras, J. (2017). Meanings of (dis)connection: Exploring non-users in isolated rural communities with internet access infrastructure. Poetics, 63, 11–21. Pischetola, M. (2011). Digital media and learning evolution: A research on sustainable local empowerment. Global Media Journal, 11(18), 1–14. Potnis, D. D. (2015). Beyond access to information: Understanding the use of information by poor female mobile users in rural India. Information Society, 31, 83–93. doi:10.1080/0197224 3.2014.976687 Rashid, A. T. (2016). Digital inclusion and social inequality: Gender differences in ICT access and use in five developing countries. Gender, Technology and Development, 20(3), 306–332. Real, B., Bertot, J. C., & Jaeger, P. T. (2014). Rural public libraries and digital inclusion: Issues and challenges. Information Technology and Libraries, 33(1), 6–24. doi:10.6017/ital.v33i1.5141 Rebollo, A. M, &.Vico, A. (2014). Perceived social support as a factor of rural women’s digital inclusion in online social networks. Comunicar, 22, 173–180. doi:10.3916/C43-2014–17 Renteria, C. (2015). How transformational mobile banking optimizes household expenditures: A case study from rural communities in Mexico. Information Technologies & International Development, 11, 39–54.** Rhinesmith, C. & Siefer, A. (2017). Digital inclusion: Outcomes-based evaluation. Evanston, IL: Benton Foundation.https://www.benton.org/publications/digital-inclusion-outcomesbased-evaluation * Roberts, E., Farrington, J., & Skerratt, S. (2015). Evaluating new digital technologies through a framework of resilience. Scottish Geographical Journal, 131(3–4), 253–264. Salinas, A., & Sánchez, J. (2009). Digital inclusion in Chile: Internet in rural schools. International Journal of Educational Development, 29, 573–582. doi:10.1016/j. ijedudev.2009.04.003 Sanchez, M. S. O., & Sanchez, M. R. F. (2017). Digital technologies and rural women’s entrepreneurship. Prisma Social, 18, 259–277. ** Shade, L.  R. (2014). Missing in action: Gender in Canada’s digital economy agenda. Signs: Journal of Women in Culture and Society, 39(4), 887–896.

Digital Inclusion and Women’s Health in Rural Communities   135 Singh, S. (2017). Bridging the gender digital divide in developing countries. Journal of Children and Media, 11(2), 245–247. doi:10.1080/17482798.2017.1305604 Smith, C. (2015). An analysis of digital inclusion projects: Three crucial factors and four key components. Journal of Information Technology Education: Research, 14(1), 179–188. Stevenson, S.  A., & Domsy, C. (2016). Redeploying public librarians to the front-lines: Prioritizing digital inclusion. Library Review, 65, 370–385. doi:10.1108/LR-02-2016-0015 ** Tech Partnership (2018). Basic digital skills framework consultation. https://www.thetechpartnership.com/basic-digital-skills/basic-digital-skills-framework/ * Thompson, K.  M., & Paul, A. (2016). “I am not sure how much it will be helpful for me”: Factors for digital inclusion among middle-class women in India. Library Quarterly: Information, Community, Policy, 86(1), 93–106. Turkalj, D., Bilos, A., & Kelic, I. (2015). Integration of Croatian farmers in the EU information society—Issues and implications. Ekonomski Vjesnik, 28, 41–52. doi:http://www.efos.unios. hr/ekonomski-vjesnik/en/table-of-contents-by-issue/ Van Deursen, A.  J.  A.  M. (2012). Internet skill-related problems in accessing online health information. International Journal of Medical Informatics, 81, 61–72. Wei, Z., Jiang, G., Niu, T., Zou, T., & Dong, E. (2013). A tale of two counties: How two school libraries in rural western China serve local needs. Library Trends, 62, 205–233. doi:10.1353/ lib.2013.0029 Whitney, G., Keith, S., Buhler, C., Hewer, S., Lhotska, L., Miesenberger, K., . . . Engelen, J. (2011). Twenty five years of training and education in ICT design for all and assistive technology. Technology and Disability, 23(3), 163–170. Williamson, K., & Asla, T. (2009). Information behavior of people in the fourth age: Implications for the conceptualization of information literacy. Library & Information Science Research, 31, 76–83. World Wide Web Foundation. (2015). Women’s Rights Online: Translating Access into Empowerment—Global Report October 2015. http://webfoundation.org/docs/2015/10/womensrights-online21102015.pdf * Wyatt, S., Henwood, F., Hart, A., & Smith, J. (2005). The digital divide, health information and everyday life. New Media & Society, 7, 199–218. doi:10.1177/1461444805050747 Yu, T.-K., Lin, M.-L., & Liao, Y.-K. (2017). Understanding factors influencing information communication technology adoption behavior: The moderators of information literacy and digital skills. Computers in Human Behavior, 71, 196–208. doi:10.1016/j.chb.2017.02.005 Yueh, H. P., Chen, T. L., Chiu, L. A., & Lin, W. C. (2013). Exploring factors affecting learner’s perception of learning information and communication technology: A HLM analysis of a national farmers’ training program in Taiwan. Educational Technology and Society, 16, 231–242. Zhou, C., & Purushothaman, A. (2015). The need to foster creativity and digital inclusion among women users in developing context: Addressing second order digital divide in online skills. iJET (International Journal of Emerging Technologies in Learning), 10(3), 69–74.

Chapter 6

Digital Technology for Older People: A Review of Recent Research

Helen Petrie and Jenny S. Darzentas

Introduction

One of the great challenges facing the world today is the aging of the population. The United Nations (2017) estimates that in 2017 there were 962 million people aged 60 or over worldwide, but that by 2050 there will be 2.1 billion people in that group, a rise from 12.7% to 21.5% of the population. Currently, Europe and Japan have the greatest percentage of population aged 60 or over (over 25%), but by 2050 all regions of the world except Africa will have nearly a quarter or more of their populations aged over 60. An important consequence of the aging population is that the ratio of people of working age to older people (the Potential Support Ratio, PSR) is declining. Thus, there will be fewer people of working age to care for and support the older population. Europe currently has a PSR of approximately 4 younger people for each older one, although many European countries have a PSR of less than 3, and Japan's is the lowest at 2.1 (United Nations, 2017).

Digital technologies are often presented as a major solution to this growing problem of providing care to older people. Increasingly, particularly in wealthier countries, it is expected that older people will be cared for and will care for themselves using digital technologies. However, the relationship between digital technologies and older people is rather more complex than many commentators suggest. First, old age spans from people in their 60s to people well over 100 years old. People currently in their 60s may have been using digital technologies for many years, whereas those much older may have little experience with these technologies. So acceptance of, and familiarity with, digital technologies may be very different for different cohorts of older people.

Second, we can think of two different ways of "being digital" for older people. The first way of "being digital" is their use of mainstream technologies: older people both need and want to be able to use the many mainstream digital technologies that have emerged in recent years and that are continuing to emerge. For example, automated teller machines (ATMs) are now the most common way of withdrawing cash from one's bank account. Twenty years ago, people over 60 rarely used ATMs (van Schaik, Petrie, & Kirby, 1995). But as these machines have become more and more common, and bank branches less common, everyone, including older people, needs to use them, whether they wish to or not. Consequently, banks and ATM manufacturers have had to consider the needs of older users, for example that very short time-outs may not be appropriate, and that text, button sizes, and colors need to be suitable for older eyes and fingers. In addition, older people want to use many mainstream digital technologies. For example, they often realize that the best way to communicate with their children and grandchildren is via email, Skype, or a social networking site (e.g., Sayago, Forbes, & Blat, 2012). The second way of "being digital" is the use of technologies specially developed to assist older people with problems they encounter in their daily lives. Such technologies provide many opportunities, from something as simple as an electronic pillbox that reminds older people to take their medicines, to complex systems that use the global positioning system of satellites (GPS) to assist older people in navigating unfamiliar environments (Petrie, Johnson, & Strothotte, 1997) and to monitor and locate older people with dementia, who may wander and become confused (Jönsson & Svensk, 1995). These digital technologies, too, must be acceptable and usable by older people.

This chapter presents a review of recent research on digital technologies for older people, highlighting research in both of these ways of "being digital". First, we present the scope of our review and an overview of the 16 different research topics that emerged from it. Then we consider four of the topics in more depth, three related to the first way of "being digital", that is, older people's use of mainstream digital technologies. These three emerged most frequently in our review, namely older people's (1) interaction with digital technologies, (2) lived experience of digital technologies, and (3) use of digital technologies for communication and social interaction. The fourth topic considered in depth relates to the second way of "being digital", that is, digital technologies specially developed to assist older people, with the specific topic being (4) monitoring their welfare. Finally, we reflect on some of the overarching themes that emerged from the research, some of the limitations of recent research, and further areas of research that need to be undertaken.
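As a brief aside on the arithmetic behind the Potential Support Ratio, the following sketch shows the simple calculation involved. The figures are hypothetical round numbers rather than the UN estimates cited above, and the exact age boundaries used for "working age" and "older" vary between reports.

```python
# Minimal sketch of the Potential Support Ratio (PSR): the number of people of
# working age per older person. The populations below are hypothetical round
# figures chosen only to illustrate the arithmetic.

def potential_support_ratio(working_age_population: float, older_population: float) -> float:
    """Return the number of working-age people per older person."""
    return working_age_population / older_population

# Example: 500 million people of working age supporting 125 million older people.
print(potential_support_ratio(500e6, 125e6))  # -> 4.0, roughly the current European level
```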

Scope of the Review

As part of an ongoing review of research on technology for disabled and older people (Petrie, Gallagher, & Darzentas, 2014), research about older people published in a selection of peer-reviewed conferences and journals between 2005 and 2017 was identified for this chapter. Conferences and journals were chosen that deal primarily with the design and evaluation of technologies with the target users, rather than with the technical implementation of the technologies. This means that papers should provide insight into what area of need or interest for older people is being addressed, how older people were involved in the research, and what outcomes were achieved. A range of mainstream outlets in human-computer interaction and human factors, as well as specialist outlets in gerontology and rehabilitation technology, were selected (see Table 6.1). Outlets were selected for inclusion based on their Impact Factor (Thomson Reuters, 2013); journals with the highest impact factors for their sector were chosen. The Australian Research Council's (2012) rankings of journals and conferences were also used in the decisions.

Table 6.1  Mainstream and Specialist Outlets Included in the Review

Mainstream outlets
  Journals:
    ACM Transactions on Computer-Human Interaction (ToCHI)
    Behaviour and Information Technology
    Human Computer Interaction
    Human Factors
    International Journal of Human-Computer Studies
  Conferences:
    ACM Conference on Human Factors in Computing Systems (CHI)
    British Computer Society Interaction Specialist Group Conference (BCS HCI)
    IFIP TC 13 Conference on Human-Computer Interaction (INTERACT)
    Nordic Conference on Human-Computer Interaction (NordiCHI)

Specialist outlets
  Journals:
    ACM Transactions on Accessible Computing (ToACCESS)
    Educational Gerontology
    Technology and Disability
    Universal Access in the Information Society
  Conferences:
    ACM Conference on Computers and Accessibility (ASSETS)
    International Conference on Computers Helping People with Special Needs (ICCHP)

Papers were included if they used words relevant to older people and technology in the title, abstract, or keywords. Terms included "older people", "older adults", or "elders" in the mainstream papers (which were by definition about technology, although this was checked), and in addition terms such as "computer/s" and "assistive technology" in the specialist papers. Table 6.2 provides the search terms and how they were used. Only papers published in English were included.

Table 6.2  Terms Related to Older People Used to Select Papers for Inclusion in the Review

Referring to older people in general: aging (ambiguous alone, only used in conjunction with other terms); aging population; elder/s; elderly (people); geriatric/s; grandparents; older adult/s; senior adult/s

Specific conditions related to aging: Alzheimer's; dementia; Parkinson's disease (if the emphasis is on the disabilities related to Parkinson's)

Technology (if the papers later included reference to older people): assistive technology/ies; cognitive prosthetic/s; web accessibility

There is no well-established definition of when old age begins, and therefore of the age at which people become "older" or "elderly". Typically, 60 or 65 years of age is used to indicate the starting point of old age in chronological terms, although it is well accepted that there are wide individual differences in the aging process. Therefore, we made no attempt to impose a definition of older people on the selection of papers; if a paper stated it was about older people, it was included in the review. This process identified 407 papers. See the appendix at the end of the chapter for the full list of references analyzed. An initial analysis, based on the area of need or interest of older people, rather than on the technology deployed, grouped the research into 16 topics (see Table 6.3).
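To make the screening procedure concrete, the sketch below shows how keyword-based inclusion checking over titles, abstracts, and keywords could be automated. It is purely illustrative and not a description of the tooling actually used in the review: the record fields, helper functions, and matching logic are our own assumptions, and the term lists simply mirror Table 6.2.

```python
# Illustrative sketch of keyword-based screening for the review's inclusion rule.
# Assumptions: papers are simple dictionaries with "title", "abstract", "keywords",
# "language", and "outlet_type" fields; the term lists mirror Table 6.2.

OLDER_PEOPLE_TERMS = [
    "aging population", "elder", "elderly", "geriatric", "grandparent",
    "older people", "older adult", "senior adult",
    "alzheimer", "dementia", "parkinson",
]
# Extra terms applied to papers from specialist (gerontology/rehabilitation) outlets,
# which are not technology venues by definition.
TECHNOLOGY_TERMS = ["computer", "assistive technology", "cognitive prosthetic", "web accessibility"]


def mentions(paper: dict, terms: list[str]) -> bool:
    """True if any term appears in the paper's title, abstract, or keywords."""
    text = " ".join([paper["title"], paper["abstract"], " ".join(paper["keywords"])]).lower()
    return any(term in text for term in terms)


def include_paper(paper: dict) -> bool:
    """Apply an inclusion rule of the kind described above to one paper record."""
    if paper["language"] != "en":
        return False
    if not mentions(paper, OLDER_PEOPLE_TERMS):
        return False
    if paper["outlet_type"] == "specialist" and not mentions(paper, TECHNOLOGY_TERMS):
        return False
    return True


sample = {
    "title": "Touchscreen text entry by older adults",
    "abstract": "We compare phone and tablet input.",
    "keywords": ["older adults", "touchscreen"],
    "language": "en",
    "outlet_type": "mainstream",
}
print(include_paper(sample))  # -> True
```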

Uses of Mainstream Technologies by and for Older People

Four of these 16 topics were chosen for more detailed exploration in the remainder of this chapter.

Topic 1: Older People's Interaction with Mainstream Digital Technologies

This topic investigated how older people physically interact with mainstream digital technologies, the problems they might encounter, and the different solutions that have been developed to overcome these problems. It was the most frequent topic in the research reviewed, with 90 papers addressing it.

Table 6.3  The 16 Topics of the Research in the Papers Reviewed

Topic (number of papers)

*Interaction with digital technologies: physical use of technologies, for example use of mouse, touchscreen, voice input (90)
*Older people's lived experience of digital technologies: adoption, attitudes, holistic experiences, abandonment of digital technologies (58)
*Communication and social interaction: using digital technologies to communicate with others and for social interaction, including use of email, social networking sites, online communities (52)
Methods for research with/about older people: including problems of using existing methods and methodological innovations (39)
Access and use of information: including access to and use of the Web, eBooks, eKiosks, and other digital forms of information (37)
Education and training (of older people and other stakeholders): education and training of older people (e.g., in computer skills), also education and training of others (e.g., doctors, nurses, carers, engineers) in working with older people (28)
*Monitoring older people's welfare: use of digital technologies to monitor older people's movements, vital signs, home environments (28)
Activities of everyday living: using digital technologies to support all kinds of everyday living activities, from writing a cheque to dispensing medicines (22)
Mobility and wayfinding: indoor and outdoor mobility and wayfinding, also older drivers and driving cessation (20)
Health and well-being: use of digital technologies to support all aspects of health and well-being (17)
Support for carers/others: use of digital technologies to support those who are supporting older people, including carers, family members, healthcare staff (14)
Games and leisure: digital technologies for leisure, including video games for older people, digital versions of traditional games (13)
Memory: digital technologies to support memory problems in older people (e.g., medication reminders, calendars and appointment alert systems) (9)
Exercise: digital technologies to support exercise in older people, for general fitness and rehabilitation exercise (8)
Understanding general user requirements for using digital technologies: studies investigating older people's acceptance and use across a range of digital technologies (7)
Rehabilitation: digital technologies to support rehabilitation programs for older people (6)

Notes: 1. * indicates topics discussed in detail in this chapter. 2. Papers often covered several topics, so the numbers are greater than the total of 407 publications.

This topic included four subtopics: physical interaction, spoken dialogue interaction, multimodality, and security.

Physical interaction. Physical interaction might include pointing and clicking with a mouse, or tapping on a touchscreen; research considered how these interaction styles might be adapted for older users. Hwang and colleagues (Hwang, Hollinworth, & Williams, 2013) proposed several ways in which items on a computer display, such as icons, can be expanded to make them easier for older people to select, and found that such techniques substantially improved selection time and reduced error rates (a generic sketch of this target-expansion idea appears at the end of this topic). Sayago and Blat (2008) found that older people were not concerned about how fast they could interact, but were very concerned about not making errors, so this second result is particularly important. Jochems, Vetter, and Schlick (2013) compared younger and older people in their use of mouse, touchscreen, and eye gaze control, and found that both groups were fastest with a touchscreen, particularly the older group. They also investigated combining eye gaze with input from a keyboard, speech input, or a foot pedal, finding that the best combination was eye gaze with keyboard confirmation.

In research on touchscreen interaction specifically, Apted, Kay, and Quigley (2006) introduced older people to interaction with a tabletop computer using a digital photograph sharing application. They found that older people coped well with tasks in this application, although they took longer to complete them than did younger people. They also understood the new interface elements, although initially they had some difficulty with one of the elements, a copy operation; this was overcome with further training. Lepicard and Vigouroux took a more experimental approach to touchscreen interaction, investigating one-hand versus two-hand interaction (Lepicard & Vigouroux, 2010) and single-touch versus multi-touch interaction (Lepicard & Vigouroux, 2012). In relation to the number of hands, older people were faster and more accurate when using one hand than when using two. In relation to multi-touch, both younger and older people had more difficulty with multi-touch interaction, but especially the older people. Nicolau and Jorge (2012) investigated text entry via a touchscreen, comparing mobile phone input with tablet computer input. Older participants made more errors on a mobile phone than on a tablet; input speed, but not accuracy, correlated with experience with the QWERTY keyboard. On the mobile phone in particular, the amount of hand tremor (a common problem for older people) correlated strongly with lower accuracy. Wulf, Garschall, Klein et al. (2014) investigated younger and older people's gesture performance on a tablet touchscreen, including dragging, pinching, and rotating. Older people were slower than younger people, but both groups were more accurate when the tablet was in portrait rather than landscape orientation. Finally, Muskens, van Lent, Vijfvinkel et al. (2014) designed touchscreen applications to be particularly usable by older people. They successfully eliminated problems with button size, navigation, readability of fonts, and gesture execution, and found that older people had strong preferences for designs with low numbers of icons, direct input, no deep hierarchies, large buttons with immediate feedback, clear notification that screens have changed, and bright colors.

Spoken dialogue interaction.
In terms of more natural interaction paradigms, only two papers reviewed investigated spoken dialogue interaction for older people. Wolters, Kilgour, MacPherson et al. (2015) explored a bottom-up approach to adapting spoken dialogue systems for older people. In an analysis of a corpus of spoken interactions between an intelligent computer agent and both younger and older people, they found two main groups of users, a factual group and a social group. The factual users adapted quickly to the dialogue system and interacted with it efficiently; the social users treated the system more like a human being and did not change their interaction style when it did not understand their requests. Almost all the social users were older, although about a third of older users were factual in style. The authors concluded that spoken dialogue systems need to adapt based on users' observed behavior, not age per se. Vacher, Caffiau, Portet et al. (2015) also found that older users were more inclined to treat a spoken dialogue system as a human being and were disturbed by the rigid grammars needed to use such systems.

Multimodality. Four papers explored multimodal aspects of interaction for older people. Carrasco, Epelde, Moreno et al. (2008) and Diaz-Orueta, Etxaniz, Gonzalez et al. (2014) studied the use of avatars for older people with Alzheimer's disease, using a TV screen to display the avatar and a TV remote control for input. Both studies found that older people readily understood this interaction metaphor and were able to interact successfully in simple dialogues. Nunes, Kerwin, and Silva (2012) also tested a TV platform for interaction but used text and icons rather than a visual avatar. Again, they found that older people could interact successfully with the system, although there were interesting usability problems, for example around understanding standard icons for the video player. On the basis of a number of evaluations, the authors produced a set of guidelines for TV-based applications for older people. Finally, Warnock, McGee-Lennon, and Brewster (2013) investigated using multimodal notifications for home care reminder systems for older people. There were no particular differences between younger and older people in their reactions to textual, pictographic, abstract visual, speech, sound, tactile, and olfactory notifications in the context of playing a game in a laboratory setting.

Security. A number of papers investigated security issues in interacting with digital technologies for older people. Renaud and Ramsay (2007) explored authentication mechanisms that would be easier for older users but equally secure, including recognition of handwritten numerals and doodles. Nicholson, Coventry, and Briggs (2013a) compared face-based and picture-based authentication systems, finding that older people performed better with the face-based authentication while younger people performed better with the picture-based authentication. In further work, Nicholson, Coventry, and Briggs (2013b) reported that older people performed better with age-appropriate faces.
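To make the target-expansion idea mentioned under physical interaction concrete, here is a minimal generic sketch of enlarging a target's effective hit area. It is not the specific set of techniques evaluated by Hwang, Hollinworth, and Williams (2013); the class, fields, and parameter are invented for illustration.

```python
# Illustrative sketch: enlarging the effective hit area of on-screen targets so that
# selections near an icon still register. This is a generic illustration, not the
# specific expansion techniques evaluated in the research cited above.
from dataclasses import dataclass


@dataclass
class Target:
    x: float       # left edge, pixels
    y: float       # top edge, pixels
    width: float
    height: float

    def hit(self, px: float, py: float, expansion: float = 0.0) -> bool:
        """True if (px, py) falls inside the target expanded by `expansion` pixels on every side."""
        return (self.x - expansion <= px <= self.x + self.width + expansion and
                self.y - expansion <= py <= self.y + self.height + expansion)


icon = Target(x=100, y=100, width=32, height=32)
print(icon.hit(135, 110))                 # False: just outside the 32x32 icon
print(icon.hit(135, 110, expansion=8.0))  # True: an 8-pixel expanded hit area catches the near miss
```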

Topic 2: Older People's Lived Experience of Digital Technologies

This topic addresses older people's lived experience of digital technologies. It includes research into how older people understand meaningful practices with technology, as well as issues such as quality of life, well-being, and aging-in-place, and how these might impact existing and future technologies. It was the second most frequent topic in the research reviewed, with 58 papers addressing it. The research reviewed often addressed older people's acceptance of, and motivation to use, digital technologies, including research about understanding values (Briggs & Thomas, 2015); how older people account for their difficulties in learning to use a computer (Turner, Turner, & van de Walle, 2007); older people's concerns about using mobile phones (Kurniawan, 2008); their emailing practices and the barriers to using email (Sayago & Blat, 2010); their use and sharing of YouTube videos (Sayago et al., 2012); their perceptions of telecare (Bentley, Powell, Orrell et al., 2014); and their attitudes toward robots as supportive devices (Pigini, Facal, Blasi et al., 2012; Scopelliti, Giuliani, & Fornara, 2005). These issues constituted four subtopics: types of technology, acceptance of technology, value of technology, and the importance of stigma.

Types of technology. A number of papers discussed a wide variety of experiences with different types of digital technology: the Internet (Briggs & Thomas, 2015; Larsson, Larsson-Lund, & Nilsson, 2013); videos (Ferreira, Sayago, & Blat, 2014); email (Sayago & Blat, 2010); telecare systems (Šimšík, Galajdová, Siman et al., 2012); a television-based information system (Ferreira et al., 2014); various features of smart homes (Brajnik & Giachin, 2014; Leitner, Fercher, Felfernig et al., 2012); and domestic robots (Heerink, Kröse, Wielinga et al., 2009; Pigini et al., 2012; Scopelliti et al., 2005). As one example, Sáenz-de-Urturi, Zapirain, and Zorrilla (2015) investigated the suitability of a Kinect-based game for rehabilitation exercises for older users. Since the participants in their study included wheelchair users, people with Parkinson's, people with one hand, and people with vision impairments, they needed to adapt the technology and make the games configurable for people in different situations. After testing the prototype, they also made other adjustments to the game presentation (e.g., animated instructions rather than text to read, larger fonts for scores). The game required reaching out for objects, thus creating exercises to use the arms, as well as activating cognitive processes, because players have to recognize the objects to catch from amongst other objects. They found that participants became absorbed in the game and engaged in the exercises as part of the game.

Acceptance of technologies. Much research has attempted to investigate acceptance of digital technologies (Bentley et al., 2014; Heerink et al., 2009), and researchers have been able to develop nuanced accounts of barriers to take-up. For example, Kurniawan's (2008) survey of older people revealed that mobile phones were perceived to increase their feelings of safety, particularly when they felt themselves in vulnerable, or potentially vulnerable, situations such as being alone, going out, getting lost, or being in trouble. Consequently, the phones were not used primarily in their communication or entertainment capacities. The study also reported various problems with learning to use the devices. Heerink et al. (2009) investigated whether the social abilities of robots and screen agents would influence their use by older people.
In an experiment, two types of agent, an onscreen avatar and a tabletop robot, were each implemented in a highly sociable and a less sociable condition. Older people were more comfortable with the more social agent, particularly with the robot. The authors concluded that social abilities are important to interaction and need to be implemented in intelligent support technologies for use by older people.

Value of technology. Sayago and Blat (2008; also Sayago et al., 2012), studying older people at a computer club, found they had great motivation to learn. They wanted to use email and share videos to maintain social communication, especially with their families. For this motivated group, the researchers concluded that reducing cognitive load was more important in the design of systems for these older people than interface design (e.g., screen size, button size). For example, to reduce cognitive load in their use of YouTube, the older people made use of familiar practices, such as copying and pasting links from emails, rather than querying the search engine; if using the search function, they typed complete sentences into the search box, rather than first using categories to narrow down the search (Sayago et al., 2012). In another study with older people learning to use computers, Turner et al. (2007) investigated the values older people placed on this activity. In practice this meant understanding the ways older people viewed their experiences and accounted for their learning difficulties and those of their peer group. Seven value-based explanations emerged: alienation ("not my world"); lack of fit with one's identity ("I worked with people not machines"); agency (the computer being in control, rather than the person, and in addition the pressure to use technology); anxiety; belief in being too old to learn; being too busy; and finally, questioning the purpose of learning to use computers. The researchers concluded that it is important to seek out older people's values and understandings of themselves in relation to digital technologies and help them to reframe these values in more optimistic and positive ways.

Larsson et al. (2013) investigated how older people's perceptions and experiences of Internet activities reflected more generally on their being able to participate in society. Older people perceived that not undertaking Internet-based activities implied being moved to the sidelines. For instance, one interviewee explained that in a group she participates in, the group leader sent out information by email, forgetting that not all participants have access to this technology. Participants also noted that services with traditional delivery, such as health services, now take much longer compared to Internet-based delivery. This study was conducted with older people who were open to technology, but who also cited conditions required for them to engage in Internet activities, such as support and continual use (so that they remember what they have learnt), as well as problems with trust (e.g., buying online). A further issue was whether some Internet activities (e.g., social networking sites) are useful for older people, since the most commonly cited need was communication with family and friends, which they accomplished via email and video links. These findings were echoed in other research with older people learning computer skills. Wanting to see whether commonly used measures of usability, such as time to complete a task, were relevant for older people, Sayago and Blat (2008) found slow task completion was not an issue: the participants valued accuracy more than efficiency and wanted to take their time.
For them, it was important not to make mistakes, for they often found they could not recover from mistakes without asking for help. The researchers also noted the importance of self-efficacy: they reported how one participant said that she enjoyed the feeling of being competent despite never having used a computer before and having a low level of education (Sayago et al., 2012).

The importance of stigma. Bentley et al. (2014) discovered that stigma was a strong reason for non-acceptance of digital technologies by older people. They investigated whether telecare products, such as pendants and panic button systems, were considered acceptable by older people who were not current users of such products. They found resistance to these systems: older people saw them as symbols of old age and loss of autonomy and claimed that their designs were stigmatizing as well as impractical. However, the overall concept of being able to summon help was considered useful and important, and participants acknowledged that they might use such systems in the future. Other studies of home deployment of technologies also found that stigma was a concern. For instance, Doyle, Bailey, Scanaill et al. (2014) explained that an alertness awareness cushion was specifically designed to fit in with the home environment, and not to look like a piece of assistive technology.

Topic 3: Older People's Use of Digital Technology for Communication and Social Interaction

Communication and social interaction are very important activities for health and well-being in later life, and the lessening of these activities poses risks as serious as those of cigarette smoking, high blood pressure, and obesity (Cohen, Underwood, & Gottlieb, 2000). It is therefore not surprising that there was a considerable amount of research on this topic, with 52 papers addressing it. Subtopics included social networking, facilitating interaction, motivations for interaction, intergenerational interaction, communication habits, obstacles to communication, reminiscing, and loneliness.

Social networking. Nine papers investigated the use of social networking sites (SNSs). A network analytic approach comparing different age groups (Arjan, Pfeil, & Zaphiris, 2008) revealed a number of interesting differences, including that, compared to younger people, older people had smaller networks of friends in SNSs and a greater variety in the age of their friends, and represented themselves in more formal ways. A study of older people in the UK and in Cyprus revealed the effect of different cultures on their attitudes to and use of online social support communities (Michailidou, Parmaxi, & Zaphiris, 2015). Older people in the UK who used such communities were happy to interact with people outside their family, but were reluctant to reveal too much about themselves, due to their fears about security in online situations. On the other hand, older people in Cyprus mostly used such communities to interact with family members: they were generally aware of and confident about online security issues, having discussed them with their families. Other studies about SNSs (Gibson, Moncur, Forbes et al., 2010; Lehtinen, Näsänen, & Sarvas, 2009) investigated older people's attitudes to such sites after the researchers demonstrated them and helped the older people register on the network. In both these studies, the older users (located in Finland and Scotland) reported they did not feel they needed this channel of communication: they were happy communicating with people they knew by email. Privacy was a problem for two reasons: older people were wary of giving information about themselves, and they worried about information accidentally becoming public. They also felt it was not socially acceptable to broadcast information about themselves on SNSs. Results from an online questionnaire (Prieto & Leahy, 2012) supported these findings, noting that the main reasons for not using SNSs were privacy issues, complexity of use, and friends not using them. Also, most older people who were users of SNSs had been using them for less than five years and got to know about them from family more than from friends. However, the researchers suggest that SNSs might be a more interesting way of introducing older people to computer usage than browsing websites. Studying older users and their social interactions offline and online, Harley, Howland, Harris et al. (2014) noted that older people were often passive users on Facebook, logging in to see what family members were doing (especially younger members, who did not use email to communicate with them) but not themselves posting on Facebook. Norval, Arnott, and Hanson (2014) proposed recommendations for making SNSs more usable for older people who had expressed interest in using such sites, but for whom the complexity of the applications was a barrier. Finally, Coelho, Rito, Luz et al. (2015) investigated how to make interactions on SNSs easier for older people, using familiar technologies like television and alternative interaction types such as speech and tapping on tablets. They also identified the functions that older people most value: to share photos and television content with family and close friends, and to be able to manage different groups of acquaintances.

Facilitating interaction. On the theme of facilitating interaction with communication and social interaction technologies, Spreicer, Ehrenstrasser, and Tellioğlu (2012) investigated tangible interfaces (interfaces that include physical objects, using tokens to represent different functions, e.g., for calling or for sending photos) and explored the idea of personalized tokens that could be placed on a surface to initiate actions. The result was a playful interface design using familiar objects (a simple sketch of this token-to-action idea appears below, after the discussion of communication habits). Older participants in workshops reacted to this concept very positively. They supplied meaningful objects from their own collections, the researchers enhanced these with RFID tags, and when placed on a special surface, these objects would, for example, start a Skype call or send an email. One much appreciated aspect of the design was a reduced demand for space in the homes of older people, a need supported by other studies (e.g., Doyle, Skrba, McDonnell et al., 2010).

Motivations for interaction. A number of papers investigated motivations for communication and social interaction with digital technologies by investigating what older people did with existing mainstream technologies. Conci, Pianesi, and Zancanaro (2009) showed that older people perceived mobile phones to be primarily a utilitarian device for enhancing safety, and that support with use was needed even with practice. Unlike younger users, older users showed little enjoyment or fulfilment in using their phones.
Trying to understand the needs of people transitioning from working life to retirement, Salovaara, Lehmuskallio, Hedman et al. (2010) showed that for many older people the Internet, and in particular email and online calendars, had become important means of maintaining and even initiating new social contacts. The older people felt that these tools helped them to cope with the stresses and conflicts of the transition from working to retirement, which involved new activities and commitments, housing arrangements, and so on.

Intergenerational interaction. A number of papers investigated the theme of intergenerational interaction. Staying in contact with grandchildren is a major motivation for older people to engage with and learn to use digital technologies. Studies by Vutborg, Kjeldskov, Vetere et al. (2010) and Fuchsberger, Sellner, Moser et al. (2012) described systems to facilitate interactions between grandparents and grandchildren. Fuchsberger and colleagues were able to show that the motivation to use a technology for this purpose was very strong, even though users had low computer skills and found it difficult to use. Gamliel and Gambay (2014) investigated intergenerational teaching programs in schools, where children and older people taught one another, and found that older people showed strong learning motivation and took the assignments set by the children about learning how to use technology very seriously. Finally, in research about encouraging older people's social interaction amongst themselves at a community center by playing games, Mubin, Shadid, and Mahmud (2008) reported that the participants were keen to include their grandchildren in the activity.

Communication habits. Researchers have also examined the nature of older people's habits with their communication technologies. Many older people prefer to sustain close relationships that are meaningful to them, rather than seek to make new acquaintances (Lindley, Harper, & Sellen, 2008), and are prepared to spend time keeping in touch with valued friends and maintaining family links (Lindley, Harper, & Sellen, 2009). Sokoler and Svensson (2007) concentrated on ways to include technologies for enabling social interaction that would not stigmatize older users as lonely people craving companionship. Dowds and Masthoff (2015) described a system to provide live video feeds for people who are unable to visit each other in person; the idea is that the "window on the outside world" will be stimulating and may lead to a desire to participate online in other activities. Doyle et al. (2010) reported on the deployment of a touchscreen device for communication activities. The device was designed to broadcast some content, with health suggested as being of particular interest to older people. The hypothesis was that the broadcast content would act as the trigger to begin interactions, as older people might send messages or call one another to comment on the broadcast program. In fact, it was found that the broadcasts were not much attended to, partly due to the fixed broadcast times. However, the long deployment period (7-9 weeks) yielded much information about how older people felt about such communication. While they agreed it would be useful for people who are housebound, particularly for calling and messaging, they were concerned about issues of disturbance and availability. Finally, Otjacques, Krier, Feltz et al. (2009) conducted an exploratory study in a large residential care facility of a social activities management system, allowing residents to book places on outings and events. The researchers noted a tendency for the physical spaces where the technologies were installed to become face-to-face meeting places for residents.
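Returning to the tangible-token design described under facilitating interaction above, the sketch below illustrates the general token-to-action mapping such an interface relies on. It is a hypothetical illustration rather than the system built by Spreicer, Ehrenstrasser, and Tellioğlu (2012); the tag identifiers, contacts, and action functions are all invented.

```python
# Illustrative sketch of a tangible, token-based interface: placing a tagged object
# on the reading surface looks up an action for that tag. Tag IDs, contact names,
# and actions are invented; this is not the cited system's actual implementation.

def start_video_call(contact: str) -> None:
    print(f"Starting a video call with {contact}...")

def send_photo(contact: str) -> None:
    print(f"Sending the latest photo to {contact}...")

# Each personal object carries an RFID tag; each tag ID is mapped to one action.
TOKEN_ACTIONS = {
    "tag-0001": lambda: start_video_call("daughter"),
    "tag-0002": lambda: send_photo("grandson"),
}

def on_token_placed(tag_id: str) -> None:
    """Called by the reader surface when a tagged object is detected."""
    action = TOKEN_ACTIONS.get(tag_id)
    if action is not None:
        action()
    else:
        print("Unknown object; nothing happens.")

on_token_placed("tag-0001")  # -> starts a video call with the daughter
```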
Obstacles to communication. Several papers discussed specific health obstacles to communication, such as aphasia and dementia, and how technology could be used to help with these. Tixier and Lewkowicz (2015) aimed to increase communication between family carers of people with Alzheimer's disease, often spouses and hence older people themselves. Their results showed that online support could help facilitate meeting arrangements and help continue communication between carers of people with Alzheimer's. Such support would reinforce the interaction between carers that was already taking place but was only maintained via face-to-face meetings. Kalman, Geraghty, Thompson et al. (2012) attempted to indirectly diagnose aphasia, showing that it could be reliably detected in online messages. Mahmud, Limpens, and Martens (2013) investigated the design of a tool for manipulating digital photographs to be used to communicate everyday happenings and stories. Both the researchers dealing with aphasia and those dealing with dementia sought to stimulate social interactions, making use of technologies to initiate reminiscence, which has been shown to be beneficial for older people.

Reminiscing. Supporting reminiscing in older people as a way of stimulating basic social interactions was the goal of Nijhof, van Hoof, van Rijn et al. (2014) and Siriaraya and Ang (2014). Nijhof and colleagues compared two games, one supported by technologically enhanced objects made to look like familiar objects such as a television or a telephone. When these enhanced objects were switched on, they played a fragment of music or a movie clip to trigger memories. The researchers studied older people's responses to these enhanced objects, such as smiling, laughing out loud, making gestures, singing, and answering with a short answer or with a story. There were no significant differences between responses to the technologically enhanced game and a traditional one. The facilitators of the activities, who were staff in the institutions where the players lived, gave feedback on the designs. For instance, they noted that the television, as a visual tool, was the most successful of the triggers, whereas the telephone was confusing, because when it rang and was answered, it started playing music instead of a voice being heard. The staff felt that the enhanced objects could help trigger more responses with less prompting by the facilitators if different types of content were used (more general subjects, like nature and animals), and that they could be very useful in bringing more novel approaches into stimulating communication with the older people. Siriaraya and Ang (2014) created a virtual world environment for people with dementia in a care home. They found that older people were attracted to the wonderland character of the virtual world, and that it triggered reactions from some residents, who began to talk, reminisce, and tell stories, helped and encouraged by the care staff.

Loneliness. Finally, although research on communication and social interaction often mentioned social isolation, only one study specifically considered loneliness amongst older people. Van der Heide, Willems, Spreeuwenberg et al. (2012) investigated mitigating loneliness with a television-based system allowing older people living independently to interact with carers, family, and friends. A large number of older people (130) completed a questionnaire at the beginning of the study and again a year later. Their responses were assessed in relation to both emotional loneliness (missing an intimate relationship) and social loneliness (missing a wider social network).
Analysis showed that use of the system for social interaction was positive and that feelings of both emotional and (more so) social loneliness were reduced.


Topic 4: Using Digital Technologies to Assist Older People: Monitoring Older People's Welfare

Research on monitoring older people's welfare has investigated various aspects of safety and security. These included ways to monitor sleep, wandering, falls, and risky behaviors by recording vital signs or tracking people's movements. Indications that people may need help included an irregular pulse or a lack of movement. Since falls are a major source of accidents, and anxiety about them is high, fall prevention is an active area of research (Kepski & Kwolek, 2012; Oberzaucher, Jagos, Zödl et al., 2010; Schikhof, Mulder, & Choenni, 2010). Older people were also monitored for activity patterns, to detect behaviors that might be unusual, for instance spending a long time in the corridor or opening the front door. Other possibilities included monitoring activities of daily living by interpreting data from sensors placed in various parts of the home (e.g., on the fridge, in the bathroom) (Lexis, Everink, van der Heide et al., 2013), interpreting sleep behavior (Carey-Smith, Evans, & Orpwood, 2013; Nijhof, van Gemert Pijnen, de Jong et al., 2012), and locating people who may have wandered out of the house or who may be exhibiting erratic behaviors, e.g., not completing normal routines. The subtopics were technologies for monitoring, quality of life, ethical concerns, and the beneficiaries of monitoring systems.

Technologies for monitoring. The term monitoring conjures up visions of people being under surveillance by closed-circuit TV cameras, possibly without their knowledge, but in fact a range of technologies has been developed that track older people's movements and vital functions in ways that are transparent to the users. Some are designed to enable older people to control their home environment with smart home technologies (Abascal, de Castro, LaFuente, & Cia, 2008). Such technologies can be configured to individuals' needs: for example, where mobility is an issue, they can enable remote opening and closing of windows, curtains, and doors, or remote checking of who is at the front door. Similarly, environmental sensor-activated systems (Lexis et al., 2013) can be set up to switch on lights as an older person comes into a room, to keep rooms at appropriate temperatures, to check that appliances are not left on, and so on. Other monitoring technologies include a range of wearables such as watches, belts, or pendants (Ahanathapillai, Amor, & James, 2015; Holliday, Ward, Fielden et al., 2015; Nijhof et al., 2012), and even shoe insoles (Oberzaucher et al., 2010). The purpose of these is to monitor vital signs (e.g., heartbeat or pulse), to send alerts (e.g., that it is time to take medication, or a call for help in an emergency), or to monitor gait to prevent falls (as noted earlier, a common and serious occurrence amongst older people). Recently, robotic devices (e.g., Mehdi & Berns, 2014) have been developed to search autonomously for an older person and check their status, rather than have them under constant human supervision. Although most of the technologies are for indoor use, in private homes (e.g., Abascal et al., 2008; Casas, Marin, Robinet et al., 2008; Lozano, Hernáez, Picón et al., 2010; Orpwood, Gibbs, Adlam et al., 2005) or in assisted care settings (e.g., Lexis et al., 2013; Martin, Nugent, Wallace et al., 2007; Schikhof et al., 2010), some have been developed for outdoor use, to allow people to move outside but still be protected from getting lost when wandering (Boulos, Anastasiou, Bekiaris et al., 2011; Wan, Müller, Wulf et al., 2014).

Quality of life. Beyond concerns with physical safety and security, research in the monitoring topic investigated general quality of life. For instance, Schikhof et al. (2010) found that care staff in an assisted living facility expressed a concern about their charges, older people with dementia, having panic attacks while alone in their rooms. One of the technological solutions proposed and tested as a result was a system to detect whether older people with dementia in the facility were having a panic attack; if an attack were detected, the system would help the care staff to intervene quickly to comfort and reassure them.

Ethical concerns. Ethics was an important recurring subtopic. Some papers addressed it only in passing, as it was not the main thrust of the work being reported, but it was nevertheless an important dimension of the type of work being undertaken. For example, there is a fine divide between tracking, monitoring, and surveillance (Holzinger, Searle, Kleingerger et al., 2008). Other papers treated this theme more fully: for instance, investigating the notion of trust (Ahanathapillai et al., 2015) and the ambivalence of feelings regarding freedom versus monitoring (Boström, Kjellström, & Björklund, 2013), while Casas, Marco, Falcó et al. (2006) developed the basis for an ethics framework associated with digital technologies for older people.

Beneficiaries of monitoring systems. In many cases the end-users of the monitoring systems were not older people, but family members, informal and professional carers, and nursing staff. Older people being monitored had a largely passive role, although in some cases they were in control of the system (Boulos et al., 2011; Holliday et al., 2015; Lexis et al., 2013). The primary beneficiaries of the monitoring systems were, however, considered to be the older people being monitored; such systems aimed to give them a sense of safety and well-being (Orpwood et al., 2005; Schikhof et al., 2010). There was also benefit for caregivers: professional care staff could better manage time that necessarily had to be divided between a number of older people (Schikhof et al., 2010) and could better tailor care (Boström et al., 2013; Carey-Smith et al., 2013; Lexis et al., 2013; Nijhof et al., 2012), and families and carers gained some peace of mind.
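As an illustration of the kind of rule-based interpretation of home sensor data described above, the sketch below flags two of the patterns mentioned, prolonged inactivity and a front door opened at night, for a carer's attention. It is a generic, hypothetical example: none of the systems cited necessarily work this way, and the event format and thresholds are invented.

```python
# Hypothetical sketch of rule-based welfare monitoring from home sensor events.
# Events are (timestamp, sensor_name) tuples; rules and thresholds are invented
# for illustration and do not describe any of the systems cited above.
from datetime import datetime, timedelta

def check_for_alerts(events, now, inactivity_limit=timedelta(hours=6)):
    """Return human-readable alerts for a carer based on simple rules."""
    alerts = []

    # Rule 1: no movement detected anywhere in the home for too long.
    motion_times = [t for t, sensor in events if sensor.startswith("motion_")]
    last_motion = max(motion_times, default=None)
    if last_motion is None or now - last_motion > inactivity_limit:
        alerts.append("No movement detected for over {} hours".format(inactivity_limit.seconds // 3600))

    # Rule 2: the front door was opened during the night (possible wandering).
    for t, sensor in events:
        if sensor == "front_door" and (t.hour >= 23 or t.hour < 6):
            alerts.append("Front door opened at {:%H:%M}".format(t))

    return alerts

events = [
    (datetime(2017, 5, 3, 14, 10), "motion_kitchen"),
    (datetime(2017, 5, 4, 2, 30), "front_door"),
]
print(check_for_alerts(events, now=datetime(2017, 5, 4, 8, 0)))
```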

Reflections on the Research on Uses of Digital Technology for Older People

Particular Subtopics within Topics

Our review of recent research on the use of digital technologies for older people shows that this is a vibrant area of research, with much activity on many different topics and over 400 papers identified. The four topics chosen for detailed discussion in this chapter demonstrate a range of themes and subtopics that together give a good representation of the questions investigated by researchers.

In the first topic, older people's interaction with digital technologies, in addition to the obvious subtopics corresponding to the interaction types of current technologies, such as natural dialogue and touch, there are also intriguing insights, such as the lack of concern for speed and the greater ease with face-based than picture-based authentication systems. The second topic, older people's lived experience of digital technologies, illustrated the wide range of technologies, both established and emergent, being investigated, from technologies deployed in smart homes to those employed in health and well-being. Within this topic, researchers investigated older people's acceptance of technologies, developing a deeper understanding of their value systems and beliefs, including their dislike of technologies that declared too obviously that they needed assistance. The third topic, older people's use of digital technology for communication and social interaction, moved to a specific application area, although one involving many types of digital technologies. Social networking sites (SNSs) featured prominently, and age differences in their use were particularly interesting, showing that older people had smaller networks of friends and were mostly passive users of such systems, feeling that broadcasting information about oneself publicly is not socially acceptable. Researchers often investigated ways to facilitate digital interaction for older people, for example in terms of interface design, but they also investigated older people's motivations for using these technologies, for instance keeping in touch with their families, particularly grandchildren. They sought to understand better what older people's communication habits are, and also the obstacles to communication. They found that reminiscing, a well-known technique to encourage social communication, could be encouraged with some technologies, and that loneliness could be reduced. Finally, the fourth topic, monitoring older people's welfare, was chosen as an example of an area in which emerging technologies are being deployed in the care of older people. Besides the range of ways to monitor, the subtopics that emerged were quality of life and ethical concerns, but also a call for clarity about who the beneficiaries of such systems are.

Two Themes across the Four Topics

Although the papers discussed in detail in this chapter covered four different topics, a number of themes recurred across the topics. Here, we highlight two of these themes: control and familiarity.

The issue of who is in control of digital technologies came up in many ways throughout the review. For instance, in the longitudinal study by Leitner et al. (2012), in which older people kept equipment for 36 months, participants were able to pick and choose what they wanted installed, gradually gaining confidence and knowing they could ask for components to be removed. Similarly, Lexis et al. (2013) helped older people to understand the kind of data being collected from sensors in their bathrooms: they had imagined it might be photographs but were shown that it was just numbers. Orpwood et al. (2005) found that a common problem for older people with dementia is flooding caused by bathroom or kitchen taps being left on. An engineering solution to this problem could use a sensor so that the water supply turns off when the water reaches too high a level. However, this would take control away from the older person and could confuse them, as they would find later that the taps no longer work. Instead, a system of reminder messages triggered by the sensor was proposed (a brief illustrative sketch of this approach is given at the end of this section). Scopelliti et al. (2005) found that older people were more apprehensive than younger people at the prospect of a robot in the home, and so wanted to be in control of it. Accordingly, they expressed preferences for robots that were small and slow moving, with limited autonomy and fixed, well-defined tasks. The importance of older people feeling in control of their environment was also highlighted in the research by Doyle et al. (2014) and Pigini et al. (2012). One of the technologies deployed in people’s homes in the study by Doyle et al. (2014), a balance and exercise system, was meant to use a chair, but this was cumbersome and took up too much space. The kitchen sink was then proposed by the participants themselves as a stable place to hold onto while doing exercises, even though this meant the camera and screen had to be positioned in the kitchen. Thus, the older people reconfigured the positioning of the new technology themselves. Pigini et al. (2012) found that older people voiced fears that a robot might be uncontrollable and clumsy, and damage or break things. These older people expressed strong psychological attachments to their homes, furniture, and ornaments, and said they would prefer no technology rather than technology they could not control and that might therefore harm those possessions.

The second theme, familiarity, in the context of this review refers to building upon older people’s existing knowledge and learning strategies (Ballegaard, Pedersen, & Bardram, 2006; Lehtinen et al., 2009). There was much support for the idea that at different stages in their lives people use different strategies when learning to use technology: trial and error is favored by young people, the reading of instructions and manuals by older people; in addition, older people often prefer to ask experts for help (Larsson et al., 2013; Leitner et al., 2012). This was found across numerous settings and technologies, from older people’s behavior in computer classes to their learning to use home monitoring systems. Following the principle of familiarity also means that the cognitive load of learning new routines is lessened, reducing the negative impact on the perceived utility of the technology (Heerink et al., 2009; Sayago & Blat, 2010). Familiarity also referred to the technology fitting in with people’s routines or their physical environments. For example, the importance of building on objects and systems that people are already familiar with was discussed by Doyle et al. (2014) and Holzinger, Schaupp, and Eder-Halbedl (2008), and that of fitting new technologies appropriately into older people’s lived routines by Dickinson and Gregor (2006) and Orpwood et al. (2005).
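As an illustration of this design principle, the following is a minimal, hypothetical sketch (in Python) of a reminder-based approach of the kind proposed by Orpwood et al. (2005); it is not a reconstruction of their actual system. A water-level sensor triggers escalating spoken reminders and, only as a last resort, notifies a carer, so that control of the taps is never taken away from the older person. The sensor, audio, and carer-notification functions are placeholder names introduced here purely for illustration.

```python
# Hypothetical sketch only -- not the system reported by Orpwood et al. (2005).
# Pattern: a water-level sensor triggers escalating reminders rather than an
# automatic shut-off, so the older person remains in control of the taps.
import time

REMINDERS = [
    "The water in the basin is getting high. Please turn off the tap.",
    "The tap is still running. Please turn it off now.",
]


def water_level_high() -> bool:
    """Placeholder sensor read; a real system would poll a float switch or level sensor."""
    return False


def play_reminder(message: str) -> None:
    """Placeholder audio prompt, ideally a familiar voice played in the person's own home."""
    print(message)


def notify_carer() -> None:
    """Placeholder escalation: involve a person rather than cutting off the water supply."""
    print("Reminders not acted on; notifying carer.")


def monitor_taps(check_interval_s: float = 30.0) -> None:
    """Poll the sensor, escalate gently, and reset once the tap has been turned off."""
    reminders_given = 0
    while True:
        if water_level_high():
            if reminders_given < len(REMINDERS):
                play_reminder(REMINDERS[reminders_given])
                reminders_given += 1
            else:
                notify_carer()
        else:
            reminders_given = 0  # the person dealt with it themselves; no intervention
        time.sleep(check_interval_s)
```

The key design choice the sketch encodes is that the system never acts on the water supply itself; it only informs the older person, and escalates to a human when reminders are not acted on.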

Limitations and Future Research

It is important to note some of the limitations of the studies, as well as areas that could benefit from further research. In all the disciplines with research on older people, chronological age is typically used as the measure of old age. But this is not a reliable guide, since there is a great deal of heterogeneity among older people, and even though a more specific set of groupings is sometimes used, for example “young-old”, “old-old”, and “oldest-old” (Petrie, 2001), people vary in experience and abilities. Particularly at present, in terms of digital literacy, someone aged 60 may have used computers in the workplace, while someone aged 80 may not have. The questionnaire developed by Arning and Ziefle (2008) to assess computer experience and expertise was an interesting attempt to address this problem. Physical and mental health also vary considerably across older people of different ages, and even among people of the same age.

A further aspect that could offer more nuanced understandings of older people’s use of digital technologies is to address contextual and cultural differences in research. Many of the studies investigated technological practices embedded in a particular societal and organizational setting, such as residential assisted care homes in Holland (Nijhof et al., 2014), occupational therapy in Sweden (Molin et al., 2007), and computer classes for older people in Spain (Sayago & Blat, 2008). There were also instances in the research where it was clear that cultural practices had an important effect on the outcomes of studies. The higher amount of religious content watched on television in Brazil compared to Spain meant that a proposed interactive television service based around religious content was of more interest to older people in Brazil but did not work so well in Spain (Ferreira et al., 2014). Older people in the UK and in Cyprus revealed the effect of different cultures in their attitudes to, and use of, online social support communities (Michailidou et al., 2015). The different levels of Internet penetration in different countries were also important. Older people in Denmark mentioned that airline tickets could only be booked online and that many government services were online (Ferreira et al., 2014), and Internet-based healthcare services were available to older people in Sweden (Larsson et al., 2013), but such services are not yet available in other countries. Other culture-based attitudes were noted by Pigini et al. (2012): older people in Germany, Italy, and Spain attached a similar level of importance to food preparation, and the suggestion that a robot could help prepare meals by heating food in a microwave was considered a useless function, yet participants from Germany and Italy objected more strongly to the proposed robot cooking functions than did their Spanish counterparts. Such results highlight the challenges of cultural influences on digital technology use and attitudes amongst older people.

It is also important that researchers disseminate their results back to the appropriate diverse disciplines. Awareness of issues and of new developments is important within and across disciplines, as seen in papers that dealt with lack of awareness about telecare and fall alert systems (Bentley et al., 2014), or with health and social care professionals who need help to bridge gaps in organizational knowledge about what technology is available and how to determine what is suitable for the older people in their care (Molin et al., 2007).

We have attempted to illustrate the range of research on digital technologies for older people. Research in this area is particularly challenging, as it needs to draw on work from disciplines as different as gerontology and engineering. It is also vital to work very closely with the relevant users: older people themselves, but also other stakeholders such as family members, carers, and professionals, to ensure that digital technologies are useful to, and acceptable, understandable, and usable by, older people and their caregivers.

Conclusion

This review has shown that research on digital technologies for older people, both the use of mainstream technologies and the use of specially developed technologies, is a very diverse area of endeavor, with many lines of research on a wide range of themes. Research ranges from studies that are developing new methods to help older people physically interact with digital technologies to those exploring the meanings of digital technologies for older people. As with all research, the more we explore these topics, the more questions we raise.

Acknowledgments

The research for this chapter has been partly funded by the European Union under the Marie Skłodowska-Curie Action Experienced Researcher Fellowship Program, as part of the Education and Engagement for inclusive Design and Development of Digital Systems and Services Project (E2D3S2, Grant No. 706396). We would like to thank Bláithín Gallagher and Leonardo Sandoval for their help in gathering material for this chapter.

References

Abascal, J., de Castro, I. F., Lafuente, A., & Cia, J. M. (2008). Adaptive interfaces for supportive ambient intelligence environments. In Proceedings of 11th international conference on computers helping people with special needs (ICCHP ‘08) (pp. 30–37).
Ahanathapillai, V., Amor, J. D., & James, C. J. (2015). Assistive technology to monitor activity, health and wellbeing in old age: The wrist wearable unit in the USEFIL project. Technology and Disability, 27(1–2), 17–29.
Apted, T., Kay, J., & Quigley, A. (2006). Tabletop sharing of digital photographs for the elderly. In Proceedings of the SIGCHI conference on human factors in computing systems (CHI 2006) (pp. 781–790).
Arjan, R., Pfeil, U., & Zaphiris, P. (2008). Age differences in online social networking. In Extended abstracts on human factors in computing systems (CHI EA ‘08) (pp. 2739–2744).
Arning, K., & Ziefle, M. (2008). Development and validation of a computer expertise questionnaire for older adults. Behaviour and Information Technology, 27(1), 89–93.
Australian Research Council. (2012). Excellence in research for Australia (ERA): ERA outcomes 2012. Available at: http://www.arc.gov.au/era-outcomes-2012
Ballegaard, S. A., Pedersen, J. B., & Bardram, J. E. (2006). Where to, Roberta? Reflecting on the role of technology in assisted living. In Proceedings of the 4th Nordic conference on human-computer interaction (NordiCHI ‘06) (pp. 373–376).
Bentley, C. L., Powell, L. A., Orrell, A., & Mountain, G. A. (2014). Addressing design and suitability barriers to telecare use: Has anything changed? Technology and Disability, 26(4), 221–235.

Boström, M., Kjellström, S., & Björklund, A. (2013). Older persons have ambivalent feelings about the use of monitoring technologies. Technology and Disability, 25(2), 117–125.
Boulos, M. N. K., Anastasiou, A., Bekiaris, E., & Panou, M. (2011). Geo-enabled technologies for independent living: Examples from four European projects. Technology and Disability, 23(1), 7–17.
Brajnik, G., & Giachin, C. (2014). Using sketches and storyboards to assess impact of age difference in user experience. International Journal of Human-Computer Studies, 72, 552–566.
Briggs, P., & Thomas, L. (2015). An inclusive, value sensitive design perspective on future identity technologies. ACM Transactions on Computer-Human Interaction, 22(5), art. 23.
Carey-Smith, B., Evans, N. M., & Orpwood, R. (2013). A user-centred design process to develop technology to improve sleep quality in residential care homes. Technology and Disability, 25(1), 49–58.
Carrasco, E., Epelde, G., Moreno, A., Ortiz, A., Garcia, I., Buiza, C. et al. (2008). Natural interaction between avatars and persons with Alzheimer’s disease. In Proceedings of 11th international conference on computers helping people with special needs (ICCHP ‘08) (pp. 38–45).
Casas, R., Marco, Á., Falcó, J. L., Artigas, J. I., & Abascal, J. (2006). Ethically aware design of a location system for people with dementia. In Proceedings of 10th international conference on computers helping people with special needs (ICCHP ‘06) (pp. 777–784).
Casas, R., Marín, R. B., Robinet, A., Delgado, A. R., Yarza, A. R., McGinn, J. et al. (2008). User modelling in ambient intelligence for elderly and disabled people. In Proceedings of 11th international conference on computers helping people with special needs (ICCHP ‘08) (pp. 114–122).
Coelho, J., Rito, F., Luz, N., & Duarte, C. (2015). Prototyping TV and tablet Facebook interfaces for older adults. In J. Abascal, S. Barbosa, M. Fetter, T. Gross, P. Palanque, & M. Winckler (Eds.), Proceedings of human-computer interaction (INTERACT 2015). Lecture Notes in Computer Science, 9296 (pp. 110–128).
Cohen, S., Underwood, L. G., & Gottlieb, B. H. (2000). Social support measurement and intervention: A guide for health and social scientists. Oxford, UK: Oxford University Press.
Conci, M., Pianesi, F., & Zancanaro, M. (2009). Useful, social and enjoyable: Mobile phone adoption by older people. In T. Gross et al. (Eds.), Proceedings of human-computer interaction (INTERACT 2009). Lecture Notes in Computer Science, 5726 (pp. 63–76).
Diaz-Orueta, U., Etxaniz, A., Gonzalez, M. F., Buiza, C., Urdaneta, E., & Yanguas, J. (2014). Role of cognitive and functional performance in the interactions between elderly people with cognitive decline and an avatar on TV. Universal Access in the Information Society, 13(1), 89–97.
Dickinson, A., & Gregor, P. (2006). Computer use has no demonstrated impact on the wellbeing of older adults. International Journal of Human-Computer Studies, 64(8), 744–753.
Dowds, G., & Masthoff, J. (2015). A virtual “window to the outside world”: Initial design and plans for evaluation. In Proceedings of the 2015 British HCI conference (British HCI ‘15) (pp. 265–266).
Doyle, J., Bailey, C., Ni Scanaill, C., & van den Berg, F. (2014). Lessons learned in deploying independent living technologies to older adults’ homes. Universal Access in the Information Society, 13(2), 191–204.
Doyle, J., Skrba, Z., McDonnell, R., & Arent, B. (2010). Designing a touch screen communication device to support social interaction amongst older adults. In Proceedings of the 24th BCS interaction specialist group conference (BCS HCI ‘10) (pp. 177–185).
Ferreira, S. M., Sayago, S., & Blat, J. (2014). Towards iTV services for older people: Exploring their interactions with online video portals in different cultural backgrounds. Technology and Disability, 26(4), 199–209.

Fuchsberger, V., Sellner, W., Moser, C., & Tscheligi, M. (2012). Benefits and hurdles for older adults in intergenerational online interactions. In Proceedings of 13th international conference on computers helping people with special needs (ICCHP 2012) (pp. 697–704).
Gamliel, T., & Gabay, N. (2014). Knowledge exchange, social interactions, and empowerment in an intergenerational technology program at school. Educational Gerontology, 40(8), 597–617.
Gibson, L., Moncur, W., Forbes, P., Arnott, J., Martin, C., & Bhachu, A. S. (2010). Designing social networking sites for older adults. In Proceedings of the 24th BCS interaction specialist group conference (BCS HCI ‘10) (pp. 186–194).
Harley, D., Howland, K., Harris, E., & Redlich, C. (2014). Online communities for older users: What can we learn from local community interactions to create social sites that work for older people. In Proceedings of the 28th international BCS human computer interaction conference on HCI 2014 (BCS-HCI ‘14) (pp. 42–51).
Heerink, M., Kröse, B., Wielinga, B., & Evers, V. (2009). Measuring the influence of social abilities on acceptance of an interface robot and a screen agent by elderly users. In Proceedings of the 23rd British HCI group annual conference on people and computers: Celebrating people and technology (pp. 430–439).
Holliday, N., Ward, G., Fielden, S., & Williams, S. (2015). Exploration of information needs and development of resources to inform and support those at risk of falling. Technology and Disability, 27(1–2), 31–40.
Holzinger, A., Schaupp, K., & Eder-Halbedl, W. (2008). An investigation on acceptance of ubiquitous devices for the elderly in a geriatric hospital environment: Using the example of person tracking. In Proceedings of the 11th international conference on computers helping people with special needs (ICCHP 2008) (pp. 22–29).
Holzinger, A., Searle, G., Kleingerger, T., Seffah, A., & Javahery, H. (2008). Investigating usability metrics for the design and development of applications for the elderly. In Proceedings of 11th international conference on computers helping people with special needs (ICCHP 2008) (pp. 98–105).
Hwang, F., Hollinworth, N., & Williams, N. (2013). Effects of target expansion on selection performance in older computer users. ACM Transactions on Accessible Computing, 5(1), 1.
Jochems, N., Vetter, S., & Schlick, C. (2013). A comparative study of information input devices for aging computer users. Behaviour and Information Technology, 32(9), 902–919.
Jönsson, B., & Svensk, A. (1995). Isaac—a personal digital assistant for the differently abled. In I. Placencia-Porrero & R. Puig de la Bellacasa (Eds.), The European context for assistive technology (Proceedings of the 2nd TIDE Congress) (pp. 356–361). Amsterdam: IOS Press.
Kalman, Y. M., Geraghty, K., Thompson, C. K., & Gergle, D. (2012). Detecting linguistic HCI markers in an online aphasia support group. In Proceedings of the 14th international ACM SIGACCESS conference on computers and accessibility (ASSETS ‘12) (pp. 65–70).
Kepski, M., & Kwolek, B. (2012). Fall detection on embedded platform using Kinect and wireless accelerometer. In Proceedings of 13th international conference on computers helping people with special needs (ICCHP ‘12) (pp. 407–414).
Kurniawan, S. (2008). Older people and mobile phones: A multi-method investigation. International Journal of Human-Computer Studies, 66(12), 889–901.
Larsson, E., Larsson-Lund, M., & Nilsson, I. (2013). Internet Based Activities (IBAs): Seniors’ experiences of the conditions required for the performance of and the influence of these conditions on their own participation in society. Educational Gerontology, 39(3), 155–167.
Lehtinen, V., Näsänen, J., & Sarvas, R. (2009). “A little silly and empty-headed”: Older adults’ understandings of social networking sites. In Proceedings of British HCI 2009 (pp. 45–54).

Leitner, G., Fercher, A. J., Felfernig, A., & Hitz, M. (2012). Reducing the entry threshold of AAL systems: Preliminary results from Casa Vecchia. In Proceedings of 13th international conference on computers helping people with special needs (ICCHP 2012) (pp. 709–715).
Lepicard, G., & Vigouroux, N. (2010). Touch screen user interfaces for older subjects: Effect of the targets number and the two hands use. In Proceedings of 12th international conference on computers helping people with special needs (ICCHP ‘10) (pp. 592–599).
Lepicard, G., & Vigouroux, N. (2012). Comparison between single-touch and multi-touch interaction for older people. In Proceedings of 13th international conference on computers helping people with special needs (ICCHP ‘12) (pp. 658–665).
Lexis, M., Everink, I., van der Heide, L., Spreeuwenberg, M., Willems, C., & de Witte, L. (2013). Activity monitoring technology to support homecare delivery to frail and psychogeriatric elderly persons living at home alone. Technology and Disability, 25(3), 189–197.
Lindley, S. E., Harper, R., & Sellen, A. (2008). Designing for elders: Exploring the complexity of relationships in later life. In Proceedings of BCS HCI 2008 (pp. 77–86).
Lindley, S. E., Harper, R., & Sellen, A. (2009). Desiring to be in touch in a changing communications landscape: Attitudes of older adults. In Proceedings of the SIGCHI conference on human factors in computing systems (CHI ‘09) (pp. 1693–1702).
Lozano, H., Hernáez, I., Picón, A., Camarena, J., & Navas, E. (2010). Audio classification techniques in home environments for elderly/dependant people. In Proceedings of 12th international conference on computers helping people with special needs (ICCHP ‘10) (pp. 320–323).
Mahmud, A. A., Limpens, Y., & Martens, J.-B. (2013). Expressing through digital photographs: An assistive tool for persons with aphasia. Universal Access in the Information Society, 12(3), 309–326.
Martin, S., Nugent, C., Wallace, J., Kernohan, G., McCreight, B., & Mulvenna, M. (2007). Using context awareness within the “Smart home” environment to support social care for adults with dementia. Technology and Disability, 19(2–3), 143–152.
Mehdi, S. A., & Berns, K. (2014). Behaviour-based search of human by an autonomous indoor mobile robot in simulation. Universal Access in the Information Society, 13, 45–58.
Michailidou, E., Parmaxi, A., & Zaphiris, P. (2015). Culture effects in online social support for older people: Perceptions and experience. Universal Access in the Information Society, 14(2), 281–293.
Molin, G., Pettersson, C., Jonsson, O., & Keijer, U. (2007). Living at home with acquired cognitive impairment—Can assistive technology help? Technology and Disability, 19(2–3), 91–101.
Mubin, O., Shahid, S., & Mahmud, A. A. (2008). Walk 2 Win: Towards designing a mobile game for elderly’s social engagement. In Proceedings of the 22nd British HCI conference (BCS-HCI ‘08) (pp. 11–14).
Muskens, L., van Lent, R., Vijfvinkel, A., van Cann, P., & Shahid, S. (2014). Never too old to use a tablet: Designing tablet applications for the cognitively and physically impaired elderly. In Proceedings of 14th international conference on computers helping people with special needs (ICCHP ‘14) (pp. 391–398).
Nicholson, J., Coventry, L., & Briggs, P. (2013a). Faces and pictures: Understanding age differences in two types of graphical authentications. International Journal of Human-Computer Studies, 71(10), 958–966.
Nicholson, J., Coventry, L., & Briggs, P. (2013b). Age-related performance issues for PIN and face-based authentication systems. In Proceedings of the SIGCHI conference on human factors in computing systems (CHI ‘13) (pp. 323–332).

Nicolau, H., & Jorge, J. (2012). Elderly text-entry performance on touchscreens. In Proceedings of the 14th international ACM SIGACCESS conference on computers and accessibility (ASSETS ‘12) (pp. 127–134).
Nijhof, N., van Gemert-Pijnen, J. E. W. C., de Jong, G. E. N., Ankoné, J. W., & Seydel, E. R. (2012). How assistive technology can support dementia care: A study about the effects of the IST Vivago watch on patients’ sleeping behavior and the care delivery process in a nursing home. Technology and Disability, 24(2), 103–115.
Nijhof, N., van Hoof, J., van Rijn, H., & van Gemert-Pijnen, J. E. W. C. (2014). The behavioral outcomes of a technology-supported leisure activity in people with dementia. Technology and Disability, 25(4), 263–273.
Norval, C., Arnott, J. L., & Hanson, V. L. (2014). What’s on your mind? Investigating recommendations for inclusive social networking and older adults. In Proceedings of the SIGCHI conference on human factors in computing systems (CHI ‘14) (pp. 3923–3932).
Nunes, F., Kerwin, M., & Silva, P. A. (2012). Design recommendations for TV user interfaces for older adults: Findings from the eCAALYX project. In Proceedings of the 14th international ACM SIGACCESS conference on computers and accessibility (ASSETS ‘12) (pp. 41–48).
Oberzaucher, J., Jagos, H., Zödl, C., Hlauschek, W., & Zagler, W. (2010). Using a wearable insole gait analyzing system for automated mobility assessment for older people. In Proceedings of the 12th international conference on computers helping people with special needs (ICCHP ‘10) (pp. 600–603).
Orpwood, R., Gibbs, C., Adlam, T., Faulkner, R., & Meegahawatte, D. (2005). The design of smart homes for people with dementia—user-interface aspects. Universal Access in the Information Society, 4(2), 156–164.
Otjacques, B., Krier, M., Feltz, F., Ferring, D., & Hoffmann, M. (2009). Helping older people to manage their social activities at the retirement home. In Proceedings of the 23rd British HCI group annual conference on people and computers (BCS-HCI ‘09) (pp. 375–380).
Petrie, H. (2001). Accessibility and usability requirements for ICTs for disabled and elderly people: A functional classification approach. In J. G. Abascal & C. Nicolle (Eds.), Inclusive guidelines for human computer interaction (pp. 47–78). London: Taylor and Francis.
Petrie, H., Gallagher, B., & Darzentas, J. (2014). A critical review of eight years of research on technologies for disabled and older people. In Proceedings of 14th international conference on computers helping people with special needs (ICCHP ‘14) (pp. 260–266).
Petrie, H., Johnson, V., Strothotte, T., Michel, R., Raab, A., Reichert, L., & Schalt, A. (1997). User-centred design in the development of a navigational aid for blind travellers. In Proceedings of human-computer interaction: INTERACT ‘97 (IFIP TC13 international conference on human-computer interaction) (pp. 220–227). London: Chapman and Hall.
Pigini, L., Facal, D., Blasi, L., & Andrich, R. (2012). Service robots in elderly care at home: Users’ needs and perceptions as a basis for concept development. Technology and Disability, 24(4), 303–312.
Prieto, G., & Leahy, D. (2012). Online social networks and older people. In Proceedings of the 13th international conference on computers helping people with special needs (ICCHP ‘12) (pp. 666–672).
Renaud, K., & Ramsay, J. (2007). Now what was that password again? A more flexible way of identifying and authenticating our seniors. Behaviour and Information Technology, 26(4), 309–322.

Sáenz-de-Urturi, Z., Zapirain, B. G., & Zorrilla, A. M. (2015). Elderly user experience to improve a Kinect-based game playability. Behaviour and Information Technology, 34(11), 1040–1051.
Salovaara, A., Lehmuskallio, A., Hedman, L., Valkonen, P., & Näsänen, J. (2010). Information technologies and transitions in the lives of 55–65-year-olds: The case of colliding life interests. International Journal of Human-Computer Studies, 68(11), 803–821.
Sayago, S., & Blat, J. (2008). Exploring the role of time and errors in real-life usability for older people and ICT. In Proceedings of the 11th international conference on computers helping people with special needs (ICCHP 2008) (pp. 46–53).
Sayago, S., & Blat, J. (2010). Telling the story of older people e-mailing: An ethnographical study. International Journal of Human-Computer Studies, 68(1–2), 105–120.
Sayago, S., Forbes, P., & Blat, J. (2012). Older people’s social sharing practices in YouTube through an ethnographical lens. In Proceedings of the 26th annual BCS interaction specialist group conference on people and computers (BCS-HCI ‘12) (pp. 185–194).
Schikhof, Y., Mulder, I., & Choenni, S. (2010). Who will watch (over) me? Humane monitoring in dementia care. International Journal of Human-Computer Studies, 68(6), 410–422.
Scopelliti, M., Giuliani, M. V., & Fornara, F. (2005). Robots in a domestic setting: A psychological approach. Universal Access in the Information Society, 4(2), 146–155.
Šimšík, D., Galajdová, A., Siman, D., Bujňák, J., Andrášová, M., & Novák, M. (2012). MonAMI platform in elderly household environment: Architecture, installation, implementation, trials and results. In Proceedings of 13th international conference on computers helping people with special needs (ICCHP 2012) (pp. 419–422).
Siriaraya, P., & Ang, C. S. (2014). Recreating living experiences from past memories through virtual worlds for people with dementia. In Proceedings of the SIGCHI conference on human factors in computing systems (CHI ‘14) (pp. 3977–3986).
Sokoler, T., & Svensson, M. S. (2007). Embracing ambiguity in the design of non-stigmatizing digital technology for social interaction among senior citizens. Behaviour and Information Technology, 26(4), 297–307.
Spreicer, W., Ehrenstrasser, L., & Tellioğlu, H. (2012). kommTUi: Designing communication for elderly. In Proceedings of the 13th international conference on computers helping people with special needs (ICCHP ‘12) (pp. 705–708).
Thomson Reuters. (2013). The Thomson Reuters impact factor. Available at: http://ipsciencehelp.thomsonreuters.com/inCites2Live/indicatorsGroup/aboutHandbook/usingCitationIndicatorsWisely/jif.html
Tixier, M., & Lewkowicz, M. (2015). Looking for respite and support: Technological opportunities for spousal caregivers. In Proceedings of the 33rd annual ACM conference on human factors in computing systems (CHI ‘15) (pp. 1155–1158).
Turner, P., Turner, S., & Van De Walle, G. (2007). How older people account for their experiences with interactive technology. Behaviour and Information Technology, 26(4), 287–296.
United Nations (2017). World population prospects: Key findings and advance tables (2017 revision). New York: United Nations.
Vacher, M., Caffiau, S., Portet, F., Meillon, B., Roux, C., Elias, E., Lecouteux, B., & Chahuara, P. (2015). Evaluation of a context-aware voice interface for ambient assisted living: Qualitative user study vs quantitative system evaluation. ACM Transactions on Accessible Computing, 7(2), art. 5.
van der Heide, L. A., Willems, C. G., Spreeuwenberg, M. D., De Witte, L. P., & Rietman, J. (2012). Implementation of CareTV in care for the elderly: The effects on feelings of loneliness and safety and future challenges. Technology and Disability, 24(4), 283–291.

van Schaik, P., Petrie, H., & Kirby, V. (1995). Task performance and technology acceptance: The use of an automatic teller machine by elderly people. In Proceedings of the third European conference for the advancement of rehabilitation technology (ECART3) (pp. 62–64).
Vutborg, R., Kjeldskov, J., Vetere, F., & Pedell, S. (2010). Family storytelling for grandparents and grandchildren living apart. In Proceedings of NordiCHI 2010 (pp. 531–540).
Wan, L., Müller, C., Wulf, V., & Randall, D. W. (2014). Addressing the subtleties in dementia care: Pre-study and evaluation of a GPS monitoring system. In Proceedings of the SIGCHI conference on human factors in computing systems (CHI ‘14) (pp. 3987–3996).
Warnock, D., McGee-Lennon, M., & Brewster, S. (2013). Multiple notification modalities and older users. In Proceedings of the SIGCHI conference on human factors in computing systems (CHI ‘13) (pp. 1091–1094).
Wolters, M. K., Kilgour, J., MacPherson, S. E., Dzikovska, M., & Moore, J. D. (2015). The CADENCE corpus: A new resource for inclusive voice interface design. In Proceedings of the SIGCHI conference on human factors in computing systems (CHI 2015) (pp. 3963–3966).
Wulf, L., Garschall, M., Klein, M., & Tscheligi, M. (2014). The influence of age and device orientation on the performance of touch gestures. In Proceedings of the 14th international conference on computers helping people with special needs (ICCHP ‘14) (pp. 583–590).

Appendix: Publications Analyzed

Abascal, J., de Castro, I. F., Lafuente, A., & Cia, J. M. (2008). Adaptive interfaces for supportive ambient intelligence environments. In Proceedings of 11th international conference on computers helping people with special needs (ICCHP ‘08) (pp. 30–37). Adam, S., Mukasa, K. S., Breiner, K., & Trapp, M. (2008). An apartment-based metaphor for intuitive interaction with ambient assisted living applications. In Proceedings of the 22nd British HCI group annual conference (BCS-HCI ‘08) (pp. 67–75). Ahanathapillai, V., Amor, J. D., & James, C. J. (2015). Assistive technology to monitor activity, health and well being in old age: The wrist wearable unit in the USEFIL project. Technology and Disability, 27(1–2), 17–29. Ahmad, D., Komninos, A., & Baillie, L. (2008). Future mobile health systems: Designing personal mobile applications to assist self diagnosis. In Proceedings of the 22nd British HCI group annual conference (BCS-HCI ‘08) (pp. 39–42). Aksan, N., Dawson, J. D., Emerson, J. L., Yu, L., Uc, E. Y., Anderson, S. W., & Rizzo, M. (2013). Naturalistic distraction and driving safety in older drivers. Human Factors, 55(4), 841–853. Albinet, C., Tomporowski, P. D., & Beasman, K. (2006). Aging and concurrent task performance: Cognitive demand and motor control. Educational Gerontology, 32(9), 689–706. Alelis, G., Bobrowicz, A., & Ang, C. S. (2015). Comparison of engagement and emotional responses of older and younger adults interacting with 3D cultural heritage artefacts on personal devices. Behaviour and Information Technology, 34(11), 1064–1078. Al Mahmud, A., Limpens, Y., & Martens, J. B. (2013). Expressing through digital photographs: An assistive tool for persons with aphasia. Universal Access in the Information Society, 12(3), 309–326. Al Mahmud, A., Mubin, O., Shahid, S., & Martens, J. B. (2008). Designing and evaluation the tabletop game experience for senior citizens. In K. Tollmar & B. Jönsson (Eds.), Proceedings of the 5th Nordic conference on human-computer interaction (NordiCHI ‘08) (pp. 403–406).

Digital Technology for Older People   161 Almer, S., Kolbitsch, J., Oberzaucher, J., & Ebner, M. (2012). Assessment test framework for collecting and evaluating fall-related data using mobile devices. In Proceedings of 13th international conference on computers helping people with special needs (ICCHP ‘12) (pp. 83–90). Arning, K., & Ziefle, M. (2008). Development and validation of a computer expertise questionnaire for older adults. Behaviour and Information Technology, 27(4), 325–329. Arning, K., & Ziefle, M. (2009). Effects of age, cognitive and personal factors on PDA menu navigation performance. Behaviour and Information Technology, 28(3), 251–268. Astell, A., Alm, N., Dye, R., Gowans, G., Vaughan, P., & Ellis, M. (2014). Digital video games for older adults with cognitive impairment. In Proceedings of 14th international conference on computers helping people with special needs (ICCHP ‘14) (pp. 264–271). Augusto, J., Mulvenna, M., Zheng, H., Wang, H., Martin, S., McCullagh, P., & Wallace, J. (2014). Night optimised care technology for users needing assisted lifestyles. Behaviour and Information Technology, 33(12), 1261–1277. Aula, A. (2005). User study on older adults’ use of the Web and search engines. Universal Access in the Information Society, 4(1), 67–81. Ayoade, M., Uzor, S., & Baillie, L. (2013). The development and evaluation of an interactive system for age related musculoskeletal rehabilitation in the home. In P. Kotzé, G. Marsden, G. Lindgaard, J. Wesson, & M. Winckler (Eds.), Proceedings Human-computer interaction, Part IV (INTERACT 2013). Lecture Notes in Computer Science 8117 (pp. 1–18). Baecker, R., Sellen, K., Crosskey, S., Boscart, V., & Neves, B. (2014). Technology to reduce social isolation and loneliness. In Proceedings of the 15th international ACM SIGACCESS conference on computers and accessibility (ASSETS ‘14) (pp. 27–34). Bagalkot, N., Nassi, E., & Sokoler, T. (2010). Facilitating continuity: Exploring the role of dig­ ital technology in physical rehabilitation. In E. Hvannberg, M. K. Lárusdóttir, A. Blandford, & J. Gulliksen (Eds.), Proceedings of the 6th Nordic conference on human-computer interaction (NordiCHI ‘10) (pp. 42–51). Baharin, H., Rintel, S., & Viller, S. (2013). Rhythms of the domestic soundscape: Ethnomethodological soundwalks for phatic technology design. In P. Kotzé, G. Marsden, G. Lindgaard, J. Wesson, & M. Winckler (Eds.), Proceedings Human-computer interaction Part IV (INTERACT 2013). Lecture Notes in Computer Science 8117 (pp. 463–470). Baharin, H., Viller, S., & Rintel, S. (2015). SonicAIR: Supporting independent living with reciprocal ambient audio awareness. ACM Transactions on Computer-Human Interaction, 22(4), Art. 18. Ballegaard, S. A., Bunde-Pedersen, J., & Bardram, J. E. (2006). Where to, Roberta? Reflecting on the role of technology in assisted living. In A. Mørch, K. Morgan, T. Bratteteig, G. Ghosh, & D. Svanaes (Eds.), Proceedings of the 4th Nordic conference on human-computer interaction (NordiCHI ‘06) (pp. 373–376). Bauer, S. M., & Lane, J. P. (2006). Convergence of assistive devices and mainstream products: Keys to university participation in research, development and commercialization. Technology and Disability, 18(2), 67–77. Beach, S., Schulz, R., Downs, J., Matthews, J., Barron, B., & Seelman, K. (2009). Disability, age, and informational privacy attitudes in quality of life technology applications: Results from a national web survey. ACM Transactions on Accessible Computing, 2(1), Art. 5. Bechtold, U., & Sotoudeh, M. 
(2008). Participative approaches for “Technology and Autonomous Living”. In Proceedings of 11th international conference on computers helping people with special needs (ICCHP ‘08) (pp. 78–81).

162   Helen Petrie and Jenny S. Darzentas Beer, J. M., Smarr, C.-A., Fisk, A. D., & Rogers, W. A. (2015). Younger and older users’ recognition of virtual agent facial expressions. International Journal of Human-Computer Studies, 75(March), 1–20. Bentley, C. L. (2014). Addressing design and suitability barriers to telecare use: Has anything changed? Technology and Disability, 26(4), 221–235. Berkowsky, R. W., Cotton, S. R., Yost, E. A., & Winstead, V. P. (2013). Attitudes towards and limitations to ICT use in assisted and independent living communities: Findings from a specially-designed technological intervention. Educational Gerontology, 39(11), 797–811. Bertera, E. M . (2014). Storytelling slide shows to improve diabetes and high blood pressure knowledge and self-efficacy: Three-year results among community dwelling older African Americans. Educational Gerontology, 40(11), 785–800. Bertera, E. M., Bertera, R. L., Morgan, R., Wuertz, E., & Attey, A. M. O. (2007). Training older adults to access health information. Educational Gerontology, 33(6), 483–500. Bidwell, N. J., & Jay Siya, M. J. (2013). Situating asynchronous voice in rural Africa. In P. Kotzé, G. Marsden, G. Lindgaard, J. Wesson, & M. Winckler (Eds.), Proceedings Human-computer interaction, Part III (INTERACT 2013). Lecture Notes in Computer Science 8117 (pp. 36–53). Blanco-Gonzalo, R., Sanchez-Reillo, R., Martinez-Normand, L., Fernandez-Saavedra, B., & Liu-Jimenez, J. (2015). Accessible mobile biometrics for elderly. In Proceedings of the 17th international ACM SIGACCESS conference on computers and accessibility (ASSETS ‘15) (pp. 419–420). Bobillier Chaumon, M.  E., Michel, C., Tarpin Bernard, F., & Croisile, B. (2014). Can ICT improve the quality of life of elderly adults living in residential home care units? From actual impacts to hidden artefacts. Behaviour and Information Technology, 33(6), 574–590. Boechler, P. M., Foth, D., & Watchorn, R. (2007). Educational technology research with older adults: Adjustments in protocol, materials, and procedures. Educational Gerontology, 33(3), 221–235. Boström, M., Kjellström, S., & Björklund, A. (2013). Older persons have ambivalent feelings about the use of monitoring technologies. Technology and Disability, 25(2), 17–25. Boulos, M. N. K., Anastasiou, A., Bekiaris, E., & Panou, M. (2011). Geo-enabled technologies for independent living: Examples from four European projects. Technology and Disability, 23(1), 7–17. Boulton-Lewis, G. M., Buys, L., Lovie-Kitchin, J., Barnett, K., & David, N. L. (2007). Ageing, learning, and computer technology in Australia. Educational Gerontology, 33(3), 253–270. Brajnik, G., & Giachin, C. (2014). Using sketches and storyboards to assess impact of age difference in user experience. International Journal of Human-Computer Studies, 72(6), 552–566. Brandtzaeg, P. B., Heim, J., & Karahasanović, A. (2011). Understanding the new digital divide: A typology of Internet users in Europe. International Journal of Human-Computer Studies, 69(3), 123–138. Briggs, P., & Thomas, L. (2015). An inclusive, value sensitive design perspective on future identity technologies. ACM Transactions on Computer-Human Interaction, 22(5), Art. 23. Brown, P. S., & Hanks, R. S. (2008). Implementing an online writing assessment strategy for gerontology. Educational Gerontology, 34(5), 397–399. Bruder, C., Blessing, L., & Wandke, H. (2014). Adaptive training interfaces for less-experienced, elderly users of electronic devices. Behaviour and Information Technology, 33(1), 4–15. 
Cabreira, A. T., & Hwang, F. (2016). How do novice older users evaluate and perform mid-air gesture interaction for the first time? In S.  Björk, E.  Eriksson, M.  Fjeld, S.  Bødker,

Digital Technology for Older People   163 W. Barendregt, & M. Obaid (Eds.) Proceedings of the 9th Nordic conference on human-computer interaction (NordiCHI ‘16) (Art. 122). Cahill, S., Begley, E., Faulkner, J. P., & Hagen, I. (2007). “It gives me a sense of independence”— Findings from Ireland on the use and usefulness of assistive technology for people with dementia. Technology and Disability, 19(2–3), 133–142. Cahill, S., Macijauskiene, J., Nygård, A.-M., Faulkner, J.-P., & Hagen, I. (2007). Technology in dementia care. Technology and Disability, 19(2–3), 55–60. Caird, J.  K., Chisholm, S.  L., & Lockhart, J. (2007). Do in-vehicle advanced signs enhance older and younger driver’s intersection performance? Driving simulation and eye movement results. International Journal of Human-Computer Studies, 66(3), 132–144. Caird, J. K., Edwards, C. J., Creaser, J. I., & Horrey, W. J. (2005). Older driver failures of attention at intersections: Using change blindness methods to assess turn accuracy. Human Factors, 47(2), 235–249. Caprani, N., Doyle, J., Komaba, Y., & Inomata, A. (2015). Exploring healthcare professionals’ preferences for visualising sensory data. In Proceedings of the 2015 British human computer interaction conference (British HCI 2015) (pp. 26–34). Carey-Smith, B. E., Evans, N. M., & Orpwood, R. D. (2013). A user-centred design process to develop technology to improve sleep quality in residential care homes. Technology and Disability, 25(1), 49–58. Carmichael, A., Rice, M., MacMillan, F., & Kirk, A. (2010). Investigating a DTV-based physical activity application to facilitate wellbeing in older adults. In Proceedings of the 24th BCS interaction specialist group conference (BCS HCI ‘10) (pp. 278–288). Caroux, L., Consel, C., Dupuy, L., & Sauzeon, H. (2014). Verification of daily activities of older adults: A simple, non-intrusive, low-cost approach. In Proceedings of the 16th international ACM SIGACCESS conference on computers and accessibility (ASSETS ‘14) (pp. 43–50). Carrasco, E., Epelde, G., Moreno, A., Ortiz, A., Garcia, I., Buiza, C., . . . Arruti, A. (2008). Natural interaction between avatars and persons with Alzheimer’s disease. In Proceedings of 11th international conference on computers helping people with special needs (ICCHP ‘08) (pp. 38–45). Carrington, P., Kuber, R., Anthony, L., Hurst, A., & Prasad, S. (2012). Developing an interface to support procedural memory training using a participatory-based approach. In Proceedings of the 26th annual BCS interaction specialist group conference on people and computers (BCS-HCI ‘12) (pp. 333–338). Casas, R. (2008). User modelling in ambient intelligence for elderly and disabled people. In Proceedings of 11th international conference on computers helping people with special needs (ICCHP ‘08) (pp. 114–122). Casas, R., Marco, A. Falco, J. Artigas, J., & Abascal, J. (2006). Ethically aware design of a location system for people with dementia. In Proceedings of 10th international conference on computers helping people with special needs (ICCHP ‘06) (pp. 777–784). Castilla, D., Garcia-Palacios, A., Breton-Lopez, J., Miralles, I., Banos, R.  M., Etchemendy, E.,. . . Botella, C. (2013). Process of design and usability evaluation of a telepsychology web and virtual reality system for the elderly: Butler. International Journal of Human-Computer Studies, 71(3), 350–362. Cesta, A., Cortellessa, G., Giuliani, V., Pecora, F., Rasconi, R., Scopellitti, M., & Tiberio, L. (2007). Proactive assistive technology: An empirical study. In C. 
Baranauskas, P. Palanque, J. Abascal, & S. D. J. Barbosa. (Eds.), Proceedings Human-computer interaction (INTERACT 2007). Lecture Notes in Computer Science, 4662, (pp. 255–268).

164   Helen Petrie and Jenny S. Darzentas Chaffin, A. J., & Harlow, S. D. (2005). Cognitive learning applied to older adult learners and technology. Educational Gerontology, 31(4), 301–329. Chase, C. A. (2010). An intergenerational email pal project on attitudes of college students toward older adults. Educational Gerontology, 37(1), 27–37. Chen, J.-M., Chen, Y.-C., & Chen, Y.-C. (2012). The K-shape learning project for senior citizens. Educational Gerontology, 38(12), 841–853. Chen, S.-Y. (2008). Reading practices and profiles of older adults in Taiwan. Educational Gerontology, 34(5), 427–441. Chou, W. H., Lai, Y.-T., & Liu, K.-H. (2013). User requirements of social media for the elderly: A case study in Taiwan. Behaviour and Information Technology, 32(9), 920–937. Chu, C., Rebola, C.  B., & Kao, J. (2015). BUMP: Bridging unmet modes of participation. In Proceedings of the 2015 British human computer interaction conference (British HCI 2015) (pp. 261–262). Chung, J., Chaudhuri, S., Le, T., Chi, N.-C., Thompson, H. J., & Demiris, G. (2015). The use of think-aloud to evaluate a navigation structure for a multimedia health and wellness application for older adults and their caregivers. Educational Gerontology, 41(12), 916–929. Coelho, J., & Duarte, C. (2015). Socially networked or isolated? Differentiating older adults and the role of tablets and television. In J.  Abascal, S.  Barbosa, M.  Fetter, T.  Gross, P.  Palanque, & M.  Winckler (Eds.), Proceedings Human-computer interaction, Part I (INTERACT 2015). Lecture Notes in Computer Science, 9296, (pp. 129–146). Coelho, J., Rito, F., Luz, N., & Duarte, C. (2015). Prototyping TV and tablet Facebook interfaces for older adults. In J.  Abascal, S.  Barbosa, M.  Fetter, T.  Gross, P.  Palanque, & M.  Winckler (Eds.), Proceedings Human-computer interaction, Part 1 (INTERACT 2015). Lecture Notes in Computer Science, 9296 (pp. 110–128). Conci, M., Pianesi, F., & Zancanaro, M. (2009). Useful, social and enjoyable: Mobile phone adoption by older people. In T. Gross, J. Gulliksen, P. Kotzé, L. Oestreicher, P. Planaque, R.  Oliveira Prates & M.  Winckler (Eds.), Proceedings Human-computer interaction (INTERACT 2009). Lecture Notes in Computer Science 5726 (pp. 63–76). Convertino, G., Farooq, U., Rosson, M. B., Carroll, J. M., & Meyer, B. J. F. (2007). Supporting intergenerational groups in computer supported cooperative work (CSCW). Behaviour and Information Technology, 26(4), 275–285. Cornejo, R., Tentori, M., & Favela, J. (2013). Enriching in-person encounters through social media: A study on family connectedness for the elderly. International Journal of HumanComputer Studies, 71(9), 889–899. Cresci, M. K., Jarosz, P. A., & Templin, T. A. (2012). Are health answers online for older adults? Educational Gerontology, 38(1), 10–19. Cresci, M. K. & Novak, J. M. (2012). Information technologies as health management tools: Urban elders’ interest and ability in using the Internet. Educational Gerontology, 38(7), 491–506. Cresci, M. K., Yarandi, H. N., & Morell, R. W. (2010). Pro-nets versus no-nets: Differences in urban older adults’ predilections for Internet use. Educational Gerontology, 36(6), 500–520. Crete-Nishihata, M., Baecker, R.  M., Massimi, M., Ptak, D., Campigotto, R., Kaufman, L. D., . . . Black, S. E. (2012). Reconstructing the past: Personal memory technologies are not just personal and not just for memory. Human-Computer Interaction, 27(1–2), 92–123. Czaja, S.  J., Schulz, R., Perdomo, D., & Nair, S.  N. (2014). 
The feasibility and efficacy of technology-based support groups among family caregivers of persons with dementia. In Proceedings of 14th international conference on computers helping people with special needs (ICCHP ‘14) (pp. 455–458).

Digital Technology for Older People   165 Dahn, I., Ferdinand, P., & Lachmann, P. (2014). Supporting senior citizen using tablet computers. In Proceedings of 14th international conference on computers helping people with special needs (ICCHP ‘14) (pp. 323–330). Dale, O. (2010). Usability and usefulness of GPS based localization technology used in dementia care. In Proceedings of 12th international conference on computers helping people with special needs (ICCHP ‘10) (pp. 300–307). Darroch, I., Goodman, J., Brewster, S., & Gray, P. (2005). The effect of age and font size on reading text on handheld computers. In M. F. Costabile & F. Paternò (Eds.), Proceedings Human-computer interaction (INTERACT 2005), Lecture Notes in Computer Science (LNCS), 3585, (pp. 253–266). Davidse, R. J., Hagenzieker, M. P., van Wolffelaar, P. C., & Brouwer, W. H. (2009). Effects of in-car support on mental workload and driving performance of older drivers. Human Factors, 51(4), 463–476. de Beer, R., Keijers, R., Shahid, S., Al Mahmud, A., & Mubin, O. (2010). PMD: Designing a portable medicine dispenser for persons suffering from Alzheimer’s Disease. In Proceedings of 12th international conference on computers helping people with special needs (ICCHP ‘10) (pp. 332–335). DeCoster, V. A., & George, L. (2005). An empowerment approach for elders living with diabetes: A pilot study of a community-based self-help group—The Diabetes Club. Educational Gerontology, 31(9), 699–733. Diaz-Orueta, U., Etxaniz, A., Gonzalez, M. F. Buiza, C., Urdaneta, E., Yanguas, J., Carrasco, E., & Epelde, G. (2014). Role of cognitive and functional performance in the interactions between elderly people with cognitive decline and an avatar on TV. Universal Access in the Information Society, 13(1), 89–97. Dickinson, A., Arnott, J., & Prior, S. (2007). Methods for human–computer interaction research with older people. Behaviour and Information Technology, 26(4), 343–352. Dickinson, A., Eisma, R., Gregor, P., Syme, A., & Mile, S. (2005). Strategies for teaching older people to use the World Wide Web. Universal Access in the Information Society, 4(1), 3–15. Dickinson, A., & Gregor, P. (2006). Computer use has no demonstrated impact on the wellbeing of older adults. International Journal of Human-Computer Studies, 64(8), 744–753. Dickinson, A., & Hill, R. L. (2007). Keeping in touch: Talking to older people about computers and communication. Educational Gerontology, 33(8), 613–630. Dickinson, A., Smith, M., Arnott, J., Newell, A., & Hill, R. (2007). Approaches to web search and navigation for older computer novices. In Proceedings of the SIGCHI conference on human factors in computing systems (CHI 2006) (pp. 281–290). Diño, M.  J.  S., & de Guzman, A.  B. (2015). Using partial least squares (PLS) in predicting behavioural intention for telehealth use among Filipino elderly. Educational Gerontology, 41(1), 53–68. Dobosz, K., Dobosz, M., Fiolka, T., Wojaczek, M., & Depta, T. (2014). Tablets in the rehabilitation of memory impairment. In Proceedings of 14th international conference on computers helping people with special needs (ICCHP ‘14) (pp. 399–402). Dogruel, L., Joeckel, S., & Bowman, N. D. (2015). The user and acceptance of new media entertainment technology by elderly users: Development of an expanded technology acceptance model. Behaviour and Information Technology, 34(11), 1052–1063. Donorflo, L. K. M., & Healy, C. (2008). Teaching an interactive television course on adulthood and aging: Making it happen. 
Educational Gerontology, 34(6), 531–549.

166   Helen Petrie and Jenny S. Darzentas Dorin, M. (2007). Online education of older adults and its relation to life satisfaction. Educational Gerontology, 33(2), 127–143. Dowds, G., & Masthoff, J. (2015). A virtual “Window to the Outside World”: Initial design and plans for evaluation. In Proceedings of the 2015 British human computer interaction conference (British HCI 2015) (pp. 265–266). Doyle, J., Bailey, C., Ni Scanaill, C., & van den Berg, F. (2014). Lessons learned in deploying independent living technologies to older adults’ homes. Universal Access in the Information Society, 13(2), 191–204. Doyle, J., O’Mullane, B., McGee, S., & Knapp, R. B. (2012). YourWellness: Designing an application to support positive emotional wellbeing in older adults. In Proceedings of the 26th annual BCS interaction specialist group conference on people and computers (BCS-HCI ‘12) (pp. 221–226). Doyle, J., Sassu, A., & McDonagh, T. (2013). “We were all the same age once”: Experiences of intergenerational app. design. In Proceedings of the 27th international BCS human computer interaction conference (BCS-HCI ‘13) (Art. 24). Doyle, J., Skrba, Z., McDonnell, R., & Arent, B. (2010). Designing a touch screen communication device to support social interaction amongst older adults. In Proceedings of the 24th BCS interaction specialist group conference (BCS HCI ‘10) (pp. 177–185). Duff, P. & Dolphin, C. (2007). Cost-benefit analysis of assistive technology to support independence for people with dementia–Part 1: Development of a methodological approach to the ENABLE cost-benefit analysis. Technology and Disability, 19(2–3), 73–78. Duff, P. & Dolphin, C. (2007). Cost-benefit analysis of assistive technology to support independence for people with dementia–Part 2: Results from employing the ENABLE costbenefit model in practice. Technology and Disability, 19(2–3), 79–90. Eaton, J., & Salari, S. (2005). Environments for lifelong learning in senior centers. Educational Gerontology, 31(6), 461–480. Edlund, C., & Björklund, A. (2011). Family caregivers’ conceptions of usage of and information on products, technology and Web-based services. Technology and Disability, 23(4), 205–214. Ejupi, A., Gschwind, Y.  J., Valenzuela, T., Lord, S.  R., & Delbaere, K. (2016). A Kinect and inertial sensor-based system for the self-assessment of fall risk: A home-based study in older people. Human-Computer Interaction, 31(3–4), 261–293. Elias, S. M., Smith, W. L., & Barney, C. E. (2012). Age as a moderator of attitude towards technology in the workplace: Work motivation and overall job satisfaction. Behaviour and Information Technology, 31(5), 453–467. Elton, E., & Nicolle, C. (2013). Designing inclusive products for everyday environments: The effects of everyday cold temperatures on older adults’ dexterity. Universal Access in the Information Society, 12(3), 247–261. Ezer, N., Fisk, A. D., & Rogers, W. (2008). Age-related differences in reliance behaviour attributable to costs within a human-decision aid system. Human Factors, 50(6), 853–863. Fan, C., Forlizzi, J., & Dey, A. (2012). Considerations for technology that support physical activity by older adults. In Proceedings of the 14th international ACM SIGACCESS conference on computers and accessibility (ASSETS ‘12) (pp. 33–40). Fang, Y.-M. & Chang, C.-C. (2016). Users’ psychological perception and perceived readability of wearable devices for elderly people. Behaviour and Information Technology, 35(3), 225–232. Farina, K., & Nitche, M. (2015). 
Outside the brick: Exploring prototyping for the elderly. In Proceedings of the 2015 British Human Computer Interaction Conference (British HCI 2015) (pp. 11–17).

Privacy preserving automatic fall detection for elderly using RGBD Cameras. In Proceedings of 13th international conference on computers helping people with special needs (ICCHP ‘12) (pp. 625–633).

Digital Technology for Older People   185 Zheng, R., Spears, J., Luptak, M., & Wilby, F. (2015). Understanding older adults’ perceptions of Internet use: An exploratory factor analysis. Educational Gerontology, 41(7), 504–518. Zhou, J., Rau, P.-L.  P., & Salvendy, G. (2014). Age-related difference in the use of mobile phones. Universal Access in the Information Society, 13(4), 401–413. Zhou, J., Rau, P.-L. P., & Salvendy, G. (2014). Older adults’ use of smart phones: An investigation of the factors influencing the acceptance of new functions. Behaviour and Information Technology, 33(6), 552–560. Ziefle, M., & Bay, S. (2005). How older adults meet complexity: Aging effects on the usability of different mobile phones. Behaviour and Information Technology, 24(5), 375–389. Ziefle, M., & Bay, S. (2006). How to overcome disorientation in mobile phone menus: A comparison of two different types of navigation aids. Human-Computer Interaction, 21, 393–433. Ziefle, M., Pappachan, P., Jakobs, E. M., & Wallentowitz, H. (2008). Visual and auditory interfaces of advanced driver assistant systems for older drivers. In Proceedings of 11th international conference on computers helping people with special needs (ICCHP ‘08) (pp. 62–69). Ziefle, M., Schroeder, U., Strenk, J., & Michel, T. (2007). How younger and older adults master the usage of hyperlinks in small screen devices. In Proceedings of the SIGCHI conference on human factors in computing systems (CHI 2006) (pp. 307–316).

Chapter 7

A Digital Nexus: Sustainable HCI and Domestic Resource Consumption

Nicola Green, Rob Comber, and Sharron Kuznesof

Introduction: Digital Systems and Natural Resources

At the beginning of the twenty-first century, human beings are facing a range of global and globalizing shifts in social organization that bring technologies, societies, and cultures into complex tension with each other. Nowhere is this more evident than in the extensive social change wrought by the development of digital technologies. In the second decade of the twenty-first century, we have also seen the emergence of what is now commonly referred to as “ubiquitous computing,” or the “Internet of Things,” where alongside various forms of automation, “smart” technologies have been deployed at different scales, whether at the level of the urban land/city-scape, the workplace, or the dwelling (as several chapters in this Handbook emphasize). While the technological imaginaries of the late 1990s had already identified the possibilities associated with, for example, “smart homes” (Harper, 2003), developments since have seen urban environments becoming ever more deeply imbricated with the material infrastructures of the digital, and the data-mediated social and cultural relations those infrastructures support. Here, then, urban domestic dwelling spaces and places have received increasing attention as crucial units of analysis for understanding potential technologically mediated futures and the ways they might shape our ways of being in a digital age.

At the same time, we have seen challenges that pose fundamental questions about the continued prosperity (or security, or in some cases even survival) of humankind and our technological, economic, political, social, and cultural futures. Not least of these challenges are those posed by climate change and a related and

ongoing large-scale depletion or waste of increasingly scarce natural resources. These challenges are global (indeed planetary) in nature, and are expressed over multiple scales of human social life—from the practices and relations of the everyday, to the organization of communities, cities, and regions, to large-scale technical, economic, and political systems at national and international levels. These are, to put it mildly, considerable environmental challenges for global humanity to be facing.

Where Digital Development Meets Environmental Crisis

Given these two widespread and significant sets of rapid social change, this chapter considers the ways that humanity’s digital and environmental futures are becoming intertwined, and how each domain is implicated in shaping the economic, cultural, and politically mediated futures of the other: that is, a complex digital nexus.

Whilst the natural resources that human beings exploit are innumerable, the most fundamental are those crucial for the sustenance of human life itself: the availability of, and access to, Water, Energy, and Food (WEF). The ways that human beings exploit (or use, or consume, or transform) these crucial WEF resources vary considerably across globally interconnected societies, depending on factors such as historical systems of economic development, the deployment of technical systems at different scales, the politics of colonization and globalization, and the social and cultural values and norms that define the relationships between “natures” and “cultures.”

At the beginning of the twenty-first century, however, further factors are increasingly coming into play. Two of the most important of these are population growth and urbanization. On the one hand, pressures on the planetary WEF resource base are increasing simply as a result of global population growth (Maheshwari et al., 2014). The United Nations Department of Economic and Social Affairs (UNDESA) currently estimates the global population at around 7.6 billion, with a projection of 8.6 billion by 2030, and “roughly 83 million being added to the world’s population every year” (UNDESA, 2017a). Of those 7.6 billion, 665 million have no access to clean drinking water, 795 million are undernourished or malnourished, 1.4 billion have no access to electricity, and 2.4 billion have no access to basic sanitation services (UNDESA, 2017b). While the implications of such growth for the environment have been apparent for quite some time, studies in recent years have highlighted population growth as an increasingly pressing challenge for current global economic and political organizations of resource capture, distribution, and use.

On the other hand, populations are not only growing; they are also progressively urbanizing. The United Nations (2018) estimates that 55% of the world’s population currently lives in urban areas, and projects that this figure will reach 68% by 2050. This means that cities, and the organization of them, are becoming an ever-more critical locus of resource deployment and consumption—and are therefore vital sites of

evaluation for those processes (Colucci et al., 2017). The dwelling places of contemporary cities—and their progressive digitization—have therefore become an important focus for the investigation of contemporary resource consumption, and crucial spaces to explore in depth in any consideration of the digitally supported sustainability of resource use. The focus on dwelling places also draws attention to the role households play in processes of consumption more broadly (Burgess et al., 2003).

A Nexus of Relationships

Alongside the critical role of growing urban concentrations in relation to population growth, attention has also been drawn to the relationships between the core WEF resources themselves in a nexus of interdependencies (Abdul Salam et al., 2017; Bhaduri et al., 2015). This nexus thinking has been largely framed by the notion of a WEF nexus (or multiple nexuses) of resources—the ways in which shifts or perturbations in the availability, processing, distribution, or consumption of one WEF resource system can pass a “tipping point” and produce “ripple effects” in the sociotechnical systems organizing the exploitation, deployment, and consumption of the other resources (Beddington, 2009; McGrane et al., 2018; Smajgl et al., 2016), with widespread social implications. The focus of this chapter is therefore to review environmental social science and human-computer interaction (HCI) research on the ways in which digital systems can potentially intervene in urban domestic spaces to investigate, analyse, and understand household WEF resource consumption, and to design digital infrastructures that mitigate unsustainable consumption.
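The idea of nexus interdependencies can be made more concrete with a deliberately simple, first-order sketch of “ripple effects”: a change in demand for one household resource induces changes in the others through assumed coupling coefficients. Everything in the fragment below—resource totals, coefficients, and the size of the perturbation—is an illustrative assumption for exposition, not a figure or a model drawn from the studies cited above.

```python
# Toy, first-order illustration of WEF "ripple effects"; every number is an assumption.
baseline = {"water": 350.0, "energy": 10.0, "food": 2.0}  # litres, kWh, kg per household-day

# coupling[a][b]: extra demand for resource b induced per unit increase in demand for a.
coupling = {
    "water":  {"energy": 0.02, "food": 0.0},    # assume pumping/heating 1 litre ~ 0.02 kWh
    "energy": {"water": 1.5,   "food": 0.0},    # assume generating 1 kWh ~ 1.5 litres of water
    "food":   {"water": 50.0,  "energy": 1.0},  # assume 1 kg of food ~ 50 litres and 1 kWh
}

def ripple(perturbed: str, delta: float) -> dict:
    """Return the first-order induced demand changes in the other resources."""
    return {other: factor * delta for other, factor in coupling[perturbed].items()}

# A 0.5 kg/day rise in food demand induces extra water and energy demand elsewhere:
direct = {"food": 0.5}
induced = ripple("food", 0.5)                    # {'water': 25.0, 'energy': 0.5}
new_totals = {r: baseline[r] + direct.get(r, 0.0) + induced.get(r, 0.0) for r in baseline}
print(induced)
print(new_totals)                                # {'water': 375.0, 'energy': 10.5, 'food': 2.5}
```

A fuller treatment would of course make the couplings bidirectional and dynamic; the point here is only that perturbations propagate across resources rather than staying contained within one.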

Chapter Overview

Towards this more general goal, the chapter outlines social research-based responses to the environmental challenges presented by population growth, urbanization and climate change, and their effects on the consumption of related WEF resources. We begin our review of research in The Development of Sustainable HCI by introducing the concept and (inter)disciplines of “Sustainable HCI” (Human-Computer Interaction). Here we highlight the interplay of social research and digital design disciplines, and the goals articulated by strands of Sustainable HCI research towards resource minimization in sustainable digital design, and persuasion towards sustainability in consumption via digital systems. The chapter then goes on to consider the various strands of theory that have informed the conceptual development of Sustainable HCI over time, how those various theoretical frameworks have been employed in empirical investigations of WEF resource use, and their implications for digital systems in general and sustainable HCI in particular.

Accordingly, the section Investigating Physical Resource Use addresses those studies that have attempted to represent the human activities that consume WEF resources across different social domains, and the systems, organizations, and interactions involved—from large-scale systems of resource distribution and consumption, to a range of micro-landscapes of resource use (including studies employing digital technologies to such ends). In the following section, Investigating Rational Choice and Behaviour Change, we then review those studies that have used theories of behaviour and cognition to analyse what people do with physical resources in the process of consumption, and under what conditions. A particular concern here is to examine those research designs that are concerned with the possibilities for behavioral influence and social change towards more sustainable resource relationships. We then turn to a review of those projects that have attempted to operationalize concepts such as Attitudes, Values and Lifestyles in the pursuit of understanding the social aspects of resource consumption that can then be used to guide the design, development, and implementation of digital systems towards various dimensions of sustainability. In the final section, Investigating Practices and Networks, we go on to consider how theories of practices and networks have been deployed at different scales both to understand the dynamics of resource use and to identify potential levers for “doing things differently” with and through digital systems. Table 7.1 summarizes the approaches, key concepts, methodologies, and emphases in studies associated with each section.

The Development of Sustainable HCI

One of the main contemporary developments in the investigation of the intersections of digital systems and resource use is the development of approaches in “Sustainable HCI.” Human-Computer Interaction (HCI) is itself a broadly interdisciplinary field, focusing as it does on the interaction of humans with computing objects and environments, and encompassing (inter)disciplinary perspectives from computer sciences and engineering, alongside interdisciplinary social sciences and design disciplines. In “Sustainable HCI” (sometimes “Environmental HCI”) (Hee-Jeong Choi & Blevis, 2010; DiSalvo et al., 2010), understanding humans’ interaction with digital technologies—their objects and infrastructures—and the development of novel digital technologies have been brought together with a concern to address current and future environmental challenges. HCI approaches to human-digital relations are focussed at the “interface” between computing systems and their human “users,” and involve both understanding the relationship between them (given the contexts of their interactions) and planning design interventions for the development of alternative or improved digital systems. Based on critical design studies, Blevis (2007, p. 503) argues that an important dimension of Sustainable HCI is the effort by technology designers to build sustainability into both material and data-based computing products—including in their “invention, disposal, renewal and reuse.” Mankoff et al. (2007) characterize this position as advocating

Table 7.1  Summary of Approaches, Key Concepts, Methodologies, and Studies in Each Section

Introduction
  Approaches: Climate change; population growth; urbanization
  Key concepts: Defining a resource nexus; tipping points and ripple effects
  Studies: Household-focussed

Sustainable HCI
  Approaches: Human–computer interaction
  Key concepts: Sustainability in and through design; pervasiveness and persuasion; sustainable interaction design; revisioning consumption; citizen sensing
  Methodologies: Measurement; intervention
  Studies: Digital feedback systems; displays; applications; interventions

Physical resource use
  Approaches: Infrastructure measurement
  Key concepts: Empirical studies of physical resources
  Methodologies: Provider-supplied aggregate system statistics; scaled real-time and phase-time measurement of WEF infrastructures and appliances; surveys; interviews
  Studies: Digital measurement systems; interventions

Rational choice and behaviour change
  Approaches: Behavioral psychology; cognitive and social psychology
  Key concepts: Individual action; cognition; agency
  Methodologies (including online): Surveys; structured and semi-structured interviews; observation
  Studies (including online): Individually focussed; single-resource focussed; aggregate digital measurement of resource use; aggregate survey responses; individual qualitative responses

Attitudes, values, and lifestyles
  Approaches: Cognitive and social psychology; environmental sociology; cultural sociology; phenomenology
  Key concepts: Norms; values; attitudes; knowledge; structure-agency
  Methodologies: Surveys; structured and semi-structured interviews; focus groups; observation; visual methods
  Studies: Individual and group-focussed; aggregate survey measurement of norms, values, and attitudes; individually narrated lifestyles and consumption patterns

Practices and networks
  Approaches: Environmental, political, and cultural sociology; socio-technical networks; actor-network theory
  Key concepts: Habits and routines; material infrastructures; knowledge-meaning-action competencies; human and non-human (digital) agencies
  Methodologies (including online): Observation; ethnography; semi-structured interviews; visual, virtual, and sensory methods; participatory methods
  Studies: Practice-focussed (doing) across social scales; nexus of systems, things, thinking, doing, and meaning; negotiation of social complexity

Revisiting sustainable HCI
  Approaches: Extensively interdisciplinary approaches (computing-based; design-based; social science-based)
  Key concepts: Digital information transformation; influence
  Methodologies (including online): Participatory methods; design-based methods; interventions; ethnography; semi-structured interviews; visual, virtual, and sensory methods
  Studies (including online): Digital feedback systems; displays; applications; social media; interventions; workshops; games; negotiation of digital-environmental-social complexity across scales

“sustainability in design” (reducing the resource intensity of computing systems), to which they add an orientation towards interaction in “sustainability through design”—designing digital media technologies to influence broad socio-cultural trends towards sustainability. According to DiSalvo et al., these two broad-based characterizations of the field of Sustainable HCI have since produced a proliferation of HCI sub-specialties in both pervasive (in design) and persuasive (through design) computing systems oriented towards sustainability. Woodruff and Mankoff (2009) summarize the combination of these approaches as the “core challenges” of Sustainable HCI, “including monitoring the state of the physical world; managing the direct and indirect impacts of large-scale human enterprises such as agriculture, transport, and manufacturing; and informing individuals’ personal choices in consumption and behavior” (DiSalvo et al., 2010, p. 1976). Drawing on Goodman’s (2009) characterization of Sustainable HCI into three broad discursive and empirical clusters of environment-digital understandings—including “sustainable interaction design,” “revisioning consumption,” and “citizen sensing”—DiSalvo et al. (2010) further extend an evaluation of the relevant research literature to identify multiple sub-specialties in Sustainable HCI that both relate to, and challenge, each other—and in doing so they provide an excellent critical map of this extensive field.

In the following sections, then, the chapter reviews the empirical research in HCI and environmental social sciences that has variously addressed questions relating to human interaction with both digital technologies and resources in the consumption spaces of urban domestic dwellings. From mapping physical systems and infrastructures via digital means, to considering the factors that affect human interaction with and transformation or consumption of both natural resources and digital systems in the home, the chapter aims to summarize the contemporary state of play in relevant research, and to indicate directions for development in sustainable digital design.

Investigating Physical Resource Use

There is no doubt that environmental systems are broadly grounded in long-term processes of industrialization (now extended to digitization), and systems of market capitalism. These technical developments and economic shifts inevitably altered human relationships with their environments—in the case of resources, the technically and economically mediated means through which people access, use, exploit or consume the resources of the natural world. On the one hand, this promotes a logic of technocratic rationality, placing faith in technology to guarantee the wide-ranging and efficient exploitation of resources. At the same time, emerging digital technologies offer the potential to “map,” to a fine-grained level, the systems through which resources are distributed, circulated, and consumed, and therefore contribute to understandings of (and potential interventions in) un/sustainable socio-cultural practices.

In the investigation of WEF relationships, natural science and engineering disciplines tend to focus on the quantitative measurement of extant and available physical resources (and the modelling and visualization of available macro-scale national and international data pertaining to such via digital systems). Such data may then be used by social research that focuses on the human exploitation, organization, distribution, and governance of such resources via economic and technical (especially urban) infrastructures that make the harvest, supply, delivery, and circulation of resources possible (Kalbar et al., 2016, 2018; McGrane et al., 2018).

Relevant empirical examples may be derived from a number of sources,1 such as utility companies’ annual supply and distribution reporting, analyzed with respect to official population demographics (such as those from the Office for National Statistics) derived via digitally generated data. In the case of water, for example, Ofwat (the independent UK Water Services Regulation Authority) requires water utility companies to measure (via aggregate—both analogue and digital—water flow distribution meters) and report the distribution of water, in cubic metres, to households and non-households (business and industry), as well as accounting for waste (leakages) and operational usage (Ofwat, 2018). Similar national-level statistics, often generated via the deployment of digital metering systems, are available via various government departments and agencies, such as (in the UK) the Department for Business, Energy and Industrial Strategy (DBEIS; for example, see their Energy Use in the UK, 2018a; Digest of UK Energy Statistics, 2018b) reporting on energy use—households comprise 28% of total UK energy consumption (DBEIS, 2018a)—or various aggregate-level food statistics from the Department for Environment, Food and Rural Affairs (DEFRA), the Environment Agency, and the Food Standards Agency (FSA).

Digital systems, of course, support the collection, processing, and analysis of such data. But whereas the foregoing research attempts to map physical resource consumption at the level of entire populations, digital developments situated in HCI often instead turn to the innovation of digital systems to map such relationships at a more fine-grained scale. Drawing on approaches where the focus is “sustainability in design,” a prominent approach is the design of urban household sensor-based information systems that are capable of collecting and managing data on local natural resource use. In the domestic sphere, the ecosystems comprise the spaces of the built environments that are inhabited, and the distribution and circulation of natural resources within (and beyond) them. The intent in such digital systems, then, is the quantitative measurement of resource use or consumption within dwellings in order to understand how, where, and when resource consumption is taking place. Such measurements can be aggregated (for example, measuring total energy use within the dwelling via “smart” [digitally based] metering) and/or disaggregated (such as sensor systems measuring energy or water use down to the appliance level, or for different daily or seasonal periods, or even in comparison to comparable neighborhoods or areas).
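As a concrete, purely illustrative sketch of this aggregated/disaggregated distinction, the Python fragment below takes hypothetical appliance-level readings and reports both a household total (as a whole-dwelling smart meter would) and a per-appliance, per-period breakdown. The appliances, values, and period boundaries are assumptions for exposition, not data or code from any system cited here.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical appliance-level energy readings: (timestamp, appliance, kWh used).
readings = [
    (datetime(2019, 3, 1, 7, 30), "kettle", 0.12),
    (datetime(2019, 3, 1, 8, 0), "washing_machine", 0.90),
    (datetime(2019, 3, 1, 18, 45), "oven", 1.40),
    (datetime(2019, 3, 1, 19, 10), "kettle", 0.11),
]

# Aggregated measurement: the household total a whole-dwelling smart meter would report.
total_kwh = sum(kwh for _, _, kwh in readings)

# Disaggregated measurement: consumption broken down by appliance and by period of day.
by_appliance = defaultdict(float)
by_period = defaultdict(float)
for ts, appliance, kwh in readings:
    by_appliance[appliance] += kwh
    by_period["morning" if ts.hour < 12 else "evening"] += kwh

print(f"Household total: {total_kwh:.2f} kWh")
print("By appliance:", dict(by_appliance))
print("By period:", dict(by_period))
```

The design choice the literature debates is essentially which of these views (and at what temporal resolution) is surfaced back to the household, and to what end.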
The measurement of domestic energy consumption (typically including electricity and gas use) has been one of the most extensively researched areas of digital technology design, deployment, and evaluation within the domestic sphere.

Examples in product and critical design could include advances in the design of sensor-based measurement systems to accurately measure consumption, such as Wood and Newborough’s (2003) development of indicators for (disaggregated) appliance consumption of energy. They could also include the development of interactive visualizations to represent such consumption (Costanza et al., 2012). Other salient examples might include designs for water measurement, such as Srinivasan et al.’s (2011) motion-sensor-based disaggregated water flow measurement system, or Arroyo et al.’s (2005) development of the “Waterbot,” measuring water flow, use, and waste at the interface of the sink. Waterbot not only measured water flow but also incorporated visualization and display technologies (located at the sink) intended to feed back information on resource consumption to its human inter-actors (see also Kuznetsov & Paulos, 2010). Pierce, Odom and Blevis (2008) provide a useful critical overview of interaction design for eco-visualization in general, and Casado-Mansilla et al. (2016, p. 1695) extend such a discussion into “eco-aware systems within everyday things” to provide user feedback toward awareness and understanding.

Such feedback systems are largely informational: they assume a rational, decision-making human subject capable of interpreting the information presented and of acting on that interpretation. However, such digital systems (and the design of them) are also sometimes significantly oriented towards persuasion and “behaviour change”: as such, the design, deployment, and use of persuasive digital systems is much debated in the Sustainable HCI literatures. In the following section we therefore consider those studies that have deployed theories of cognition and behaviour not only to analyze what people do with the physical WEF resources available to them in their domestic environments (and beyond), but also to ask whether innovative digital systems might be designed and developed to persuade consumers of the need for (or desirability of) change, or to influence behaviour towards more sustainable practices of resource use within the domestic sphere.
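The logic of such point-of-use eco-feedback can be illustrated with a minimal sketch in the spirit of (though not taken from) systems such as Waterbot: a reading for the current use event is compared with the household’s typical use and mapped to the coarse qualitative signal a sink-mounted display might show. The thresholds are assumptions chosen purely for illustration.

```python
def feedback_level(litres_this_use: float, typical_litres: float) -> str:
    """Map current water use against a typical value to a coarse eco-feedback signal."""
    if typical_litres <= 0:
        return "no baseline yet"
    ratio = litres_this_use / typical_litres
    if ratio <= 0.8:
        return "green: below your usual use"
    if ratio <= 1.2:
        return "amber: about your usual use"
    return "red: well above your usual use"

# Example: 14 litres used in this rinsing event against a 10-litre household norm.
print(feedback_level(14.0, 10.0))  # -> "red: well above your usual use"
```

Even in a sketch this small, the normative assumptions the chapter goes on to discuss are visible: someone has to decide what counts as a “usual” or “excessive” use before the display can say anything at all.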

Investigating Rational Choice and Behavior Change

The physical resources and distribution systems outlined earlier that are “mapped” at the macro-scale tend to be grounded in a technocratic rationality derived from the process of industrialization. Simultaneously, however, environmental relations are also embedded in global systems of market capitalism, and the assumptions that underpin them—such as the ever-expanding growth of markets (for thorough analysis and critique, see Callon, 1998; Granovetter, 1985; Polanyi, 1944/1957). Such macro-level economic analyses have, however, received extensive criticism for their generally reductionist (and sometimes determinist) tendencies. Given the limitations of such approaches for understanding, for example, everyday lived experiences and interactions, or collective

or organizational processes, others have turned their focus to other scales of human life—such as individual human beings and their behaviours in social context.2

There are a number of accounts of nature-people relationships that have derived their approaches from the discipline of social psychology in general, and environmental social psychology more specifically. The focus of these approaches is to understand and explain the ways that individuals encounter and experience WEF resources in their everyday lives, and, most importantly, what they do with them. Here the focus shifts to the position of individuals as sovereign agents within such structures as economic decision makers, exercising rational choice and acting in self-interest with respect to both the production and consumption of goods and services within markets (Barr et al., 2011b). With respect to the uses of WEF resources, then, individuals are positioned predominantly as consumers, and the aggregate outcomes of individuals’ rational judgements in consumption are assumed to maximize the efficiencies of resource supply and demand, and to guarantee rational resource distribution.

Alongside the more micro-scale investigations of digital systems mapping resource use in domestic spaces, then, there is a range of projects that attempt to outline individual (or household) environmentally focussed behaviour, especially where consumption sits at the intersection between economic decision making or behaviour and those activities concerned with resource use (Advani et al., 2013; Chitnis et al., 2013, 2014; Druckman et al., 2011; Ofwat, 2011; Oikonomou et al., 2009). If the concept of rational action positions the individual as a pre-eminent economic social actor, further approaches in environmental psychology extend the conceptualization of human action from the exclusively economic to other social realms. The focus here is again on what individuals do, rather than who they are or what they experience. Other projects therefore focus on more extensive behavioral dimensions of resource use itself.

In these types of quantitative studies, digital and online research tools become particularly important, demonstrating the increasing importance of Internet-based digital systems for the investigation of household water-energy consumption. For example, the Energy Saving Trust’s (2013) studies characterizing the self-reported water (and energy) behaviours of 86,000 households across the UK are based on its online water-energy calculator (see also Energy Saving Trust, 2014; Kenway et al., 2011). Similar larger-scale surveys in water analyze self-reported behaviours amongst smaller and regional population samples—see, for example, Pullinger et al. (2013) for an analysis of behavioral water data derived from 997 questionnaire respondents using computer-assisted personal interviews. In energy, Palmer and Terry’s (2014) Powering the Nation is based on the Household Electricity Survey, consisting of self-reported and retrospective behaviours of a sample of 250 UK households (another example of methodology using computer-assisted personal interviews), alongside extensive digital energy consumption monitoring down to the appliance level. There are now also a significant number of larger-scale surveys of food behaviour.
The UK Food and You survey, for example, provides a broad-based snapshot of the United Kingdom’s food provisioning, preserving, cooking, eating, and waste behaviors, based on a self-reported, representative questionnaire of 3,453 respondents

(Food Standards Agency, 2014a, 2014b). Similarly, the Waste and Resources Action Programme (WRAP) provides aggregate statistics on food waste, including that at the household level (WRAP, 2017).

It is within this broadly environmental social science research milieu that Sustainable HCI is located, and where potential digital interventions towards sustainable behaviour might take place. According to the APA (2018), “behaviours” are defined as “an organism’s activities in response to external or internal stimuli, including objectively observable activities, introspectively observable activities, . . . and nonconscious processes.” In the case of WEF resource use, attention is therefore paid to those behaviours directly related to environmental (causes and) effects. According to Stern (2000, p. 408), this “environmentally significant behaviour” is that which can

. . . reasonably be defined by its impact: the extent to which it changes the availability of materials or energy from the environment or alters the structure and dynamics of ecosystems or the biosphere itself . . . Some behavior, such as clearing forest or disposing of household waste, directly or proximally causes environmental change . . . Other behavior is environmentally significant indirectly, by shaping the context in which choices are made that directly cause environmental change . . . For example, behaviors that affect international development policies, commodity prices on world markets, and national environmental and tax policies can have greater environmental impact indirectly than behaviors that directly change the environment.

Stern goes on to note that the environmental impacts listed have historically been by-products of activities aimed at the fulfilment of “human desires” (and the creation of technologies and organizations to achieve them) rather than the straightforward sustenance of human life itself (Stern, 2000, p. 408; emphasis added). By conceptualizing environmentally significant behaviours as the product of human “desires” rather than of “needs,” the approach opens discursive space to invoke the possibility of change in human behaviour with respect to the environment via social interventions, including digital ones. The interventions proposed therefore tend towards persuasion of one kind or another. If rational individuals can be convinced that maximizing their self-interests can be aligned with a collective interest in maximizing resource use without damaging the sources of those resources (which will also therefore maximize future self-interests), then changes to individual behaviour will “naturally” result as a consequence of rational cognitive processes.3

Indeed, it is the design, development, and evaluation of digital feedback systems toward persuasion—whether aggregated or disaggregated—that are most extensively found in the Sustainable HCI literature. That is, such systems not only measure resource use to understand its dynamics, but also represent that use and convey that information to users in order to persuade them towards resource conservation—“sustainability through design” in “eco-feedback” systems (Froehlich et al., 2010). Studies explicitly focussed on the creation of interventions for behaviour change tend to be directed towards the interactional design of their feedback components in terms of

communication, clarity, and goals, and the evaluation of the effectiveness of such systems with respect to both human interactions with digital technologies and concomitant impacts in the form of observable changes in users’ behaviours towards sustainability (Fischer, 2008; Vassileva et al., 2013). As is the case with studies of resource consumption in the social sciences more generally, however, examples of intervention and persuasion here are often focussed on a single resource—most commonly either energy (Bang et al., 2007; Bonino et al., 2012; Froehlich, 2009; Gamberini et al., 2012; Oliveira et al., 2016; Riche et al., 2010; see Hazas et al., 2011 for a useful review) or water (Erikson et al., 2012; Froehlich et al., 2012; Kappel & Grechenig, 2009; Liu et al., 2015). Novel approaches to domestic practices concerned with digital interaction design in food provisioning, preservation, preparation, and food waste have only more recently emerged in the Sustainable HCI research base. Notable examples include understanding food consumption lifecycles using wearable cameras (Ng et al., 2015), or “the pervasive fridge,” a fridge-based digital system mitigating against food waste (Rouillard, 2012; see also Farr-Wharton et al., 2014a, 2014b on the use of “fridge-cams” for similar purposes).

Murtagh et al. (2014) remark that a range of studies have tended to confirm the view that resource usage feedback technologies—for example, In-Home Displays (IHDs)—can make some (at least marginal) difference in resource demand reduction. They point out, however, that there is considerable heterogeneity amongst individuals and households, and their own project on energy use indicated that whilst located feedback appears to be of immediate utility in persuasion towards changes in behaviour, its effectiveness diminishes over time and is largely secondary to attempts at conservation or persuasion “situated in wider social and physical contexts” (Murtagh et al., 2014, p. 1). Others tend to concur (Burchell et al., 2016; Boucher et al., 2012; Foster et al., 2010; Tirado Herrero et al., 2018), with some claiming that intentional HCI interventions towards persuading individuals are at best relatively limited in their effectiveness, and at worst themselves “narrow our visions of sustainability” (Brynjarsdottir et al., 2012, p. 947).

DiSalvo et al. (2010a), in their critical review of sustainable HCI, acknowledge that while there is diversity in design among the digital systems developed to persuade—from ambient awareness systems that seek to provide information for knowledge and understanding, to systems that seek to change both thinking and action—their premises are often problematic. On the one hand, they can render aspects of environments and the consumption of resources visible. On the other hand, they might attempt, with various degrees of intent, to influence users to behave in ways deemed “more sustainable.” As DiSalvo et al. (2010a, p. 22) point out, this involves specific value judgements about what constitutes “sustainable behaviour,” and such judgements are also politically inflected (even ideologically aligned) positions:

Most persuasive technologies imply that users engage in problematic behaviors and should be directed toward more desirable ones. In many scenarios, persuasion begins to border on coercion, sometimes even evoking Skinnerian behaviour modification . . . 
Questions of “the user” quickly become issues of expertise and hegemony. If we agree that fundamental change is needed and it might be change that users don’t want, who gets to decide what change should happen and how? Whose needs are met, and whose values matter? (DiSalvo et al., 2010a, p. 23).

Such approaches have therefore attracted critique on the basis that they are both reductionist (to the level of the individual and their actions) and deterministic (to largely isolated causes and effects) (Barr et al., 2011a). In response, other psychologically oriented approaches have added some nuance, acknowledging that more than a single factor might be involved in any behaviour change: various and multiple individual motivations, as well as collective altruistic motivations, might simultaneously qualify as “self-interest.” Moreover, seemingly contradictory choices between competing individual and collective motivations might also qualify as “rational”—as some have argued, “choice matters” (Murtagh et al., 2015; Uzzell et al., 2006). The focus on behaviour change is certainly not limited to the HCI exploration of resource-conservation behaviours, and there is a wide range of social science studies more broadly that also take individuals and/or their behaviours as the starting point for investigating resource consumption. These include studies, both within HCI and beyond, that have attempted to capture wider aspects of environmental practice such as attitudes, values, and lifestyles.
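Before turning to those wider aspects, the feedback logic at stake in the IHD debates above can be made concrete with a small sketch. This is our illustration rather than code from any of the systems or studies reviewed: weekly smart-meter totals are compared with a pre-intervention baseline and a self-set reduction goal to produce the kind of message a display might show; all names and numbers are assumptions.

```python
def weekly_feedback(week_kwh: float, baseline_kwh: float, reduction_goal: float = 0.10) -> str:
    """Compare a week's metered energy use with a pre-intervention baseline and a reduction goal."""
    change = (week_kwh - baseline_kwh) / baseline_kwh          # negative = saving
    target = baseline_kwh * (1.0 - reduction_goal)
    if week_kwh <= target:
        return f"On track: {abs(change):.0%} below your baseline (goal: {reduction_goal:.0%} reduction)."
    if week_kwh < baseline_kwh:
        return f"Some savings: {abs(change):.0%} below baseline, but short of your {reduction_goal:.0%} goal."
    return f"Above baseline by {change:.0%}; no savings this week."

# Hypothetical household: 62 kWh this week against a 70 kWh baseline and a 10% goal.
print(weekly_feedback(62.0, 70.0))
```

The critiques cited above apply directly to a sketch like this: the baseline, the goal, and the tone of the message all embed judgements about what “sustainable behaviour” is and who gets to define it.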

Investigating Attitudes, Values, and Lifestyles

By way of contrast to approaches that focus largely on “behaviour” and its change, more sociologically oriented approaches focus on units of analysis between the social-psychological and the sociological via concepts such as attitudes with respect to norms, and further acknowledge the additional intervening roles of values or situation with respect to the intention-behaviour relationship (Barr, 2003; Barr, 2006). Further research has also invoked more extensive categories that could act as intervening factors in the formation of values as they relate to environmental behaviours, including, for example, (both individual and collective) identities (Evans, 2011b; Gatersleben et al., 2014) and cross-cultural politics (Katz-Gerro et al., 2017). One important and extensively debated concept in recent years has been the formulation and deployment of a meta-category labelled lifestyles, in an attempt to link the psychologically and individualistically oriented concerns described by behavioral and cognitive processes with wider socio-cultural concerns that account for values, but which also recognize the interplay between individuals and the extensive and diverse social and cultural collectivities (at various scales) of which they are a part. Some researchers have, for example, advocated attempts to formulate a broadly conceived typology of “lifestyle groups” based on, for example, “environmental values and concern,”

“socio-demographic variables,” and “psychological factors” (Barr & Gilg, 2006; Gilg et al., 2005). Many of these have also been explicitly oriented towards describing a home or household as engaged in lifestyles, rather than focusing at the level of the individual or behaviour per se (Barr & Gilg, 2006; for reviews across frameworks see Barr, 2016; Evans & Abrahamse, 2009). As Evans and Abrahamse (2009) point out, however, the historic influence of policy shifts and their orientation towards persuasion has also meant that the discourse of “sustainable lifestyles” has become ever-more ubiquitous, but also ever-more politically inflected. Instead, in nearly every case where “sustainable lifestyles” are invoked, the analysis reverts almost immediately to a consideration of “sustainable consumption” as definitive of lifestyles (Barr et al., 2011a; Connolly & Prothero, 2008; Evans, 2011a; Hobson, 2002; Shove & Warde, 2002; Spaargaren, 2003; Spaargaren & Oosterveer, 2010; Spaargaren & Van Vliet, 2000). Hobson (2002), for example, explores the relationship between the pro-environmental meanings of consumption and values relating to social justice, arguing that rationalized formulations of “sustainable consumption” carry little cultural meaning, and are therefore unable to address collective social concerns. Seyfang (2006) addresses similar themes with respect to the intersection of “sustainable consumption” and the concept of “ecological citizenship.”4

One response to some of this complexity has been the more extensive development of mixed methods approaches to understanding resource consumption in the domestic sphere. Pullinger et al. (2013), for example, extend behavioral questionnaires to a subsample of qualitative interviews to further explore the combination of factors that might contribute to environmentally aware action and interaction concerning water use. Still others have turned entirely to a range of qualitative methods (such as interviews, focus groups, or diaries, amongst other methods) to explore dimensions of behaviour that concern knowledge, attitudes, beliefs, values, and lifestyles (including aspirations and assumptions). Some explore these issues with respect to a range of environmental issues of concern to individuals and households (e.g., Barr, 2006; Barr & Gilg, 2006), whereas others turn their attention to resource consumption—and specific resources—in particular (Owen et al., 2009; Wutich, 2009).

In this broad context, the responses of research in Sustainable HCI have been methodological as well as analytic, with the development of mixed methods projects that attempt to capture some of these multiple dimensions of human actions and interactions with computing and the digital as they pertain to domestic resource consumption. Given that HCI has historically been concerned with the interaction between digital systems and their users, the challenges for Sustainable HCI here include finding ways to capture the multidimensionality of material action and interaction alongside dimensions of practice that remain “unobservable.” Some HCI studies have therefore turned to alternative frameworks to explore the relationship between home inhabitants and embedded resources via digital technologies more extensively. Schwartz et al.
(2013), for example, extend behavioral or cognitive approaches to consider explicitly phenomenological and ethnomethodological aspects of interaction with energy—rendering it visible, perceptible, and therefore

“accountable” via “ordering structures”—thereby capturing both the observable and the implicit. This shift in focus allows them to examine observable situated practices at the microcosmic scale—the connections between thinking, being, and doing. This widening of topics within the sustainabilities literature has allowed both social science practitioners and HCI designers to focus their research approaches on combinations of methods that are specifically based on theories of those practices.

Furthermore, in all of these debates, the concept of agency becomes particularly important—and agency of different types, from humans to (digital) non-humans, and from the individual to collectivities such as digital systems. Some, for example, have evolved “ideal-typical” typologies to characterize “environmental agency” (Spaargaren & Oosterveer, 2010). According to this argument, even at the everyday level of the household, forms of environmental agency are at the same time also necessarily mediated by the technologies, objects, and infrastructures of consumption practices (organized by “distant others”), in different modes of appropriation and provision—including digital systems (see also van Vliet et al., 2005). They argue that attention to both (distributed) human practices and non-human (especially digital) interventions is crucial. To these ends, a substantial body of environmental social science literature has turned to theories of human practices in order to fully explore how human-environment relations are embedded in multiple networks of action, interaction, knowledge, meaning, organization, and power at different scales of social life.

Investigating Practices and Networks

Theories of practices have therefore become increasingly substantial and influential in Sustainable HCI over the past two decades. Adopted from approaches in environmental social sciences more broadly, these frameworks have been largely developed from Bourdieu’s theories of “habitus,” “capital,” and “field” and Giddens’ theories of “agency,” “system,” and “structure.”5 Practice theories take earlier notions of “behaviour” as central—in the sense that it is doing that is a key unit of analysis—but conceptualize those “doings” in more complex ways than cause-effect models would have us believe. As Spaargaren (2011, p. 815) argues with respect to Bourdieu and Giddens,

[w]hat is recognized as being of lasting value in their work is the understanding of social life as a series of recursive practices reproduced by knowledgeable and capable agents who are drawing upon sets of virtual rules and resources which are connected to situated social practices. Agents are involved in the reproduction of series of practices within designated fields of social life by drawing upon the specific sets of rules and resources constitutive for those practices. Because of the emphasis on practices as “shared behavioural routines,” the individual is no longer in the center of the analysis. Practices, instead of individuals, become the units of analysis that matter most. Practices “produce” and co-constitute individuals and their values, knowledge and capabilities, and not the other way around.

Crucially, practice theories attempt to bridge the structure-agency dualism, connecting the micro- and macro-sociological contexts—agency as performed, powers as enacted, and interests as actively pursued (Spaargaren, 2011). One of the important points here is therefore that “practices” can be conceived at different scales of production and reproduction—everyday habits are reproduced through practices (the “social organization of normality”; Shove, 2003), but so are markets; consumption routines are reproduced through practices, but so are structures of governance. It is not far from this observation to a focus on context—to observe that practices are always “situated” in networks of relationship between people and their material and socio-cultural contexts (Hui & Walker, 2018). In the domain of resource-human relations, then, “the most vigorous application of practice theoretical repertoires . . . may be found in the interstices between technologies, utilities, resource consumption and the problematic of sustainability” (Halkier, Katz-Gerro, & Martens, 2011, p. 5). A number of contemporary researchers have incorporated such positions and integrated them into practice research on contemporary environmental politics: from engaging in practice-theoretical development more generally (Shove, 2003; Shove, Pantzar, & Watson, 2012), to elaborating the congruence of practices and “sustainable consumption” in everyday life and the domestic sphere (Shove & Spurling, 2013; Southerton, 2013), and with particular regard to “ecological citizenship” (such as research on the role of practice theory in understanding shared ecological governance; Spaargaren, 2011).

It is worth noting that contemporary formulations of practice theories in relation to consumption and sustainability have progressively incorporated and emphasized materiality as a key dimension of practices, whether that be the materiality of the embodied human (en)acting, or the contextual materialities of human-built objects, environments, structures, and technologies (Appadurai, 1988; Dant, 1999; Miller, 1998). In these frameworks, material relations are always co-constructed, so the focus is on the materials necessary to reproduce ways of doing over time, the knowledge and competencies to deploy those materials, and the (shared) meanings associated with particular “ways of doing” in relation to norms, values, and identities. The result is a complex framework that opens up digital and HCI design spaces to explore the intricacies of agentic-yet-institutionally embedded socio-cultural knowledge-meaning-action assemblages.

In the context of empirical social science research on domestic resource consumption, both experience and interaction are captured in the notions of “situated knowledge” and “situated action.” That is, any practice is a contextually dependent and mutually informing organization of human activities, material infrastructures, and knowledges of them, the relations between which are played out in the routines, habits, and rhythms of everyday life. These approaches have become well represented in the empirical environmental social science literature on resource consumption. Of particular interest to research on practices has been a shift in the units of analysis in the design of empirical research. Whereas behavioral or attitudinal studies tend to focus on the individual, increasingly the environmental social science literature has focussed on

alternative units of consumption such as the urban-based household, where activities, routines, and practices are both shared and negotiated amongst spatially and temporally extended networks of actors, infrastructures, organizations, and agencies. Thus the home is both a micro-topography and a simultaneously multiply layered and connected space (Barac & McFadyen, 2007; Hitchings, 2004; Horta et al., 2014).

Some projects are concerned with attempting to empirically map household consumption as a set of practices in depth, rather than attempting to outline the breadth of common practices across more general populations. Early research here focussed on the (particularly routinized and habitual) activities, knowledges, and materialities framing consumption, using observational, interview, and ethnographic methods of various sorts (Shove, 2003; Shove & Warde, 2002; Shove et al., 2012; Southerton, 2013; Spaargaren & Van Vliet, 2000). Research has extended more general household studies to include a particular focus on specific single resources, such as those oriented towards energy (Butler et al., 2016; Genus & Jensen, 2019; Moroşanu, 2016; Shove & Walker, 2014; Strengers, 2012), water (Vannini & Taggert, 2016), or food (Crivits & Paredis, 2013; Paddock, 2017; Sahakian & Wilhite, 2014; Warde, 2013, 2014).

Increasingly, practice research is also turning to the “nexus of practices” (Hui et al., 2017) that forms consumption in relation to a nexus of resources—such as that between water and energy (Strengers, 2011; Strengers & Maller, 2017; Strengers et al., 2014). At times these studies drill down their analytic focus to particular nexus points such as a cluster of related practices—for example, showering (Shove, 2003) or washing (Kuijer, 2017), doing laundry (Jack, 2013), or eating practices (Devaney & Davies, 2016)—and also often entail the innovative use of novel participatory, design-based, or interventionist methodological strategies. At other times, the analytic focus is on those nexus points that are materialized in the infrastructures of everyday life—such as the nexus of food and energy in the case of the domestic freezer (Hand & Shove, 2007; Southerton & Shove, 2000), or energy and water in the (potentially digital) washing machine (Bourgeois et al., 2014).

This latter focus on everyday materialities is where the relationship between the focus on practices and the focus on (digital) technological networks is most significant—whether that is cast in terms of the “infrastructures of consumption” (van Vliet et al., 2005) or the “socio-technical networks” of humans and non-humans in Actor-Network Theory (ANT) research, for example in water (Sofoulis, 2005; Sofoulis & Williams, 2008) or energy (Strengers, 2012). In this regard, then, practice theories are particularly well situated to explore multidimensional phenomena such as a WEF nexus (Hui, Shove, & Schatzki, 2017) across overlapping social contexts and multiple scales. Therefore, while not without its critics (Cairns & Krzywoszynska, 2016), and despite its potential limitations and partialities (Taherzadeh, Bithell, & Richards, 2018), the notion of a WEF nexus of resource interdependencies could potentially provide a recursive lens alongside practice theories to understand domestic resource use.
The focus on materiality—and its associated technologies and infrastructures—simultaneously provides a lens through which to view potential emerging interdependencies between digital systems, local practices, and

resource infrastructures. Such arguments are particularly congruent with a consideration of socio-technical systems and “actor-networks.” Actor-network approaches share practice theory’s concern with paying attention to the materialities within which embodied human beings are embedded—the things/objects and technologies through which humans act and interact in the world. In HCI most broadly, this therefore involves understanding human-social relationships as co-constitutive of human-computer/object relationships in the context of digitally mediated social processes. In the case of Sustainable HCI, where the focus is simultaneously environmental, the consideration of digital networks-and/as-things is coupled with a consideration of physical-resources-as-things, and humans interact with both.

In ANT, identifying the actors in any relation is a primary endeavor—importantly, as already noted, actors can be both human and non-human. Humans are only one set of entities (alongside objects and technologies) that act and interact in relation to humans and to each other, meaning that in ANT, “things” such as digital technology systems, and physical resource distribution and consumption systems, have agency equivalent to that of the humans they shape and are shaped by. Such agentic actors are thereafter themselves embedded in networks—the normative organizational systems (or assemblages) that associate actor-entities with each other across relational domains (Akrich, 1992; Latour, 2000; Suchman, 2006; Taylor, 2015). The socio-technical systems described by ANT are in many ways recursive with theories of practice, with an additional emphasis on the agencies of non-humans as well as humans—and both are currently widely employed in the environmental social sciences and HCI literatures.

Revisiting Sustainable HCI in a WEF Context

It is within the assemblages of human practices and non-human agencies described by practice and actor-network theories that the relationship between empirical research in environmental social science and HCI is at its strongest. This is not least because digital technologies increasingly comprise one of the most important infrastructures that underpin our everyday material and social lives in the context of both the household and the urban landscape. On the one hand, there are those environmental social science studies of household resource consumption that have been embedded in innovative methods and interdisciplinary collaborations with engineering and computer science colleagues—such as Pink’s (and collaborators’) studies of household energy use combining ethnographic exploration of practices with the use of digital sensors to quantitatively measure consumption (Pink, 2011; Pink & Leder Mackley, 2012; Pink et al., 2013; see also Coughlan et al., 2013). On the other hand, practitioners and designers situated firmly in HCI as an inter-discipline have increasingly used an extremely broad mixture

of (often digitally based) methods to understand practice-based consumption within the home (and at times intervene in it – see Mitchell et al., 2015). Smart meters, and digital technologies such as (still and video) cameras and/or sensor arrays, have all been variously deployed to both measure and understand resource-related domestic practices. Larrabee Sønderlund et al. (2014), for example, explore and review the different types of smart metering of water and their associated user feedback systems, whilst with respect to food, Ganglbauer (2013) introduced—in addition to interviews and home tours—a “FridgeCam” within households to record the situated practices of food waste (see also Ganglbauer et al., 2013). The FridgeCam consisted of a mobile phone camera attached to the refrigerator door, capturing images automatically when activated by the opening of the door. The captured images were then uploaded to a Facebook site to be shared amongst interested parties, which encouraged social media discussion of “appropriate” or “inappropriate” practices leading to food waste and/or its mitigation (to supplement the narrative data from participants in interviews and tours). Through these (digitally based) “technology probe” methods, Ganglbauer et al.’s study therefore sought to utilize practice theory to “design strategies to support dispersed as well as integrated food practices” (2013, p. 1)—that is, to explore how digital technologies might be deployed to understand, intervene in, and feed back to users on the assemblages of practices associated with their preservation, preparation, consumption, and disposal of food to mitigate against waste. Ganglbauer et al.’s research echoed Comber and Thieme’s (2012) earlier study that developed a similar technology intervention in the form of a “BinCam” (a mobile phone capturing still images when triggered by the opening of the bin [garbage] lid), while additionally instituting an online community amongst their study participants to discuss the practices concerned (Thieme et al., 2012). It is only more recently that Sustainable HCI research has come to focus explicitly on WEF relationships—the ways that resources and their associated practices and networks of actors in the domestic sphere are interrelated and mutually dependent—rather than focusing simply or intensively on single resources. For example, with respect to food and energy research in HCI, Clear et al. (2013) introduced a “cooker cam” to explore practices of cooking within shared student accommodation—focusing on both energy and food. The aim of this study was to uncover the observable mundane, ubiquitous, and habitual practices of food preparation at the site of the cooker (stove/range) within student-shared households, amongst a demographic where design interventions might have a significant impact in transitional life stages. The “cooker cam” consisted of a motion-triggered wildlife trail camera mounted above the cooker (dubbed the “hobcam”), capturing still images every 30 seconds when motion-activated. This was supplemented with data from real-time energy smart meter readings for each cooking session recorded, and further supplemented with interview narratives on the experiential process and meaning of cooking.
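To make the logic of such multi-method probes concrete, the following minimal sketch (written in Python, and not drawn from the cited studies’ actual tool chains) shows how timestamped probe images might be grouped into cooking sessions and matched against smart-meter readings. All data, thresholds, and function names here are invented for illustration only.

```python
# A minimal, self-contained sketch: group hypothetical motion-triggered camera
# timestamps into "cooking sessions" and sum the smart-meter readings that fall
# within each session. Thresholds and values are illustrative assumptions.
from datetime import datetime, timedelta

# Hypothetical probe data: timestamps of motion-triggered still images.
image_times = [
    datetime(2019, 3, 1, 18, 2), datetime(2019, 3, 1, 18, 3),
    datetime(2019, 3, 1, 18, 31), datetime(2019, 3, 2, 8, 15),
]

# Hypothetical smart-meter readings: (timestamp, watt-hours for that interval).
meter_readings = [
    (datetime(2019, 3, 1, 18, 0), 120), (datetime(2019, 3, 1, 18, 30), 95),
    (datetime(2019, 3, 2, 8, 0), 40),
]

def group_sessions(times, gap=timedelta(minutes=45)):
    """Group image timestamps into sessions separated by more than `gap`."""
    sessions = []
    for t in sorted(times):
        if sessions and t - sessions[-1][-1] <= gap:
            sessions[-1].append(t)
        else:
            sessions.append([t])
    return sessions

def energy_per_session(sessions, readings, margin=timedelta(minutes=30)):
    """Sum readings whose timestamps fall within each session (plus a margin)."""
    summary = []
    for s in sessions:
        start, end = s[0] - margin, s[-1] + margin
        total = sum(wh for ts, wh in readings if start <= ts <= end)
        summary.append({"start": s[0], "images": len(s), "energy_wh": total})
    return summary

for row in energy_per_session(group_sessions(image_times), meter_readings):
    print(row)
```

The point of the sketch is simply that the quantitative traces (images, meter readings) only become interpretable as practices once they are segmented into meaningful episodes, which is precisely where the interview narratives in the studies above come in.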
Such multiple methods uncovered several potential directions for innovative design interventions, including various smart/digital modifications to cooking appliances, the possibilities of encouraging communal organization of food responsibility and sociality via the design of mobile and social media applications or

add-ons, or developing digital tools to render the carbon intensity of particular foods and their preparation more accessible and transparent to users, encouraging less energy-intensive diets (see also Clear et al., 2016; Hupfeld & Rodden, 2012). In the case of water and energy, Chetty et al. (2008) focused on “in the moment” household resource consumption to map the relative “visibility” of resources and their infrastructures to participants in the course of their consumption practices. The aim of the study was to develop digital display and control system tools, both for reflection and engagement, and to support and underpin the management of domestic resource consumption based on current management practices, technology use, and interaction with outside stakeholders. Using home tours, semi-structured interviews, and (digital image-based) visual methods, the findings of the study underlined the importance of householders’ understandings of domestic utility systems, or more likely, their invisibility in the practice of everyday life. The design interventions considered on the basis of the research findings therefore focussed towards ways of making the production of water and energy more visible and available to consumers via domestic digital toolkits (at the same time as providing comparative and “benchmarking” consumption information at different urban, regional, and national scales). The intervention thus accounted for both diversity and inequality in the design of those digital systems, and supported more collective as well as individual agency in “green” behaviour change via those same tools. In a similar vein to Chetty et al., Strengers (2011) draws explicitly on both practice theories and concepts in socio-technical networks to explore the potential role of digital feedback systems in encouraging sustainable water and energy consumption. Her review of different approaches and empirical strategies draws on research derived via methodologies variously including ethnographic interviews and home tours as well as the deployment of digital technologies. The digital systems explored include In-Home Displays (IHDs) and smart meters of a range of different types, from visualization tools to render real-time consumption visible, to the intersection of digital information systems with what are considered “negotiable” or “non-negotiable” domestic practices. Strengers (2011, p. 319) concludes from her comparative review of different digital systems that digital feedback mechanisms

have the potential to legitimize particular practices and to overlook those considered non-negotiable . . . IHDs [In-Home Displays] can play a role in making socio-technical systems of energy and water provision more relevant to householders’ everyday lives, and in questioning and debating non-negotiable practices. This will necessitate repositioning and blurring the roles and responsibilities of resource providers and consumers.

As Strengers’ study indicates, both HCI and social science studies focussed on practices through the deployment and use of digital technologies remain commonly oriented towards the development of (digital) tools for persuasion towards “behaviour change”— the Sustainable HCI position of “sustainability through design” (Butler et al.,  2016; Paddock,  2017; Thieme et al.,  2012). It is this model of “persuasion” or “behaviour

change”—alongside debates over the conceptualization of “agency,” the material organization of “consumption” processes, or “sustainability” itself—that remains at the heart of contemporary theory and empirical research design in Sustainable HCI. As such, Sustainable HCI is playing a crucial and expanding role in the politics of ways of being digital.
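As a purely illustrative aside, the kind of comparative “benchmarking” feedback that IHDs and related eco-feedback tools present to householders, as discussed above, can be sketched in a few lines of Python. The benchmark figures, thresholds, and message format below are invented assumptions rather than features of any system examined in the studies cited.

```python
# A speculative sketch of IHD-style comparative feedback; all figures are invented
# for illustration and are not drawn from the systems discussed in this chapter.

def feedback_message(household_kwh, benchmarks):
    """Compare a household's weekly electricity use against reference benchmarks."""
    lines = []
    for scale, reference_kwh in benchmarks.items():
        diff = (household_kwh - reference_kwh) / reference_kwh * 100
        direction = "above" if diff >= 0 else "below"
        lines.append(f"{abs(diff):.0f}% {direction} the {scale} average ({reference_kwh} kWh)")
    return f"This week you used {household_kwh} kWh: " + "; ".join(lines) + "."

# Invented benchmark values at different scales.
benchmarks = {"neighbourhood": 62, "regional": 68, "national": 71}
print(feedback_message(75, benchmarks))
```

Even this toy example makes the politics of feedback visible: the choice of benchmark scale and the framing of “above” or “below” already encode assumptions about what counts as negotiable consumption.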

Conclusion: Resource Sustainability, Resilience, and Security

There is no doubt that living in a digital age is transforming the ways that human beings relate to their environments, particularly with respect to the exploitation, use, transformation, and consumption of natural resources. This is especially the case if we are to understand the centrally important role of the household in resource consumption. The key to the evolution of empirical practice in contemporary Sustainable HCI research is the recognition of complexity across multiple social scales, whether that is in the complexity of resource interdependencies (found in WEF Nexus thinking), or in the complexity of the human consumption of them via digitally based resource measurement, feedback, and management systems at the household level (in Practice and Network thinking). Throughout the chapter, we have sought to review the approaches relevant to the development of Sustainable HCI, and to evaluate the ways that they inform current empirical studies in the field focussed towards domestic resource consumption. The preceding sections reviewed the frameworks that have underpinned contemporary social science research on the relationships between humans and their digital and natural environments. Such frameworks have variously focussed on: structural-level variables such as political economy and facets of globalization; social psychological approaches encompassing individual behaviours, attitudes, values, norms and lifestyles; and mid-level theories focussed on practices that attempt to connect relational processes across different scales of social organization. As part of this latter discussion, we have also reviewed the focus on digital materiality that is common to both theories of practices and to frameworks derived from STS such as Actor-Network Theory. Throughout this discussion, several further salient conceptualizations of social processes have come into play—including those relating to “consumption” and “sustainability,” as well as those concerning “structure,” “agency,” and “social change.” These latter conceptualizations have become particularly important as environmental crises intensify, and potential digital solutions to expansive resource consumption are sought via Sustainable HCI. It becomes apparent throughout recent Sustainable HCI theory and research that in order to understand the complexity involved in these landscapes of embedded digital networks—multiple and overlapping configurations of humans and non-humans, of structures and systems, within assembled networks of actions and

interactions—we need to pay attention simultaneously to the material as well as the discursive forms of knowledge and power that situate practices in socio-technical systems, and which enable some digital ways of being to the exclusion of others. Such understandings underpin any attempts within Sustainable HCI to inform, transform, and influence human-resource relationships. Certainly digital technologies as they are empirically deployed in HCI studies might help us to further understand the digital-physical-social webs of relationship within households. How the results of that research are communicated in the public sphere is also of import, given the scale of current environmental challenges. An overly emphatic focus on “behaviour change” via digital systems might, however, also have unintended consequences, running the risk of unequally marginalising some already vulnerable populations:

In contrast to aspirational claims for a “smart utopia” of greener, less energy intensive, and more comfortable homes currently present in market and policy discourses, we argue that SHTs [Smart Home Technologies] may reinforce unsustainable energy consumption patterns in the residential sector, are not easily accessible by vulnerable consumers, and do little to help the “energy poor” secure adequate and affordable access to energy at home. (Tirado Herrero et al., 2018, p. 65)

While Sustainable HCI is therefore able—to some extent—to intervene positively toward behaviour change, some argue that any influence derived from such interventions is not straightforward, nor is it unproblematic (Burchell, Rettie, & Roberts, 2016). Demographic, socio-economic, and life-cycle factors all have an impact on values, lifestyles, and consumption. Similarly, routine, habit, affect, and the meaning of home can also vary significantly amongst and between populations. Any potential interventions are therefore likely to have extensively variable and uneven effects (Watson, 2017). There is also the question as to whether Sustainable HCI research currently only reaches populations already positively oriented towards the environmental issues under scrutiny (Vassileva et al., 2013). If HCI recognizes that digital technologies are imbricated in “the domestic”—and domestic resource practices—in the complex assemblages outlined earlier, it is unsurprising that the question of digital persuasion towards sustainabilities remains contested in the HCI literature and beyond. Innovative approaches drawing on practice theory and ANT have expanded the units and scales of analysis in Sustainable HCI to encompass a variety of different forms of possible influence, such as the recognition of mutual or collective responsibilities for sustainable consumption across different organizational and civic scales. For example, supermarkets/food providers and households could each be digitally linked in local communities to hold collective responsibility for sustainable food provision. It is for such reasons that there has been a movement within Sustainable HCI debates to explicitly focus on digital “politics” (broadly defined) towards sustainability. In a timely contribution, Dourish (2010, p. 8) adds a final and crucial consideration to any

characterization of environmentally oriented HCI as a broad field of endeavor—“the politics of design and the design of politics.” According to Dourish (2010, p. 8), Sustainable HCI must become more explicitly and self-consciously “political”; that is, he is making an attempt to dismantle design as an anti-politics machine. Political, social, cultural, economic, and historical contexts have critical roles to play, not only because they shape our experience with information technologies, but also, and even more, because information technologies in contemporary life are sites at which these contexts are themselves developing.

Such approaches hold the potential to help us further understand how digital technologies and the projects of Sustainable HCI might mediate the relationships between physical resources, things, systems, people, their knowledge, skills, and activities. This understanding might move us closer to resource security and resilience, and sustainable ways of being in a digital age.

Notes

1. The exemplar empirical research presented throughout the chapter citing national statistics is derived from UK sources. Corresponding international statistics, especially those produced by governmental bodies (or, e.g., private utilities companies), can be relatively easily located via the relevant corresponding national or international bodies, or via straightforward Internet searches.

2. Environmental social sciences and philosophy addressing the macro-level have focussed on both human-nature social relationships, as well as the salient cultural categories through which those relationships are made meaningful and are understood. The foundational theoretical literatures in environmental social sciences are extremely extensive, and have undergone considerable revision over the course of at least fifty years—not least in light of feminist and post-colonial theories, and theories of globalisation. As such, only an extremely brief indication of broad conceptual areas can be offered here, although those interested in reading towards further contemporary theoretical developments might consult compendia readers such as Gabrielson et al. (2016).

3. A number of different models of persuasion towards “behaviour change” have been proposed over time—and have also been deconstructed. For a critical review of AIDA (Awareness-Information-Decision-Action) models, for example, see Barr (2006). For a critique of 4E (Enable-Engage-Encourage-Exemplify) models, see Jackson (2006) and Spaargaren (2011). Such models have also been adopted in different contexts—for example, in public (often policy-generated) awareness campaigns towards “ecological citizenship,” through to technology-driven techniques of persuasion (see chapter sections on Sustainable HCI design and development). A key endeavor in these approaches is also identifying “barriers to change.”

4. Over the past decade, the literature on “sustainable consumption” has grown rapidly, and has entailed extensive debates not only on the conceptualization of sustainable consumption itself, but also the relation it bears to “consumerism” (Evans & Jackson, 2008) and “citizenship” (Seyfang, 2006). For a review of these literatures see Jackson (2008).

5. Halkier, Katz-Gerro, and Martens (2011) provide a comprehensive review of the practice theory literature via Bourdieu and Giddens, tracing their philosophical antecedents to (amongst others) Durkheim, Heidegger, Husserl, Levi-Strauss, Marx, Mauss, Merleau-Ponty, Weber, and Wittgenstein.

References Abdul Salam, P., Shrestha, S., Prasad Pandey, P., & Kumar Anal, A. (2017). Water-energy-food nexus: Principles and practices. Hoboken, NJ: American Geophysical Union, Wiley. doi:10.1002/9781119243175. Advani, A., Johnson, P., Leicester, A., & Stoye, G. (2013). Household energy use in the UK: A distributional analysis. IFS Report R85, Institute for Fiscal Studies. Akrich, M. (1992). The de-scription of technical objects. In W. Bijker & J. Law (Eds.), Shaping technology, building society: Studies in sociotechnical change (pp. 205–224). Cambridge, MA: MIT Press. American Psychological Association (APA). (2018). APA Dictionary of Psychology. American Psychological Association, https://dictionary.apa.org/behavior. Appadurai, A. (Ed.) (1988). The social life of things: Commodities in cultural perspective. Cambridge: Cambridge University Press. Arroyo, E., Bonanni, L., & Selker, T. (2005). Waterbot: Exploring feedback and persuasive techniques at the sink. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 631–639). Portland, OR: ACM. doi:10.1145/1054972.1055059. Bang, M., Gustafsson, A., & Katzeff, C. (2007). Promoting new patterns in household energy consumption with pervasive learning games. In Proceedings of the international conference on persuasive technology, Persuasive 2007, LNCS 4744 (pp. 55–63). Barac, M., & McFadyen, L. (2007). Connected space. Home Cultures, 4(2), 109–115. Barr, S. (2003). Are we all environmentalists now? Rhetoric and reality in environmental action. Geoforum, 35, 231–249. doi:10.1016/j.geoforum.2003.08.009. Barr, S. (2006). Environmental action in the home: Investigating the “value-action” gap. Geography, 91(1), 43–54. Barr, S. (2016). Environment and society: Sustainability, policy and the citizen. London: Routledge. Barr, S., & Gilg, A. (2006). Sustainable lifestyles: Framing environmental action in and around the home. Geoforum, 37, 906–920. doi:10.1016/j.geoforum.2006.05.002. Barr, S., Gilg, A., & Shaw, G. (2011a). “Helping People Make Better Choices”: Exploring the behaviour change agenda. Applied Geography, 31, 712–720. doi:10.1016/j.apgeog.2010.12.003. Barr, S., Gilg, A., & Shaw, G. (2011b). Citizens, consumers and sustainability: (Re) Framing environmental practice in an age of climate change. Global Environment Change, 21, 1224–1233. doi:10.1016/j.gloenvcha.2011.07.009 Beddington, J. (2009). Food, energy, water and the climate: A perfect storm of global events. In Lecture to Sustainable development UK conference. https://assets.letemps.ch/sites/ default/files/media/2014/03/19/2.1.3041002903.pdf Bhaduri, A., Ringler, C., Dombrowski, I., Mohtar, R., & Scheumann, W. (2015). Sustainability in the water-energy-food nexus. Water International, 40(5–6), 723–732. Blevis, E. (2007). Sustainable interaction design: Invention, disposal, renewal & reuse. In Proceedings of SIGGCHI 2007 ACM annual conference on human factors in computing Systems (pp. 503–512). ACM.

210   Nicola Green ET AL. Bonino, D., Corno, F., & De Russis, L. (2012). Home Energy Consumption Feedback: A User Survey. Energy and Buildings, 47, 383–393. doi: 10.1016/j.enbuild.2011.12.017. Boucher, A., Cameron, D., & Jarvis, N. (2012). Power to the people: Dynamic energy measurement through communal cooperation. In Proceedings of the designing interactive systems conference (pp. 612–620). June 11–15, Newcastle, UK. Bourgeois, J., van der Linden, J., Kourtuem, G., Price, B. A., & Rimmer, C. (2014). Conversations with my washing machine: An in-the-wild study of demand shifting with self-generated energy. Proceeding of ubiquitous computing conference, UBICOMP. September 11–13, Seattle, Washington. doi:10.1145/2632048.2632106. Burchell, K., Rettie, R., & Roberts, T. (2016). Householder engagement with energy consumption feedback: The role of community action and communications. Energy Policy, 88, 178–186. doi:10.1016/j.enpol.2015.10.019. Burgess, J., Bedford, T., Hobson, K., Davies, G., & Harrison, C. M. (2003). (Un)sustainable consumption. In F.  Berkhout, M.  Leach, & I.  Scoones (Eds.), Negotiating environmental change: New perspectives from social science (chapter 10). Cheltenham: Edward Elgar. Butler, C., Parkhill, K., & Pidgeon, N. (2016). Energy consumption and everyday life: Choice, values and agency through a practice theoretical lens. Journal of Consumer Culture, 16(3), 887–907. doi:10.1177/1469540514553691. Brynjarsdottir, H., Håkansson, M., Pierce, J., Baumer, E., DiSalvo, C., & Sengers, P. (2012). Sustainably unpersuaded: How persuasion narrows our vision of sustainability. In Proceedings of the SIGGCHI 2012 ACM annual conference on human factors in computing systems (pp. 947–956). ACM. Cairns, R., & Krzywoszynska, A. (2016). Anatomy of a buzzword: The emergence of “the water-energy-food nexus” in UK natural resource debates. Environmental Science and Policy, 64, 164–170. doi:10.1016/j.envsci.2016.07.007. Callon, M. (Ed.) (1998). The laws of the markets. Oxford: Blackwell. Casado-Mansilla, D., López-de-Armentia, J., Ventura, D., Garaizar, P., & López-de-Ipiña, D. (2016). Embedding intelligent eco-aware systems within everyday things to increase people’s energy awareness. In Soft Computing, 20(5), 1695–1711. Chetty, M., Tran, D., & Grinter, R. (2008). Getting to green: Understanding resource consumption in the home. In Proceedings of the 10th international conference on ubiquitous computing, UbiComp (pp. 242–251). September 21–24. doi:10.1145/1409635.1409668. Chitnis, M., Sorrell, S., Druckman, A., Firth, S. K., & Jackson, T. (2013). Turning lights into flights: Estimating direct and indirect rebound effects for UK households. Energy Policy, 55, 234–250. doi:10.1016/j.enpol.2012.12.008. Chitnis, M., Sorrell, S., Druckman, A., Firth, S. K., & Jackson, T. (2014). Who rebounds most? Estimating direct and indirect rebound effects for different UK socioeconomic groups. Ecological Economics, 106, 12–32. doi:10.1016/j.ecoecon.2014.07.003. Clear, A., Hazas, M., Morley, J., Friday, A., & Bates, O. (2013). Domestic food and sustainable design: A study of university student cooking and its impacts. Proceedings of the SIGCHI Conference on human factors in computing systems. Paris, France, April 27-May 2, 2013. doi: 10.1145/2470654.2481339. Clear, A. K., O’Neill, K., Friday, A. & Hazas, M. (2016). Bearing an open “Pandora’s box”: HCI for reconciling everyday food and sustainability. ACM Transactions on Computer-Human Interaction, 23(5), Article 28, doi:http://dx.doi.org/10.1145/2970817. 
Connolly, J., & Prothero, A. (2008). Green consumption: Life politics, risk and contradictions. Journal of Consumer Culture, 8(1), 117–146. doi: 10.1177/1469540507086422.

A Digital Nexus: Sustainable HCI and Resource Consumption   211 Crivits, M., & Paredis, E. (2013). Designing an explanatory practice framework: Local Food systems as a case. Journal of Consumer Culture, 13(3), 306–336. doi:10.1177/1469540513484321. Colucci, A., Magoni, M., & Menoni, S. (Eds.) (2017). Peri-urban areas and food-energy-water nexus: Sustainability and resilience strategies in the age of climate change. Switzerland: Springer. Comber, R., & Thieme, A. (2012). Designing beyond habit: Opening space for improved recycling and food waste behaviours through processes of persuasion, social influence and aversive affect. Personal and Ubiquitous Computing, 17(6), 1197–1210. doi:10.1007/s00779-012-0587-1. Constanza, E., Ramchurn, S.  D., & Jennings, N. (2012). Understanding domestic energy consumption through interactive visualization: A field study. Proceedings of ubiquitous computing conference, UbiComp (pp. 216–225). September 5–8, Pittsburgh, PA. Coughlan, T., Leder Mackley, K., Brown, M., Martindale, S., Schlögl, S., Mallaband, B., . . . Hine, N. (2013). Current issues and future directions in methods for studying technology in the home. PsychNology Journal, 11(2), 159–184. Retrieved 15/02/2019, from www.psychnology.org. Dant, T. (1999). Material culture in the social world. Buckingham, UK: Open University Press. Department for Business, Energy and Industrial Strategy [DBEIS]. (2018a). Energy consumption in the UK. July 2018. https://assets.publishing.service.gov.uk/government/uploads/system/ uploads/attachment_data/file/729317/Energy_Consumption_in_the_UK__ECUK__2018.pdf. Department for Business, Energy and Industrial Strategy [DBEIS]. (2018b). Digest of United Kingdom energy statistics [DUKES]: 2018 Main Report. July 2018. https://assets.publishing. service.gov.uk/government/uploads/system/uploads/attachment_data/file/729425/ DUKES_2018.pdf. Devaney, L., & Davies, A. (2016). Disrupting household food consumption through experimental HomeLabs: Outcomes, connections, contexts. Journal of Consumer Culture, 17(3), 823–844. doi:10.1177/1469540516631153. DiSalvo, C., Sengers, P., & Brynjarsdóttir, H. (2010). Mapping the landscape of sustainable HCI. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 1975–1984). ACM. DiSalvo, C., Sengers, P., & Brynjarsdóttir, H. (2010a). Navigating the terrain of sustainable HCI. interactions, July-August 2010, 22–25. doi: 10.1145/1806491.1806497. Dourish, P. (2010). HCI and environmental sustainability: the politics of design and the design of politics. In DIS 10: Proceedings of the 8th ACM Conference on Designing Interactive Systems (pp. 1–10). doi: 10/1145/1858171.1858173. Druckman, A., Chitnis, C., Sorrell, S., & Jackson, T. (2011). Missing carbon reductions? Exploring rebound and backfire effects in UK households. Energy Policy, 39, 3572–3581. doi:10.1016/j.enpol.2011.03.058. Energy Saving Trust. (2013). At home with water. Energy Saving Trust, Department for Energy and Climate Change, Department of Environment, Food, and Rural Affairs. http://www. energysavingtrust.org.uk/sites/default/files/reports/AtHomewithWater%287%29.pdf. Energy Saving Trust. (2014). At home with water 2. Technical Report. Energy Saving Trust. http://www.energysavingtrust.org.uk/sites/default/files/reports/AHHW2%20final.pdf. Erickson, T., Podlaseck, M., Sahu, S., Dai, J. D., Chao, T., & Naphade, M. (2012). The Dubuque water portal: Evaluation of the uptake, use and impact of residential water consumption feedback. 
In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 675–684). ACM. Evans, D. (2011a). Consuming conventions: Sustainable consumption, ecological citizenship and the worlds of worth. Journal of Rural Studies, 27, 109–115. doi:10.1016/j.rurstud.2011.02.002.

212   Nicola Green ET AL. Evans, D. (2011b). Thrifty, green or frugal: Reflections on sustainable consumption in a changing economic climate. Geoforum, 42, 550–557. doi:10.1016/j.geoforum.2011.03.008. Evans, D., & Abrahamse, W. (2009). Beyond rhetoric: The possibilities of and for “sustainable lifestyles.” Environmental Politics, 18(4), 486–502. doi:10.1080/09644010903007369. Farr-Wharton, G., Hee-Jeong Choi, J., & Foth, M. (2014a). Food talks back: Exploring the role of mobile applications in reducing domestic food wastage. In Proceedings of the 26th Australian computer-human interaction conference on designing futures: The future of design (pp. 352–361). ACM. doi:10.1145/2686612.2686665. Farr-Wharton, G., Hee-Jeong Choi, J., & Foth, M. (2014b). Technicolouring the fridge: Reducing food waste through uses of colour-coding and cameras. In Proceedings of the 13th  international conference on mobile and ubiquitous multimedia (pp. 48–57). ACM. doi:10.1145/2677972.2677990. Fischer, C. (2008). Feedback on household electricity consumption: A tool for saving energy? Energy Efficiency, 1(1), 79–104. doi:10.1007/s12053-008-9009-7. Foster, D., Lawson, S., Blythe, M., & Cairns, P. (2010). Wattsup? Motivating reductions in domestic energy consumption using social networks. In Proceedings of the 6th Nordic conference on human-computer interaction: Extending boundaries (pp. 178–187). ACM. doi:10.1145/1868914.1868938. Froehlich, J. (2009). Promoting energy efficient behaviors in the home through feedback: The role of human computer interaction. In Proceedings of HCIC workshop (pp. 1–11). http://www.cs.umd.edu/~jonf/publications/Froehlich_PromotingEnergyEfficientBehaviros InTheHomeThroughFeedback-TheRoleOfHumanComputerInteraction_HCIC2009.pdf. Froehlich, J., Findlater, L., & Landay, J. (2010). The design of eco-feedback technology. In  Proceedings of the SIGCHI conference on human factors in computing systems (pp. 1999–2008). ACM. http://www.cs.umd.edu/~jonf/publications/Froehlich_TheDesignOf EcoFeedbackTechnology_CHI2010.pdf. Froehlich, J., Findlater, L., Ostergren, M., Ramanathan, S., Peterson, J., Wragg, I., Larson, E., Fu, F., Bai, M., & Patel, S. (2012). The design and evaluation of prototype ecofeedback displays for fixture-level water usage data. In Proceedings of SIGCHI 2010 conference on human factors in computing systems (pp. 2367–2376). April 10–15, Atlanta, GA. ACM. Gabrielson, T., Hall, C., Meyer, J. M., & Schlosberg, D. (Eds.) (2016). The Oxford handbook of environmental political theory. Oxford: Oxford University Press. Gamberini, L., Spagnolli, A., Corradi, N., Jacucci, G., Tusa, G., Mikkola, T., Zamboni, L., & Hoggan, E. (2012). Tailoring feedback to users’ actions in a persuasive game for household electricity conservation. In Proceedings of the international conference on persuasive technology (pp. 100–111). Centre for eHealth & Wellbeing Research, Department of Psychology, Health and Technology, University of Twente, The Netherlands. Ganglbauer, E. (2013). Towards food waste interventions: An exploratory approach. In Proceedings of the 2013 ACM conference on pervasive and ubiquitous computing. Adjunct publication (pp. 337–342). ACM. Ganglbauer, E., Fitzpatrick, G., & Comber, R. (2013). Negotiating food waste: Using a practice lens to inform design. ACM Transactions on Computer-Human Interaction, 20(2), Art. 11. doi:10.1145/2463579.2463582. Gatersleben, B., Murtagh, N., & Abrahamse, W. (2014). Values, identity and pro-environmental behaviour. Contemporary Social Science, 9(4), 374–392. 
doi:10.1080/21582041.2012.682086. Genus, A., & Jensen, C. (2019). Beyond “behaviour”: The institutionalisation of practice and the case of energy-efficient lighting in Denmark. Journal of Consumer Culture, 19(3), 1–19. doi:10.1177/1469540517717781.

A Digital Nexus: Sustainable HCI and Resource Consumption   213 Gilg, A., Barr, S., & Ford, N. (2005). Green consumption or sustainable lifestyles? Identifying the sustainable consumer. Futures, 37(6), 481–504. doi:10.1016/j.futures.2004.10.016. Goodman, E. (2009). Three environmental discourses in human-computer interaction. In CHI 2009 extended abstracts on human factors in computing systems (pp. 2535–2544). Boston, MA: ACM. doi:10.1145/1520340.1520358. Granovetter, M. (1985). Economic action and social structure: The problem of embeddedness. American Journal of Sociology, 91(3), 481–510. Halkier, B., Katz-Gerro, T., & Martens, L. (2011). Applying practice theory to the study of consumption: Theoretical and methodological considerations. Journal of Consumer Culture, 11(1), 3–13. doi:10.1177/1469540510391765. Hand, M., & Shove, E. (2007). Condensing practices: Ways of living with a freezer. Journal of Consumer Culture, 7(1), 79–104. doi:10.1177/1469540507073509. Harper, R. (ed.) (2003). Inside the smart home. London: Springer. Hazas, M., Friday, A., & Scott, J. (2011). Look back before leaping forward: of domestic energy inquiry. IEEE Pervasive Computing, 10, 13–19. Hee-Jeong Choi, J., & Blevis, E. (2010). HCI & sustainable food culture: A design framework for engagement. In Proceedings of the 6th Nordic conference on human-computer interaction: Extending boundaries (pp. 112–117). ACM. Hitchings, R. (2004). At home with someone nonhuman. Home Cultures, 1(2), 169–186. Hobson, K. (2002). Competing discourses of sustainable consumption: Does the rationalisation of lifestyles make sense? Environmental Politics, 11(2), 95–119. doi:10.1080/714000601. Horta, A., Wilhite, H., Schmidt, L., & Bartiaux, F. (2014). Socio-technical and cultural approaches to energy consumption. Nature and Culture, 9(2), 115–121. doi: 10.3167/nc.2014.090201. Hui, A., Shove, E., & Schatzki, T. (2017). The nexus of practices: Connections, constellations, practitioners. London: Routledge. Hui, A., & Walker, G. (2018). Concepts and methodologies for a new relational geography of energy demand: Social practices, doing-places and setting. Energy Research and Social Science, 36, 21–29. doi:10.1016/j.erss.2017.09.032. Hupfeld, A., & Rodden, T. (2012). Laying the table for HCI: Uncovering ecologies of domestic food consumption. In Proceedings of the SIG conference on human factors in computing systems (pp. 119–128). May 05–10, 2012, Austin, Texas: ACM. doi:10.1145/2207676.2207694. Jack, T. (2013). Nobody was dirty: Intervening in inconspicuous consumption of laundry routines. Journal of Consumer Culture, 13(3), 406–421. doi:10.1177/1469540513485272. Jackson, T. (2008). Challenges for sustainable consumption policy. In T.  Jackson. (Ed.), The  Earthscan reader in sustainable consumption (pp. 109–129). London: Earthscan. doi:10.1111/j.1530–9290.2008.00023.x. Kalbar, P. P., Birkved, M., Hauschile, M., Kabin, S., & Elsborg Nygaard, S. (2018). Environmental impact of urban consumption patterns: Drivers and focus points. Resources, Conservation and Recycling, 137, 260–269. doi:10.1016/j.resconrec.2018.06.019. Kalbar, P.  P., Birkved, M., Kabins, S., Nygaard, S.  E. (2016). Personal-metabolism (PM) coupled with life cycle assessment (LCA) model: Danish case study. Environment International, 91, 168–179. doi:10.1016/j.envint.2016.02.032. Kappel, K., & Grechenig, T. (2009). Show-me: Water consumption at a glance to promote water conservation in the shower. 
In Proceedings of the 4th international conference on ­persuasive technology (Art. 26). April 26–29, Claremont, California. ACM. doi:10.1145/ 1541948.1541984. Katz-Gerro, T., Greenspan, I., Handy, F., & Hoon-Young, L. (2017). The relationship between value types and environmental behaviour in four countries: Universalism, benevolence,

214   Nicola Green ET AL. conformity and biospheric values revisited. Environmental Values, 26(2), 223–249. doi:10.31 97/096327117X14847335385599. Kenway, S. J., Lant, P. A., Priestley, A., & Daniels, P. (2011). The connection between water and energy in cities: A review. Water Science and Technology. 63, 1983–1990. doi:10.2166/ wst.2011.070. Kuijer, L. (2017). Splashing: The iterative development of a novel type of personal washing. In D. Keyson., O. Guerra-Santin, & D. Lockton (Eds.), Living labs: Design and assessment of sustainable living (pp. 63–74). Switzerland: Springer. doi:10.1007/978-3-319-33,527-8. Kuznetsov, S., & Paulos, E. (2010). UpStream: Motivating water conservation with low-cost water flow sensing and persuasive displays. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 1851–1860). April 10–15, 2010, Atlanta, Georgia. ACM. doi:10.1145/1753326.1753604. Larrabee Sønderlund, A., Smith, J. R., Hutton, C., & Kapelan, Z. (2014). Using smart meters for household water consumption feedback: Knowns and unknowns. Procedia Engineering. 89, 990–997. doi: 10.1016/j.proeng.2014.11.216. Latour, B. (2000). When things strike back: A possible contribution of “science studies” to the social sciences. British Journal of Sociology, 51(1), 107–123. Liu, A., Giurco, D., & Mukheibir, P. (2015). Motivating metrics for household water-use feedback. Resources, Conservation and Recycling, 103, 29–46. doi:10.1016/j.resconrec.2015.05.008. Maheshwari, B., Purohit, R., Malano, H., Singh, V. P., & Amerasinghe, P. (Eds.) (2014). The security of water, food, energy and liveability of cities: Challenges and opportunities for periurban futures. Dordrecht: Springer. Mankoff, J. C., Blevis, E., Borning, A., Friedman, B., Fussell, S. R., Hasbrouck, J., Woodruff, A., & Sengers, P. (2007). Environmental sustainability and interaction. In Proceedings of the SIGGCHI EA (pp. 2121–2124). April 28-May 03 2007, San Jose, CA. ACM. doi:10.1145/ 1240866.1240963. McGrane, S.  J., Acuto, M., Artioli, F., Chen, P., Comber, R., Cottee, J., Farr-Wharton, G., Green, N., Helfgott, A., Larcom, S., McCann, J., O’Reilly, P., Salmoral, G., Scott, M., Todman, L.  C., van Gevelt, T., & Yan, X. (2018). Scaling the nexus: Towards integrated frameworks for analysing water, energy and food. The Geographical Journal, 185(4) 1–13. doi: 10.1111.geoj.12256. Miller, D. (1998). Why some things matter. In D. Miller (Ed.), Material cultures: Why some things matter (pp. 3–21). Chicago, IL: University of Chicago Press. Mitchell, V., Leder Mackley, K., Pink, S., Escobar-Tello, C., Wilson, G. T., & Bhamra, T. (2015). Situating digital interventions: Mixed methods for HCI research in the home. Interacting with Computers, 27(1), 3–12. doi:10.1093/iwc/iwu034 Moroşanu, R. (2016). An ethnography of household energy demand: Everyday temporalities of  digital media usage. New York: Palgrave Macmillan, Springer Nature. doi:10.1057/ 978-1-137-59,341-2. Murtagh, N., Gatersleben, B., & Uzzell, D. (2014). Difference in energy behaviour and conservation between and within households with electricity monitors. PLoS ONE, 9(3), e92019, 1–12. doi:10.1371/journal.pone.0092019. Murtagh, N., Gatersleben, B., Cowen, L., & Uzzell, D. (2015). Does perception of automation undermine pro-environmental behaviour? Findings from three everyday settings. Journal of Environmental Psychology, 42, 139–148. Ng, K. H., Shipp, V., Mortier, R., Benford, S., Flintham, M., & Rodden, T. (2015). 
Understanding food consumption lifecycles using wearable cameras. Personal and Ubiquitous Computing, 19(7), 1183–1195. doi:10.1007/s00779-015-0871-y.

A Digital Nexus: Sustainable HCI and Resource Consumption   215 Ofwat. (2011). The use of revealed customer behaviour in future price limits: Final report. Cascade Consultancy, Economics for the Environment Consultancy Ltd. https://www. ofwat.gov.uk/wp-content/uploads/2015/11/rpt_com_201105eftec_casc_reveal.pdf. Ofwat. (2018). The long term potential for deep reduction in household water demand. Artesia Consulting. https://www.ofwat.gov.uk/wp-content/uploads/2018/05/The-long-term-potentialfor-deep-reductions-in-household-water-demand-report-by-Artesia-Consulting.pdf. Oikonomou, V., Becchis, F., Steg, L., & Russolillo, D. (2009). Energy saving and energy efficiency: Concepts for policy making. Energy Policy, 37, 4787–4796. Oliveira, L., Mitchell, V., & May, A. (2016). Reducing temporal tensions as a strategy to promote sustainable behaviours. Computers in Human Behavior, 62, 303–315. doi: 10.1016/j. chb.2016.04.004. Owen, L., Bramley, H., & Tocock, J. (2009). Public understanding of sustainable water use in the home. Final Report to the Department for Environment, Food and Rural Affairs. Synovate, DEFRA: London. Paddock, J. (2017). Household consumption and environmental change: Rethinking the policy problem through narratives of food practice. Journal of Consumer Culture, 17(1), 122–139. doi:10.1177/1469540515586869. Palmer, J., & Terry, N. (2014). Powering the nation 2: Electricity use in homes and how to reduce it. Department of Energy and Climate Change (DECC), Department for Environment, Food and Rural Affairs (DEFRA). https://assets.publishing.service.gov.uk/government/ uploads/system/uploads/attachment_data/file/325741/Powering_the_Nation_2_260614. pdf. Pierce, J., Odom, W., & Blevis, E. (2008). Energy aware dwelling: A critical survey of interaction design for eco-visualizations. In Proceedings of the 20th Australasian conference on computer-human interaction: Designing for habitus and habitat (pp. 1–8). December 08–12, Cairns, Australia. ACM. doi:10.1145/1517744.1517746. Pink, S. (2011). Ethnography of the invisible: Energy in the multi-sensory home. Enthnologia Europeae: Journal of European Ethnology, 41(1), 117–128. Pink, S., & Leder Mackley, K. (2012). Video and a sense of the invisible: Approaching domestic energy consumption through the sensory home. Sociological Research Online, 17(1, 3). http://www.socresonline.org.uk/17/1/3.html. doi: 10.5153/sro.2583. Pink, S., Mackley, K., Mitchell, V., Hanratty, M., Escobar-Tello, C., Bhamra, T., & Moroşanu, R. (2013). Applying the lens of sensory ethnography to sustainable HCI. ACM Transactions on Computer-Human Interaction, 20(4), 1–18. Polanyi, K. (1944/1957). The great transformation. Boston, MA: Beacon Press. Pullinger, M., Browne, A., Anderson, B., & Medd, W. (2013). Patterns of water: The water related practices of households in southern England, and their influence on water consumption and demand management. Final report of the ARCC-Water/SPRG Patterns of Water projects March 2013. http://www.sprg.ac.uk/uploads/patterns-of-water-final-report.pdf. Riche, Y., Dodge, J., & Metoyer, R. A. (2010). Studying always-on electricity feedback in the home. In Proceedings of the SIGCHI conference on human factors in computing systems: Home eco behavior (pp. 1995–1998). April 10–15, 2010, Atlanta, GA. ACM. Rouillard, J. (2012). The pervasive fridge. A smart computer system against uneaten food loss. In ICONS 2012, The seventh international conference on systems (pp. 135–140). Saint-Gilles–Reunion. Sahakian, M., & Wilhite, H. (2014). 
Making practice theory practicable: Towards more sustainable forms of consumption. Journal of Consumer Culture, 14(1), 25–44. doi:10.1177/ 1469540513505607.

216   Nicola Green ET AL. Schwartz, T., Stevens, G., Ramirez, L., & Wulf, V. (2013). Uncovering practices of marking energy consumption accountable: A phenomenological inquiry. ACM Transactions in Computer-Human Interaction, 20(2), Art. 12, 1–12. doi:10.1145/2463579.2463583. Seyfang, G. (2006). Ecological citizenship and sustainable consumption: Examining local organic food networks. Journal of Rural Studies, 22(4), 383–395. doi:10.1016/j.jrurstud.2006.01.003. Shove, E. (2003). Comfort, cleanliness and convenience: The social organization of normality. Oxford: Berg. Shove, E., Patzner, M. & Watson, M. (2012). The dynamics of social practice: Everyday life and how it changes. London: Sage. Shove, E., & Spurling, N. (Eds.). (2013). Sustainable practices: Social theory and climate change. London: Routledge. Shove, E., & Walker, G. (2014). What is energy for?: social practice and energy demand. Theory, Culture and Society, 31(5), 41-58. doi:10.1177/0263276414536746. Shove, E., & Warde, A. (2002). Inconspicuous consumption: The sociology of consumption, lifestyles and the environment. In R.  Dunlap, F.  Buttel, P.  Dickens, & A.  Gijswijt (Eds.), Sociological theory and the environment: Classical foundations, contemporary insights (pp. 230–241). Lanham, MA: Rowman and Littlefield. Smajgl, A., Ward, J., & Pluschke, L. (2016). The water-food-energy Nexus—Realising a new paradigm. Journal of Hydrology, 533, 533–540. doi:10.1016/j.hydrol.2015.12.033. Sofoulis, Z. (2005). Big water, everyday water: A socio-technical perspective. Continuum: Journal of Media and Cultural Studies, 19(4), 445–463. doi: 10.1080/10304310500322685. Sofoulis, Z., & Williams, C. (2008). From pushing atoms to growing networks: Cultural innovation and co-evolution in urban water conservation. Social Alternatives, 27(3), 50–57. Southerton, D. (2013). Habits, routines and temporalities of consumption: From individual behaviours to the reproduction of everyday practices. Time and Society, 22(3), 335–355. doi: 10.1177/0961463X12464228. Southerton, D., & Shove, E. (2000). Defrosting the freezer: From novelty to convenience: A  narrative of normalization. Journal of Material Culture, 5(3), 301–319. doi: 10.1177/ 135918350000500303. Spaargaren, G. (2003). Sustainable consumption: A theoretical and environmental policy ­perspective. Society and Natural Resources, 16, 687–701. doi:10.1080/08941920309192. Spaargaren, G. (2011). Theories of practices: Agency, technology and culture. Exploring the relevance of practice theories for the governance of sustainable consumption in the new world-order. Global Environmental Change, 21, 813–822. Spaargaren, G., & Oosterveer, P. (2010). Citizen-consumers as agents of change in globalizing modernity: The case of sustainable consumption. Sustainability, 2(7), 1887–1908. doi:10.3390/ su2071887. Spaargaren, G., & Van Vliet, B. (2000). Lifestyles, consumption and the environment: The ecological modernization of domestic consumption. Environmental Politics, 9(1), 50–76. doi:10.1080/09644010008414512 Srinivasan, V., Stankovic, J., & Whitehouse, K. (2011). Watersense: Water flow disaggregation using motion sensors. In Proceedings of the third ACM workshop on embedded sensing systems for energy-efficiency in buildings (pp. 19–24). ACM. Stern, P. C. (2000). New environmental theories: Toward a coherent theory of environmentally significant behavior. Journal of Social Issues, 56(3), 407–424. doi:10.1111/0022–4537.00175. Strengers, Y. (2011). 
Negotiating everyday life: The role of energy and water consumption feedback. Journal of Consumer Culture, 11(3), 319–338. doi:10.1177/1469540511417994.

A Digital Nexus: Sustainable HCI and Resource Consumption   217 Strengers, Y. (2012). Peak electricity demand and social practice theories: Reframing the role  of change agents in the energy sector. Energy Policy, 44, 226–234. doi:10.1016/j. enpol.2012.01.046. Strengers, Y., & Maller, C. (2017). Adapting to “extreme” weather: Mobile practice memories of keeping warm and cool as a climate change adaptation strategy. Environment and Planning A, 49(6), 1432–1450. doi:10.1177/0308518X17694029. Strengers, Y., Nicholls, L., & Maller, C. (2014). Curious energy consumers: Humans and non-humans in assemblages of household practice. Journal of Consumer Culture, 16(3), 761–780. doi:10.1177/1469540514536194. Suchman, L. (2006). Human–machine reconfigurations: Plans and situated actions (2nd ed.). Cambridge: Cambridge University Press. Taherzadeh, O., Bithell, M., & Richards, K. (2018). When defining boundaries for nexus analysis, let the data speak. Resources, Conservation and Recycling, 137, 314–315. doi:10.1016/j. resconrec.2018.06.012. Taylor, A. (2015). After interaction. interactions (pp. 49–53). September-October. ACM. Thieme, A., Comber, R., Miebach, J., Weeden, J., Kraemer, N., Lawson, S., & Olivier, P. (2012). We’ve bin watching you: Designing for reflection and social persuasion to promote sustainable lifestyles. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 2337–2346). May 05–10, 2012, Austin, Texas. ACM. doi:10.1145/2207676.2208394. Tirado Herrero, S., Nicholls, L., & Strengers, Y. (2018). Smart home technologies in everyday life: Do they address key energy challenges in households? Current Opinion in Environmental Sustainability, 31, 65–70. doi:10.1016/j.cosust.2017.12.001. United Nations Department of Economic and Social Affairs. (2017a). World Population Prospects: The 2017 Revision. Population Division, United Nations. https://www.un.org/ development/desa/publications/world-population-prospects-the-2017-revision.html. United Nations Department of Economic and Social Affairs. (2017b). Sustainable Development Goals. Sustainable Development Knowledge Platform. https://sustainabledevelopment. un.org/. United Nations Department of Economic and Social Affairs (2018). 2018 Revision of World Urbanization Prospects. https://www.un.org/development/desa/en/news/population/2018revision-of-world-urbanization-prospects.html. Uzzell, D., Muckle, R., Jackson, T., Ogden, J., Barnett, J., Gatersleben, B., Hegarty, P., Papathanasopoulou, E. (2006). Choice matters: Alternative approaches to encourage sustainable consumption and production. Project Report. Report to the Department of the Environment, Food and Rural Affairs (DEFRA). Environmental Psychology Research Group, University of Surrey, London: UK. http://epubs.surrey.ac.uk/729053/. Van Vliet, B., Chappells, H., & Shove, E. (2005). Infrastructures of consumption: Environmental innovation in the utility industries. Oxford: Earthscan. Vannini, P., & Taggart, J. (2016). Onerous consumption: The alternative hedonism of off-grid domestic water use. Journal of Consumer Culture, 16(1), 80–100. doi:10.1177/1469540513509642. Vassileva, I., Dahlquist, E., Wallin, F., & Campillo, J. (2013). Energy consumption feedback devices’ impact evaluation on domestic energy use. Applied Energy, 106, 314–320. doi:10.1016/ j.apevergy.2013.01.059. Warde, A. (2013). Sustainable practices: Social theory and climate change. London: Routledge. Warde, A. (2014). After taste: Culture, consumption and theories of practice. 
Journal of Consumer Culture, 14(3), 279–303. doi: 10.1177/1469540514547828.

218   Nicola Green ET AL. Watson, S. (2017). Consuming water smartly: The significance of socio-cultural differences to water-saving initiatives. Local Environment, 22(10), 1237–1251. doi:10.1080/13549839.2017.133 4143. Wood, G., & Newborough, M. (2003). Dynamic energy-consumption indicators for domestic appliances: Environment, behaviour and design. Energy and Buildings, 35(8), 821–841. Woodruff, A. & Mankoff, J. (2009). Environmental Sustainability. IEEE Pervasive Computing, 8(1), 18-21. WRAP (Waste & Resources Action Programme). (2017). Household food waste in the UK, 2015. http://www.wrap.org.uk/sites/files/wrap/Household_food_waste_in_the_UK_2015_ Report.pdf. Wutich, A. (2009). Estimating household water use: A comparison of diary, prompted recall and free recall methods. Field Methods, 21(1), 49–68. doi:10.1177/1525822X08325673.

Section 3

COMMUNICATION AND RELATIONSHIPS

Chapter 8

ESRC Review: Communication and Relationships

Simeon J. Yates, Rich Ling, Laura Robinson, Catherine Brooks, Adam Joinson, Monica Whitty, and Elinor Carmi

Introduction

This chapter explores the outcomes of the literature review and expert Delphi review process for the Communication and Relationships domain. As with the other review chapters, the goal is not to work through a large number of examples from the literature. Instead, building on the methods described in chapter 2, we will first set out the results of the digital humanities-based analyses of the literature, highlighting the major topics, themes, and concepts within the literature, providing a few general examples. These are not intended to be the “most important” examples from the literature but rather simply indicative of the types of work. This is then followed by the presentation of the content analysis that sought to identify the key theories and methods in use within the literature. Next, we outline the results from the Delphi review of experts. This concludes with the key questions, topics, and challenges we identified, comparing these to the results from the literature work. The last section presents the recommendations for areas of future study. As a reminder, the initial scoping question for this area of work was: “How are our relationships being shaped and sustained in and between various domains, including family and work?”

Initial Comments

The original ESRC Domain question was criticized in the Delphi process for being too broad and ambiguous. Importantly, it was asked whether it constitutes a viable stand-alone question, since communicating and building relationships necessarily forms a pivotal

strand of nearly all activity in relation to “ways of being” in a digital society. Therefore, looking for one very specific starting point was not seen as straightforward, especially given the multiple ways in which relationships are expressed. Consequently, both social behavior and results from research can vary as the context of interests, conditions, and constraints ebb and flow with changing digital technology. As a result, the analysis put the initial ESRC scoping question to one side and utilized those derived from the Delphi first round, shown in Table 8.1.

Table 8.1  Scoping Questions (question categories and example questions)

Digital literacies: What literacies are required for effective communication using digital technologies? Should these literacies be taught, or can we assume that they develop organically? To what extent does an individual’s digital legacy and digital capability affect their interactions with others in work and leisure?

Norms and values: What normative pressures do people experience related to relationships shaped and sustained by digital technologies? What is the new normal for relationships now they are shaped and sustained by digital technologies across multiple domains?

Platform affordances: What are the platform affordances of digital technology that construct or constrain relationships? How do particular platforms affect various kinds of relationships: social, sexual, familial, collegial, activism, fandom, etc.?

Quality of relationships and communication: How does communication via digital technologies facilitate the quantity and quality of our relationships? How are our relationships being shaped, sustained, and diminished by digital technologies, in and between the domains of work and family?

Relationship management: How are family, friend, and work relationships shaped by, and reshaping, the trajectories that new digital technologies are taking? How are our friendships being shaped, sustained, and diminished by digital technologies?

Of all the domains examined, the question of how media and technologies have affected relationships and communication is one of the oldest, going back to Classical Greek debates over the value of orality and literacy. In the context of digital media, much early research in the 1980s and 1990s sought to understand how interaction without face-to-face presence would function. This work has its roots in prior research comparing social presence in various pre-digital media (e.g., Short et al., 1976; Rutter, 1987). These ideas were taken up in relation to digital media in the 1980s around ideas of “cuelessness” and formed the foundation of works such as Kiesler, Siegel, and McGuire’s (1984) examination of the effects of “reduced social cues.” Much of this work had a strongly social-psychological focus around group behavior. The culmination and, to a large extent, rejection of this line of work can be found in the SIDE model of online group behavior (see: Postmes et al., 1998; Spears et al., 2002) that is also discussed in chapter 14. This is also the basis of more recent work on deception and “anonymity” in online interaction. Some clear parallels can be drawn between the “flaming” behavior identified by Kiesler, Siegel, and McGuire and more recent work on contemporary antisocial behavior online (e.g., trolling). Other work examined the content of interaction, such as the examination of socio-emotional content in computer conferencing by Rice and Love (1987), and of how computer-mediated communication could foster organizational innovation (Rice, 1987, extending the Short et al., 1976 work on social presence), which contributed early on to the rejection of the claim that computer-mediated communication is necessarily cueless and therefore generates de-regulated mediated content, such as flaming and depersonalization. Separately, socio-linguistic work examined the textual and linguistic differences between speech, writing, and online interaction (e.g., Herring, 1996; Yates, 1996).

Literature Analysis

The literature analysis was designed to create two analytic outcomes. First, the goal was to identify key topics within the existing literature. This would allow for a comparison with areas of future importance identified by the Delphi review. Second, we applied content analysis of the literature to explore the predominance of specific theories, methods, and approaches within the domain.1

Topics

As noted in chapter 2, the literature data were subjected to two analyses. The first round of collected literature was analyzed to create concept pairs and trios, while the combined first and second rounds of literature were analyzed to identify key topic clusters. The results of these two approaches were then compared. The most common concepts identified by the Round 1 literature analysis are listed in Table 8.2. These represent the concepts covering approximately 2% or more of the identified cases. Table 8.3 lists concept pairings. All the literature collected from both rounds was then analyzed using Wordstat. Wordstat identified 21 concepts, which are presented in Table 8.4. These map closely to the concept pairings identified in the above analysis. As with the other domains, we can see a shift in focus within the literature between 2000 and 2016 (shown in Figures 8.1 and 8.2). The broad comparison of change over time in the frequency of concept pairs associated with the subject “communication” based on the smaller curated literature shows considerable differences between the periods 2000–2004 and 2012–2016. Early on, the most frequent pairs involve relationships,

Table 8.2  Analysis Concepts, Ranked (percent of cases)

Friend  9.9
Media  8.2
Pair  8.0
Group  4.3
Adolescent  4.3
Phone  4.0
Communication  3.9
Relationship  2.5
Time  2.5
Medium  2.3
Level  2.1
Teen  2.1
Life  2.0
Parent  1.9

Table 8.3  Concept Pairings: Main and Secondary Concepts (percent of cases)

Adolescent (4.3): Adult 2.0; Life 1.5; Realism 0.3; Uncertainty 0.5
Social-media (8.2): Communication 0.9; Group 0.4; Information 0.8; Interaction 0.4; Medium 0.9; Member 0.6; Pair 0.9; Relationship 0.9; Student 0.5; Tie 0.8; Work 1.0
Friend (9.9): Friendship 2.4; Instant 0.3; Judgment 0.5; Newcomer 0.7; Pair 1.3; Photo 1.4; Post 1.3; Tie 2.1
Group (4.3): Identification 1.2; In-Group 0.8; Out-Group 0.7; Poster 0.4; Sip 0.5; Socialization 0.7
Pair (8.0): Percentage 0.9; Rate 1.3; Relation 1.3; Sociability 1.1; Status 1.1; Total 0.6; Week 0.4; Whole 0.4; Writing 0.9
Parent (1.9): Phone 1.9
Communication (3.9): Controllability 0.7; Correspondent 1.0; Monograph 0.9; Propinquity 0.9; Sip 0.4
Level (2.1): Move 0.7; Pair 0.9; Var 0.6
Phone (4.0): Plan 1.1; Punishment 0.4; Someone 0.9; Subgroup 0.5; Teens 1.0
Life (2.0): Pew 1.4; Writing 0.6
Relationship (2.5): Root 0.6; Work 1.9
Medium (2.3): Multitasking 0.3; Richness 1.5; Storytelling 0.5
Teen (2.1): Twitter 1.2; Voice 1.0

Table 8.4  Wordstat Analysis of Topics (keywords; eigenvalue; frequency; cases; % cases)

Social network platforms: SOCIAL; COMMUN; THI; AR; INTERACT; PEOPL; SPACE; INFORM; NETWORK; THEI; SYSTEM (1.61; 108,173; 569; 97.1)
Facebook: FACEBOOK; ELLISON; SITE; NETWORK; FRIEND; SN; SNSS; BOYD; CAPIT; SOCIAL (1.91; 44,414; 559; 95.9)
Measurement: MEASUR; VARIABL; WA; SAMPL; ITEM; SURVEI; DATA (1.64; 28,226; 552; 94.7)
Twitter: TWEET; TWITTER; HASHTAG; RETWEET; USER; REPLI; API; PLATFORM; ACCOUNT; CHAPTER (11.88; 28,460; 537; 92.1)
Higher education: STUDENT; COLLEG; TEACHER; EDUC; SCHOOL; LEARN (1.73; 13,949; 521; 89.4)
CMC vs. FTF: CMC; FTF; CUE; WALTHER; PARTNER; INTERACT (2.26; 13,697; 511; 87.7)
Storytelling: CCM; STORYTEL; CREATIV; AUSTRALIAN; AUSTRALIA; ART; DIGIT; PROJECT (2.39; 14,149; 507; 87.0)
Nations and countries: NATION; EUROPEAN; COUNTRI; EUROP; POLIT; GLOBAL (1.70; 13,864; 506; 86.8)
Gender and language: WOMEN; MEN; MALE; FEMAL; GENDER; LINGUIST; FEMINIST; LANGUAG; SEX; SPEECH (2.69; 16,931; 503; 86.3)
SNA: PAIR; CERIS; TIE; MULTIPLEX; FREQUENC; FACULTI; TI; FRIENDSHIP; EXCHANG; EMPLOYE (2.96; 9,430; 498; 85.4)
Corporations: COMPANI; MARKET; BUSI; CORPOR; CONSUM; SERVIC; ADVERTIS (2.01; 11,752; 490; 84.1)
Critical theory: MARX; LABOUR; FUCH; DIALECT; LUKÁC; IDEOLOGI; ECONOMI; CAPIT; CRITIC; CLASS (3.37; 11,067; 464; 79.6)
Privacy: ION; PRIVACI; ER; AL; PROTECT (1.49; 7,717; 452; 77.5)
Health care: CARE; PATIENT; TELECONSULT; HEALTH; HOME (1.57; 5,937; 421; 72.2)
Blogging: BLOG; BLOGGER; READER; COMMENT (1.81; 5,106; 403; 69.1)
Media consumption: FILM; CINEMA; NARR; IMAG; GAME; PLAYER; VIDEO; AVATAR (1.46; 5,557; 340; 58.3)
Adolescents and sexuality: ADOLESC; SEXUAL; EXPOSUR; SEIM; SEX (3.71; 8,015; 326; 55.9)
Social club: CLUB; FAN; SPORT; TEAM (1.45; 1,305; 194; 33.3)
Children and families: BOI; GIRL (1.40; 2,285; 167; 28.6)
Old media: TELEVIS; AUDIENC; WATCH; TV; BROADCAST; VIEWER; MEDIA (1.94; 22,518; 523; 89.7)
Mobile phone: PHONE; CELL; TEEN; MOBIL (1.88; 8,567; 421; 72.2)


Figure 8.1  Communication 2000–2004: Most frequent concept pairs. Note: Bubble chart showing frequency of the top 50 concept pairs, based on concept modeling (described in Chapter 2) within the Domain for 2000–2004. The diameter of each circle reflects the frequency of the concept pair.

pair/tie/link, communication, medium, and work. Less frequently included were terms such as information, use, Internet, and exchange. Thus the focus was on relationships or network ties involving the process of communication, the medium of communication, and the context of the relationship (work and information). By 2012–2016, there was much less emphasis on general relationships and specific links, and more on the specific medium of Facebook and related terms such as user, network, and friend. This obviously reflects, in terms of more recent research, the commercial and social dominance of the new platforms (especially Facebook) in western societies. It may also point to the fact that data from these platforms are easily harvested, along with the fact that many studies appear to be of adolescents and college students—the concepts of “teen” and “college student” are also notable in the analysis. Within this there is a distinct shift to approaches informed by social network analysis, and this domain is one where this approach is highlighted in the analysis. Contexts shifted from work to college, the family, students,


Figure 8.2  Communication 2012–2016: Most frequent concept pairs. Note: Bubble chart showing frequency of the top 50 concept pairs, based on concept modeling (described in Chapter 2) within the Domain for 2012–2016. The diameter of each circle reflects the frequency of the concept pair, with the most frequent pair beginning in the center.

teens, and patient care. The mediated social network became central. Overall there appears to be a shift from studies that may have sought to generalize about digital media—computer-mediated communication—to ones with a strong “platform focus.” The challenges of a “platform” focus are discussed in chapter 25. In examining the papers and publications collected for this domain we found that the identified themes and topics consistently cross-cut and overlap. A paper on Facebook would also likely raise issues about young adults, or a social network analysis would address issues of community. Underlying much of the literature are comparisons with face-to-face interaction and, occasionally, other media (writing, TV, mass media). Such comparative work goes back to very early studies of digital media (computer-mediated communication) use discussed above. We do not intend to review this work here, though we would note that such analyses are important in comparatively grounding studies in relation to existing

media practices. We would also note that many recent studies, and the examples examined below, are more likely to explore the use of specific digital media as part of a citizen’s or user’s “suite” of communicative and media practices. We have therefore pulled out three themes as starting points for the presentation of example literature in this domain where the cross-cutting overlaps can be seen:
• Social media platforms
• Young people and adolescents
• Social network analysis

Social media “platform” studies. Taking Twitter and Facebook as examples of new platforms that have become the context of study, we find a range of different foci in the research literature. Many of these cross-cut the other themes identified in this domain but also reflect broader social and media discussion. Much of the media coverage of Twitter, Facebook, and other social media platforms has raised concerns about the level and extent of data and information sharing by young people. Such concerns are also reflected in research. For example, Madden et al. (2013) examined the way teens share information on social media. In fact, according to Madden et al.’s findings, few teens embrace a fully public approach to social media. Instead, they take an array of steps to restrict and prune their profiles, and their patterns of reputation management on social media vary greatly according to their gender and network size. As with many studies of new platforms, there is a need to understand the basic features and demographics of their use. At the time of writing, Madden et al. noted an increase in the use of Twitter by teens from 16% in 2011 to 24% in 2013. They also noted that the median number of friends on Facebook was 300 and the median number of Twitter followers was 79. Their research found that teenagers were sharing more information about themselves on such sites than they had in the past on other platforms and media. The research looked at five different types of personal information sharing (examples of personal photos, or details of school, city, email, and phone information), comparing 2006 and 2012. All five types significantly increased. Nonetheless, many (60%) teenage Facebook users keep their profiles private, and a majority expressed high levels of confidence in managing settings. At the same time the research found that they have limited concerns about the use of their data by third parties. Overall the respondents were found to utilize a range of methods to manage their presentation of self and sharing of data online, including the deletion of other people from their networks. The focus group discussions undertaken by the researchers found that respondents had a waning enthusiasm for Facebook for a range of reasons. These included a dislike for the increased presence of adults on the platform, pressures to be present, and issues around excessive and demanding levels of posting. Yet they kept using the platform because participation was an important part of overall teenage socializing (Madden et al., 2013). This trend has been seen to continue in more recent studies (e.g., Anderson & Jiang, 2018). This change

points to three key topics researchers should emphasize when exploring the impact of specific platforms:
• Document the social contexts, demographics, interactional behaviors, and general uses over time.
• Understand their use in the context of other digital and non-digital interactions and contexts.
• Explore the broader underlying issues, in this case sharing of personal information and presentation of self (cf. Goffman, 1959/2002), that have broader social science importance but which may be articulated in specific ways in specific platforms.

Similarly, Marwick and boyd (2014) examine how teenagers negotiate content in social media. They argue that the dynamics of sites such as Facebook have forced teens to alter their conceptions of privacy to account for the networked nature of social media. The researchers draw on examples from a large-scale ethnographic study consisting of 166 semi-structured interviews with teenagers and participant observation conducted across 17 US states to explore what they refer to as “networked privacy.” They argue that teens conceive of privacy as the ability to control a particular situation that happens in a particular place, stating that, To manage an environment where information is easily reproduced and broadcast, we find that many teenagers conceptualize privacy as an ability to control their situation, including their environment, how they are perceived, and the information that they share.  (p. 1056)

To achieve privacy, teens therefore use various strategies to gain control over the way their information is distributed. Online privacy therefore becomes context-specific and changes over time. Marwick and boyd note that, How people achieve privacy depends not solely on their ability to navigate technology, but requires them to fully understand the context in which they are operating, influence others’ behaviors, shape who can interpret what information, and possess the knowledge and skills necessary to directly affect how information flows and is interpreted within that context.  (pp. 1062–1063)

In addition, they argue that teens developed tactics to regulate who can access the information they share online, for example, encoding the content itself in order to limit the audience (rather than using the social media affordances of privacy settings). Thus, . . . achieving privacy requires that people have an understanding of and influence in shaping the context in which information is being interpreted. This can be done by co-constructing the architecture of the systems, or it can be done by embedding meaning and context into the content itself.  (p. 1063)

Overlapping with the social network analysis theme, literature exploring specific social media often focuses on the nature of relationships in these networks (what it means to link as a contact, friend, or follower, etc.). For example, Mesch et al. (2012) examine the effect of individual, relational (e.g., tie homophily, relationship type, tie duration, and tie closeness), and cultural variables on communication via instant messaging (IM). The study focuses on the frequency of interaction among users from Israel and Canada. The researchers collected data from 785 participants between 2005 and 2006. Participants in Israel completed a paper-and-pencil questionnaire in Hebrew, and participants in Canada completed an online survey in English. Their findings show that in both countries, IM was used primarily to keep in touch with close friends. Hours of daily IM use were positively associated with frequency of communication via IM in both countries. Relationship type predicted the frequency of communication via IM; for example, people messaged their romantic partner more frequently than a close friend. Mesch et al. argued that relationship variables are key to understanding IM behavior (rather than ones relating to technology features). As they note, The most salient result of this study is the explanatory power of relational variables in the understanding of the use and content of IM. The current study provides strong support for the argument that online communication is used primarily, but not exclusively, to maintain existing ties rather than to develop new ties.  (p. 750)

They also identified potential social and cultural variations, finding that, Gender similarity was not associated with IM topic multiplexity in Canada, but had a negative association in Israel. This finding suggests that in Canada same-sex and opposite-sex pairs discuss diverse topics to the same extent, whereas in Israel same-sex pairs are less likely than opposite-sex pairs to discuss a diverse set of topics.  (p. 750)

They also found that IM had a specific role—mainly coordinating social activity—in relationships and interactions, irrespective of the length of that relationship: . . . it seems that regardless of relationship duration, IM is used more for instrumental purposes (i.e., coordinating activities and scheduling meetings) than predictive purposes (i.e., companionship and social support). This distinction in use may explain the non-significant effect of relationship duration on frequency of communication.  (p. 751)

Overall, though, Mesch et al. found considerable similarities between the two groups of users in Israel and Canada, stating that, The results show that young people in both countries have strikingly similar patterns of usage. Participants in both countries indicated that their primary communication partners to be close friends, and family members. Contacts who met online were rare

in both countries, suggesting that IM is used to maintain existing relationships rather than to generate new online ties.  (p. 753)

This result reminds us that much digital media use is embedded in the everyday lives of people and not in some separate “cyberspace” world. This does not mean that there are no contexts that function primarily or solely online; rather, it points out that digital media are now well embedded into the management of everyday social interaction. Work on relationships in digital media and digital platforms also often cuts into issues of community (see chapter 14). For example, Gruzd et al. (2011) examine the concept of community on Twitter using Benedict Anderson’s idea of “imagined communities” (1983). In addition to relying on Anderson’s work, they also apply two other notions of online communities: Jones’s (1997) notion of “virtual settlement” and McMillan and Chavis’s (1986) compilation of what constitutes a “sense of community.” In order to examine this, the study used one of the researcher’s own Twitter accounts and examined his network by using Twitter’s API to automatically retrieve a list of his followers and sources and to also determine who follows whom. So as to trace changes in the Twitter network of mutual followers, the researchers collected these data twice: in August 2009 and February 2010. The researchers utilized a mix of social network analysis and content analysis of the messages. They argued that, An ‘imagined’ community on Twitter is dual-faceted. It is at once both collective and personal. It is collective in the sense that all [users] belong to the worldwide set of [users] who understand Twitter’s norms, language, techniques, and governing structure.  (Gruzd et al., 2011, p. 1312)

They noted that Twitter communities formed around “high centers” that include “ . . . popular individuals, celebrities, or organizations such as media companies. Yet even less popular individuals on Twitter can play the role of local high centers of predominantly mutual networks” (p. 1313). Taking a sociological view of the results, Gruzd et al. argued that, Twitter turns out to be an implementation of the cross-cutting connectivity between social circles that 19th-century sociologist Émile Durkheim (1893/1993) argued was the key to modern solidarity.  (Gruzd et al., 2011, p. 1314)
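The mutual-follower approach that Gruzd et al. describe can be illustrated with a minimal sketch. The code below is purely illustrative (it is not their code, and the account names and follower lists are hypothetical): once follower lists have been collected, reciprocal ties can be extracted and the most connected accounts, candidate “high centers,” identified.

```python
# Sketch: deriving a network of mutual (reciprocal) follows from
# already collected follower lists. Accounts and data are hypothetical.
import networkx as nx

# follows[a] = set of accounts that account a follows
follows = {
    "alice": {"bob", "carol", "dana"},
    "bob": {"alice", "carol"},
    "carol": {"alice", "bob"},
    "dana": {"alice"},
}

# Build an undirected graph containing an edge only where both
# accounts follow each other (a mutual tie).
mutual = nx.Graph()
mutual.add_nodes_from(follows)
for a, followees in follows.items():
    for b in followees:
        if a in follows.get(b, set()):
            mutual.add_edge(a, b)

# Accounts with the most mutual ties are candidate "high centers."
centrality = nx.degree_centrality(mutual)
print(sorted(centrality.items(), key=lambda kv: -kv[1]))
```

In a study such as theirs, the resulting graph would then be combined with content analysis of the messages exchanged along these ties.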

In a similar approach, McEwen and Wellman (2013) examine how communities operate in different contexts in light of social media such as Twitter. They argue that groups in such media as Twitter are alternative places for people to connect with each other and that such online interactions are just as real and authentic as offline contact: For the networked individual, ‘community’ is not geospecific but is defined as networks of personal communities that provide sociability, support, information, a sense of belonging, and social identity, managed on and offline using ICTs.  (p. 170)

As with many other studies, they find that Twitter groups are extensions of other social groups or communities—and that only a small proportion of Internet users met someone new online. Thus: These places are just alternate spaces for people of all ages to connect with their friends and peers; technology-enabled interaction fits seamlessly into their everyday lives and complements other practices.  (p. 170)

As a result, social media platforms are not the sole focus of specific relationships; rather, they mark one of many locations where relationship and community building work is done: When the networked individual manages relationships through a wide variety of media, such as email, landline telephone, instant messaging, Facebook, Twitter, mobile phone, and so on, we describe both the relationship and the media as being multiplexed.  (p. 173)

It is clear just from these example studies that social media platforms are key to understanding communication patterns and relationships in a digital age, and that this behavior and individual platforms are not separate from broader social interaction. The papers also cross-cut other domains, especially Community and Identity (chapter 14). We also find that long-standing themes in social science—from presentation of self to community formation—form the underlying basis of the analysis. Young people and adolescents. The use of digital media and its impacts on young people is prevalent in the literature for this domain. The review did not specifically seek to explore the use of digital media by children—this is an area that has been extensively explored in recent years from research and policy perspectives (see Livingstone, 2002; Drotner & Livingstone, 2008; Livingstone & Sefton-Green, 2016). The literature discussed here therefore focuses on adolescents and young adults, and much of this work explores how digital media are utilized in social interaction, relationships, and socialization. For a comprehensive review of how college students manage multiple media for purposes such as relationships, see chapter 9. Such research questions and concerns have a long history in the study of media, digital or traditional, focusing on use by young people and the potential hazards that media may hold for them. These issues have also often been the focus of media debates about adolescent behaviors—including many cases of media “moral panics” (Critcher, 2003). Such debates have also influenced the direction and focus of research questions. For example, prior work has explored the role of media in the socialization of adolescents, with Arnett (1995) noting that: . . . media are part of the process by which adolescents acquire—or resist acquiring—the behaviors and beliefs of the social world, the culture, in which they live.  (Arnett, 1995, p. 525)

Arnett (1995) provides a typology of adolescent media uses, including: entertainment, identity formation, high sensation, coping, and youth culture identification. Exploring

these five uses in relation to adolescent socialization, Arnett notes that media use and consumption differs from other socializing agents such as family, school, community, and the legal system. The key difference is that adolescents have greater control over their media choices than they do over their socialization from these other sources: The independence granted to adolescents in making media choices may contribute to their alienation, as they attempt to sort out the dissonance between the socialization messages in the media they use and the socialization messages promoted by adults in their families, schools, and communities.  (Arnett, 1995, p. 530)

Issues of socialization, media use, and family and community relationships are also all bound up in issues of identity and its expression. Within the context of digital media use this is often explored through the presentation of self online, or through the form and content of interactions via digital platforms. Here again concerns over potential harms as well as benefits of digital media use can be found in both academic research and media coverage. As an example, Valkenburg and Peter (2008) investigate the effects of adolescents’ online identity experiments on their offline social competence and self-conception, with an underlying concern that digital media use might increase social anxiety. They conducted an online survey in 2006 among 1,158 Dutch teens between 10 and 17 years old. They developed a set of measurement scales of offline social competence that included four subscales: initiation, supportiveness, self-disclosure, and assertiveness. Their findings show that adolescents who experimented more with their online identity communicated more often with people of different ages and backgrounds online, and “ . . . although adolescents’ self-concept showed considerable variance, there was no evidence that their level of self-concept unity is affected by engaging in online identity experiments” (Valkenburg & Peter, 2008, pp. 225–226). For some of these adolescents, this experience had positively contributed to their social competence: Although we did not find a positive relationship between social anxiety and online identity experiments, our result did reveal that lonely adolescents significantly more often used the Internet to experiment with their identity than nonlonely adolescents. Lonely adolescents apparently benefit from the relative anonymity of the Internet to learn how to relate to people and to practice their social skills.  (p. 226)

There is an element of “technological determinism” in some of this work, as many studies are formulated around the assumption or hypothesis that the use of digital media will have a direct influence on behaviors, experiences, and outcomes. Very often, though, the picture is quite complex, and non-digital factors (in other words, social and demographic factors) are found to be necessary, and often sufficient, parts of the explanation. Subrahmanyam and Lin (2007) examined the relationship between adolescents’ online activity and their well-being, conducting a survey of 192 adolescents aged 15 to 18. The survey explored their access to and use of the Internet, focusing on loneliness and social support. Overall they found that,

Contrary to our expectations, loneliness was not related to whether participants knew and were familiar with their online partners but was related to participants' gender and their perceived relationship with their online partners. (Subrahmanyam & Lin, 2007, p. 672)

Such results remind us that many key explanatory variables underpinning interaction and relationships via digital media are not “new”; they are based on a whole range of well-known social, psychological, and cultural behaviors and factors. What may be new is the specific manner and form in which digital media are used to support interaction and relationships. For example, for a review of how computer-mediated communication is related to social support, especially during times of transition, see Mikal et al. (2013). Similarly, research has focused on how teenagers and adolescents have appropriated technologies and developed new forms of interaction in digital media. As an example, Greenfield and Subrahmanyam (2003) examined the way participants in an adolescents’ online chatroom adapt to the features of chat to create coherence and distinct registers. Once again, as noted above, the focus is on how digital interaction differs (or not) from face-to-face interaction. In order to examine the strategies that adolescents use to achieve coherence in online chats, Greenfield and Subrahmanyam conducted participant observation in teen chatrooms and analyzed the transcripts of the interactions. The researchers find two main strategies: The strategies for achieving coherence in this environment address two important functions—identifying a conversational partner and determining a relevant response. We suggest that adapting to the demands of online chatrooms uses resources from both oral and written discourse to produce a new register for online chat. (Greenfield & Subrahmanyam, 2003, p. 714)

Many of the strategies used to achieve coherence were found to be similar to those in face-to-face conversation. These include such things as repetition and directly addressing intended conversational partners. There are also media- and channel-specific strategies tied to the technology or the specific norms of the group. There are also coherence behaviors similar to those in face-to-face interaction that are articulated via the constraints of the medium: In addition to specific cues, there are also general judgments of topical relevance, semantic relationship to a prior turn, and knowledge of who is participating in a particular thread at a particular time that must come into play, both for us and for the participants.  (p. 735)

The participants also used a range of textual and visual cues and conventional codes, constructing a distinct register. Use of this register marked them out as “native speakers” of online chatrooms. The visual nature of the online computer medium helps participants to overcome the confusion of multiple overlapping conversations, changing participants, and

spatially and temporally separated conversation threads. Key strategies—such as nickname format, use of numerals, distinctive script, standard graphic format, and slot-filler framework—capitalize on the visual nature of the medium.  (p. 736)

Social network analysis. Throughout the history of the study of interactions via digital media the “networked” nature of the interaction—especially in the context of group interaction—has been a prominent feature. Many early studies of digital interaction focused on aspects of network structure, including power and influence, as well as the management of coherence in networked interaction (e.g., Paolillo, 2001), and how online network links were related to emotional content and reciprocity (Rice, 1982; Rice & Love, 1987). With the rise of “social networking sites” such as Facebook and Twitter (or their various precursors such as MySpace or even Usenet) the nature of social networks has become a key topic for analysis. This has introduced the confusion of a “social network” as a type of digital media with the longstanding idea of a “social network” as an object of analysis in social research. While also exploring their history and how academia had explored them to date, boyd and Ellison (2007) looked to define the key characteristics of social network sites (SNS). In this work boyd and Ellison argue that social “network sites” rather than “networking” is a more accurate term, as it describes people communicating within their networks rather than trying to be in these spaces solely for the sake of “networking.” They define SNS as “services that allow individuals to (1) construct a public or semi-public profile within a bounded system, (2) articulate a list of other users with whom they share a connection, and (3) view and traverse their list of connections and those made by others within the system. The nature and nomenclature of these connections may vary from site to site” (boyd & Ellison, 2007, p. 211). As with other digital media, boyd and Ellison note how users appropriate the technology to their needs, sometimes subverting the intentions or expectations of the technology’s designers. In particular they note the development of groups within SNS—networks within the network—defined by social, demographic, political, or cultural factors: While SNSs are often designed to be widely accessible, many attract homogeneous populations initially, so it is not uncommon to find groups using sites to segregate themselves by nationality, age, educational level, or other factors that typically segment society . . . , even if that was not the intention of the designers.  (p. 214)
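boyd and Ellison’s three defining features map naturally onto a simple data structure. The toy sketch below is illustrative only (the usernames, fields, and helper function are hypothetical and not drawn from any actual SNS): each user has a public or semi-public profile, an articulated list of connections, and those lists can be traversed.

```python
# Illustrative toy structure mirroring the three SNS features quoted
# above: profiles, articulated connection lists, traversable connections.
from dataclasses import dataclass, field

@dataclass
class Profile:
    username: str
    public: bool = True                             # public or semi-public profile
    connections: set = field(default_factory=set)   # articulated list of other users

network = {
    "amira": Profile("amira", connections={"ben", "chen"}),
    "ben": Profile("ben", connections={"amira"}),
    "chen": Profile("chen", connections={"amira", "ben"}),
}

def friends_of_friends(network, username):
    """Traverse a user's connections and the connections those users list."""
    direct = network[username].connections
    indirect = set()
    for other in direct:
        indirect |= network[other].connections
    return (indirect | direct) - {username}

print(friends_of_friends(network, "amira"))  # {'ben', 'chen'}
```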

Importantly, boyd and Ellison point out something that has now become a core feature of many studies of digital media—the use of SNS as a source of potentially “naturalistic” (that is, non-experimental) data for social and digital research (such as profile and linkage data); though more recent work has pointed out the potential biases of such data sets (Blank, 2017; Blank & Lutz, 2017), and harmful implications for citizenship and governance (chapters 16 and 18). This question of methods is also present in much of the literature, as researchers look to explore and examine new digital tools used to analyze SNS or new digital data sources derived from SNS. Bruns and Stieglitz (2013) address aspects of this by

considering the use of standardized metrics to comparatively and systematically analyze Twitter interaction. They are looking to outline metrics which examine the total activity and visibility of individual participants; metrics which establish the temporal flow of conversation, and of specific forms of conversation; and metrics which combine these aspects to examine the relative contributions of specific, more or less active, user groups during each unit of time.  (Bruns & Stieglitz, 2013, p. 92)

They describe a catalogue of widely applicable, standardized metrics for analyzing Twitter-based communication, with particular focus on hashtagged exchanges in data “at large scale.” They note the value of user-focused metrics but also look to address the analysis of Twitter data over time, arguing, While user-based metrics are valuable for analyzing the overall shape of the user base of a specific hashtag, for highlighting especially active or visible contributors, and for examining whether hashtags are used mainly for posting original thoughts, for engagement within the community, or for sharing information, a second major group of metrics emerges from a breakdown of the total data-set not by user, but by time.  (p. 99)
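The two families of metrics described here (per-user contributions and activity per unit of time) can be illustrated with a minimal sketch. The tweet records and field names below are hypothetical placeholders, not the output of any particular API or of Bruns and Stieglitz’s own tooling.

```python
# Sketch: per-user activity counts and activity bucketed by hour,
# computed from an already collected list of tweets (hypothetical data).
from collections import Counter
from datetime import datetime

tweets = [
    {"user": "a", "time": "2013-03-01T09:15:00", "text": "#example hello"},
    {"user": "b", "time": "2013-03-01T09:40:00", "text": "#example reply"},
    {"user": "a", "time": "2013-03-01T11:05:00", "text": "#example again"},
]

# Metric family 1: contributions per user.
per_user = Counter(t["user"] for t in tweets)

# Metric family 2: temporal flow, here bucketed by hour.
per_hour = Counter(
    datetime.fromisoformat(t["time"]).strftime("%Y-%m-%d %H:00") for t in tweets
)

print(per_user.most_common())   # [('a', 2), ('b', 1)]
print(sorted(per_hour.items()))
```

Combining the two, for example cross-tabulating users by time bucket, gives the third family of combined metrics they propose.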

Bruns and Stieglitz suggest three areas for metrics: metrics which describe the contributions made by specific users and groups of users; metrics which describe overall patterns of activity over time; and metrics that combine these aspects to examine the contributions by specific users and groups over time. Such metrics and analyses also draw upon well-established methods for social network analysis developed within sociology and information studies for the analysis of links between individuals, groups, organizations, or artefacts (e.g., Wasserman & Faust, 1994). The application of such methods to explore the digital interactions or SNS interactions of users and citizens often focuses on specific communities (e.g., Rice’s (1982) over-time study of computer conferencing groups), therefore overlapping with the Community and Identity (chapter 14) and Citizenship and Politics (chapter 16) domains. For example, Vromen et al. (2015) examined how politically engaged young people integrate social media use into their organizations, political communication, and civic engagement. They conducted in-person focus groups with 12 civic groups of students from the United States, United Kingdom, and Australia. All the groups reported that they use social media to maintain the group, distribute related information, and organize different kinds of events. As with many other similar studies, they found an integration between digital media use, traditional media use, and physical and digital social networks:

meeting and ensuring event attendance. This is consolidated through social media functionality, such as the public display of members saying they are attending an event, and especially the diary functions Facebook events add to. (Vromen et al., 2015, p. 89)

The analysis found four main ways that social media created or shaped the respondents’ political communication: “broadcast, new information, everyday political talk and new political action” (Vromen et al., 2015, p. 90). Importantly, the researchers compared these behaviors across three countries to explore the impact of cultural context, finding that, the three dutiful-oriented party groups had more in common with one another in terms of their citizenship norms and practices than they did with the identity and issue-based groups within their own country.  (p. 95)

In the context of social network analysis, a key measure or research focus is that of social capital—however measured. Ellison et al. (2007) examined the relationship between use of Facebook and the formation and maintenance of social capital. They also explored the dimension of social capital that assesses one’s ability to stay connected with members of a previously inhabited community, which they called “maintained social capital.” The researchers conducted a survey with 800 undergraduate students from Michigan State University. The students reported spending between 10 and 30 minutes on average using Facebook each day and reported having between 150 and 200 friends listed on their profile. Ellison et al. noted that Facebook had a role in the processes by which their student respondents formed and maintained various aspects of social capital. They also examined their well-being (self-esteem and satisfaction with life). Students who reported low satisfaction and low self-esteem seemed to gain social capital if they used Facebook more intensely. As a result they concluded that Facebook use is important for developing bridging social capital:

As has been noted above and in many places throughout this volume, the study also found that SNS use was integrated into everyday life. As a result, this digital media use formed part of, rather than was a separate activity from, ongoing relationships: Online interactions do not necessarily remove people from their offline world but may indeed be used to support relationships and keep people in contact, even when life changes move them away from each other.  (p. 1164)


Theory, Method, and Approach

As with the other ESRC review chapters, the following analysis builds on Borah (2017). Most of the analyzed papers (64%) were inductive, either describing findings or building theory (Table 8.5), while only 14% undertook theory testing. Reflecting this, 64% of the papers undertook primary data collection, with 23% being discursive reviews of or reflections on existing research (Table 8.6). The main disciplines from which theory was used or for which theory was developed were psychology (39%), sociology (32%), and communication and media (16%).

Table 8.5  Epistemological Approach (percent)
No clear epistemology  22.1
Deductive (testing of existing theory)  13.9
Inductive (conclusions driven by data)  64.0

Table 8.6  Empirical Approach (percent)
Discursive/descriptive (no new data or theory)  22.9
Primary empirical (data collected and analyzed)  63.8
Secondary empirical (analysis of existing data)  5.1
Theoretical (synthesis of current or prior work)  7.7

Table 8.7  Research Method (percent)
Content analysis  5.4
Ethnography  6.9
Experiment  9.5
Focus groups  5.4
Interview(s)  23.7
Literature review (general or narrative)  20.3
Meta-analysis or systematic review  0.5
Other  18.0
Social network analysis  4.1
Survey  36.0
Textual (linguistic-discourse analysis)  4.1
Theory building  6.2

Table 8.8  Study Population (percent)
Case study(ies)  1.5
General population  8.0
Specific group  34.8
No study group  56.0
Grand Total  44.3

Only actual use of theory for the purposes of design or analysis was coded; general references to prior work and theory were not coded. There was considerable variety in the specific theories applied from these disciplines and no clear preference. No one theory appeared more than three times. The main research methods (Table 8.7) were surveys (36%), interviews (24%), and literature reviews (20%). Though many studies undertook to analyze respondents’ social networks (often via surveys), only a small number of papers (4%) conducted formal statistical social network analysis from scraped or surveyed SNS use. The majority of the empirical work focused on specific groups (e.g., Facebook users) with a limited number of general population studies (Table 8.8). Less than 2% of studies overtly stated that they were using a “big data” approach.

Delphi Review

The following sections detail the results of the Delphi process for the Communication and Relationship domain, covering three main areas: suggested scoping or research questions, key topics to address within these questions, and key challenges to researching these questions (see the initial comments section at the start of the chapter). The Delphi review identified a set of scoping questions for the domain, and these were coded into the five categories detailed in Table 8.9: digital literacies, norms and values, platform affordances, quality of relationships and communication, and relationship management. The ranking of these categories by the number of questions allocated to the category is provided in Table 8.10, and their ranked importance from the confirmatory survey is given in Table 8.11. The two categories of scoping questions rated as the most important were digital literacies, and quality of relationships and communication. It is important to note that ranked importance is almost the inverse of the number of questions allocated to the category. As has been noted already in regard to the literature, many of the areas identified in the scoping questions and challenges cut across this and the other domains (see chapter 25), a key one of these being digital literacy.

Table 8.9  Delphi Review Scoping Questions (category: example questions)

Digital literacies: What literacies are required for effective communication using digital technologies? Should these literacies be taught, or can we assume that they develop organically? To what extent do individuals’ digital legacy and digital capability affect their interactions with others in work and leisure?

Norms and values: What normative pressures do people experience related to relationships shaped and sustained by digital technologies? What is the new normal for relationships now that they are shaped and sustained by digital technologies across multiple domains?

Platform affordances: What are the platform affordances of digital technology that construct or constrain relationships? How do particular platforms affect various kinds of relationships: social, sexual, familial, collegial, activism, fandom, etc.?

Quality of relationships and communication: How does communication via digital technologies facilitate the quantity and quality of our relationships? How are our relationships being shaped, sustained, and diminished by digital technologies, in and between the domains of work and family?

Relationship management: How are family, friend, and work relationships shaped by, and reshaping, the trajectories that new digital technologies are taking? How are our friendships being shaped, sustained, and diminished by digital technologies?

Table 8.10  Scoping Questions Ranked by Number of Cases
1. Relationship management
2. Platform affordances
3. Quality of relationships and communication
4. Digital literacies
5. Norms and values

Table 8.11  Scoping Questions Ranked by Importance (percent)
Digital literacies  85.7
Quality of relationships and communication  71.4
Norms and values  64.3
Relationship management  50.0
Platform affordances  28.6


Scoping Questions

The consultation workshop identified a set of issues or additional scoping questions for each of the five categories, shown in Table 8.12. The workshop also noted that the following topics appeared to be missing from the results of the Delphi work:
• Issues of cultural specificities
• Cultural analysis
• Mixed modal interaction

Topics

The topics identified in the Delphi review were coded into 25 categories as detailed in Table 8.13. The categories occurring the most frequently include friendships and relationship formation, age, privacy and ethics, work and organizations, education, and social and community support. The consultation workshop also highlighted the following issues:
• Age (user age versus user experience)
• Social media “bubbles”
• Cross over to the Data and Representation Domain
• Research methods

Table 8.12  Consultation Workshop Scoping Categories and Example Questions

Digital literacies: Who needs help with digital literacies? Are these taught or learned? Understanding our “digital communication assets.”

Norms and values: What are the origins of normative pressures? How are communicative norms formed and transmitted? Which behaviors and activities are “normal”?

Platform affordances: What types of relationship are supported? What types are “new”? Changes to proximities/propinquity? Managing privacy? Platform is the message—or is a platform focus too technologically determinist?

Quality of relationships and communication: Interaction versus functioning online? Why focus on old categories of work, home, family? Overlaps to well-being? Overlaps to relationship management?

Relationship management: Interaction versus functioning online? Why focus on old categories of work, home, family? Overlaps to well-being? Overlaps to quality of relationships?

Table 8.13  Key Topics Ranked by Percent of Cases

Friendships and relationship formation  12
Age  10
Privacy and ethics  10
Work and organizations  8
Education  6
Social and community support  6
(Social) Media “bubbles”  4
Data and representation  4
Exclusion  4
Politics  4
Social change  4
Dependency  2
Family  2
Identity  2
Integration  2
Interpersonal  2
Methods  2
Other  2
Place  2
Platforms  2
Psychology  2
Quality and variety  2
Sexuality  2
Textuality  2
Theory  2

The ranked importance of these topics from the confirmatory survey is presented in Table 8.14. As with the scoping questions, there is also divergence between those topics that were most commonly cited in the Delphi workshop and those deemed most important in the final workshop. However, two of the three top topics were the same: friendships and relationship formation, and privacy and ethics, indicating these are central and important topics for consideration. The workshop participants also identified the following potential gaps in the Delphi topics list:
• Culture
• Misinformation and miscommunication
• Teaching of digital literacies
• Exclusion/inclusion/participation
• Friendship formation (especially regarding young people)

Challenges

The challenges in undertaking research in this area identified by the Delphi panel were placed into 16 categories. These categories are detailed in Table 8.15 and ranked by the number of coded items, with four of those deemed to be domain specific by the consultation workshop shown in bold:
• Multi-platform studies
• Co-design
• Ethics and privacy
• Multi-disciplinary working

Table 8.14  Key Topics Ranked by Importance from Delphi Survey
(Very important / Important / Neutral / Unimportant / Very unimportant, percent)

Privacy and ethics  57.1 / 35.7 / 7.1 / 0.0 / 0.0
Friendship and relationship formation  57.1 / 35.7 / 0.0 / 7.1 / 0.0
Social change  42.9 / 42.9 / 14.3 / 0.0 / 0.0
Social and community support  35.7 / 57.1 / 7.1 / 0.0 / 0.0
Education  35.7 / 28.6 / 35.7 / 0.0 / 0.0
Exclusion  28.6 / 57.1 / 14.3 / 0.0 / 0.0
Age factors—cohort and age  28.6 / 50.0 / 14.3 / 7.1 / 0.0
(Social) Media “bubbles”  21.4 / 42.9 / 21.4 / 7.1 / 7.1
Work and organizations  14.3 / 57.1 / 28.6 / 0.0 / 0.0
Political communication  14.3 / 50.0 / 35.7 / 0.0 / 0.0
Data and representation  14.3 / 50.0 / 28.6 / 7.1 / 0.0

Table 8.15  Challenges Ranked by Percentage of Cases

Multi-platform studies*  17
Theory  17
Co-design*  13
Big data  10
Ethics and privacy*  8
Surveys  6
Methods  4
Multidisciplinary working*  4
Community  2
Data access  2
Exclusion  2
Longitudinal studies  2
New forms of publication  2
Old media  2
Other  2
Uses and gratifications  2

Note: Domain-specific challenges marked with an asterisk.

The first category—multi-platform studies—raises the issue of multimodal relationships. It asks how we should explore and assess the influence of any one particular technological platform when many important relationships involve so many platforms (as well as face-to-face interaction and “legacy media” such as phone, texting, or mass media). How do we assess these complex combinations?

As a result, how do we research or follow people’s digital communication in their everyday lives, especially as looking at only one medium will likely only give us part of the communications or relationships (social network) picture? Research on this domain should therefore not draw conclusions about relationships from single-media studies but aim to understand communications platforms as multi-media and hybrid media, addressing dynamic network analytics. Within this is the need to understand the physical and embodied use of the digital in communication activities and processes. The second category—co-designing technologies—was proposed as it was argued that many SNS systems have been implemented without such a focus. The challenge here is how to work with and alongside communities that are often ignored (especially marginalized communities) so as to co-design technologies that are of use to them and of value in their lives. Such work should focus on improving relationships rather than distancing ourselves from others. It was argued that technologies are often designed for communities with some “user testing” but little engagement with people and their lives. Thus, social scientists, working alongside designers and engineers, can use methodologies and approaches central to social science to work alongside communities to understand and communicate their needs and broker relationships. The third category—ethics and privacy—addresses the question of how using SNS data, especially to effectively mine data about relationships (as SNS platforms themselves do), affects our use, trust, or selection of digital technologies, whether for research, business, or service provision. Finally, multidisciplinary working is relevant to all the domains. Here it points to the need for the research to integrate ideas from a range of disciplines to best examine and explore the technical, performative, and dynamic nature of digital communication. Such collaborations should include critical approaches (e.g., Marx, Gramsci, Hall, critical theory, Bourdieu, Foucault) so as to question and reflect on the impacts of digital media use. In conclusion, as with the other domains, we believe that the complexity and variety of potential work warrants consideration of all the questions, topics, and challenges identified. Table 8.16 shows the eight most frequent challenges ranked by importance, with three of the domain-specific challenges in the top four rated “very important”: ethics and privacy, multidisciplinary working, and multi-platform studies. Noting this, we would argue that the analysis of the Delphi data suggests the following key areas for future research (see Tables 8.10, 8.13, and 8.15):
• The norms and values of digital communication and relationships
• The “affordances” that different platforms provide for digital communication and relationships
• The quality of relationships and communication supported by digital media and technologies
• The management of relationships via digital media and technologies
Within these areas, future projects need to consider some key cross-cutting topics:

Table 8.16  Challenges Ranked by Importance from Delphi Survey
(Very important / Important / Neutral / Unimportant / Very unimportant, percent)

Ethics and privacy  64.3 / 14.3 / 21.4 / 0.0 / 0.0
Theory  53.8 / 30.8 / 7.7 / 7.7 / 0.0
Multidisciplinary working  46.2 / 38.5 / 7.7 / 7.7 / 0.0
Multi-platform studies  42.9 / 35.7 / 21.4 / 0.0 / 0.0
Big data  35.7 / 28.6 / 35.7 / 0.0 / 0.0
Methods  28.6 / 42.9 / 28.6 / 0.0 / 0.0
Surveys  14.3 / 21.4 / 50.0 / 7.1 / 7.1
Co-design  0.0 / 38.5 / 38.5 / 15.4 / 7.7

• Social and community aspects
• Privacy and ethics
• Exclusion
• Social change
• Work and organizations

Furthermore, key domain-specific challenges include:
• Multi-platform studies
• Ethics and privacy

Conclusion

Communication behaviors and relationships are fundamental to almost all online activities, folded into and overlapping the other Domains. Digital media use on current scales, and the developments likely to be undertaken (e.g., with the rise of the Internet of Things; see chapter 23), make such engagements ubiquitous and almost invisible for many citizens. The overall impact of this expansion remains potentially unknown territory. Researching such change requires inter- and multi-disciplinary research methods and groups. It was widely recognized in the literature, workshops, and by the team that a whole new axis in communication has been brought about by the development and use of social media. Already, scholarly research is abundant; however, many commentators felt there were still under-researched areas, especially in terms of theory. Foremost was how people are able to integrate digital media so easily into their everyday lives. Experts acknowledge that there will be benefits and further potential in social media but also that the well-documented concerns are still not well understood. These include a range of behaviors that could normatively be described as negative, for example, hyper-sociability, sexting, cyberbullying, online grooming, trolling, and more generally, the broad areas of Internet safety and problematic use (see chapters 3 and 4).

There is an enduring concern with the virtual versus the physical aspects of communication, with questions raised around the costs and benefits of functioning effectively in a digital world, and particularly whether individuals are “being shaped and diminished” by digital technologies as opposed to proactively assessing and shaping future technologies. Understanding what a digital person or a digital citizen is becomes problematic as digital forms of communication are folded seamlessly into lives. A general observation was raised that communication and relationships are impacted differently depending on the particular stages in the life course, e.g., children, adolescents, students, adults, and seniors (see chapters 5, 6, and 9), and also by the type of social relations. The team noted that the literature in its breadth highlights how communication density is intensified by digital technologies, so attention must be given to formulating research questions that take this into account. This is likely reflected in the topics and challenges identified in the Delphi work around “multi-platform studies,” within which there needs to be focus on communication and relationships as they intersect with
• Other people
• Things and artefacts
• Our personal “curation” of self on platforms
• “Nodes” (people/artefacts/bots etc.) and networks themselves
Overall, reflecting on the literature and the data, the team noted the following general issues that appeared to cross-cut both the Delphi data and the literature analyses, and which stand out as potential new questions:
• What normative pressures do people experience related to relationships shaped and sustained by digital technologies?
• What literacies are required for effective communication using digital technologies?
• Should these literacies be taught, or do they develop organically?
• How do digital media facilitate the quality and quantity of our relations (e.g., “to what extent does an individual’s digital literacy and digital capability affect interactions with others in work and leisure?”)
The literature also indicates that Twitter and Facebook are well represented in contemporary literature, but research studies need to include investigations and comparisons of other social media platforms. Moreover, the team had concerns about the attractiveness of big data analytics, reflected in the Delphi results, as this might undermine more holistic multi-method approaches required to get at the dynamics of offline and online aspects of communication and relationships. Overall, contemporary research in the Communication and Relationships domain studied here appears to have focused on: comparisons of computer-mediated communication to other media; platforms such as Facebook and Twitter; digital media use by

younger people and adolescents; and understanding social networks. Existing work has employed fairly traditional methods such as surveys and interviews. It is orientated towards psychological and sociological approaches, with some linguistic and information studies aspects. The work does not appear to have extensively employed digital tools and big data methods, though those approaches are increasing rapidly. Most notably, the work appears to have been "platform driven" and "platform specific," with a bias towards younger people. The future research identified in the Delphi process is different, though there are some overlapping areas. The focus has shifted towards more general studies of communication and relationships in everyday life and the need to understand the integration of multiple media into communication and relationship behavior. The key questions, topics, and challenges include: norms and values; the "affordances" that different platforms provide; the quality of relationships and communication supported by digital media and technologies; and the management of relationships via digital media and technologies. Within these areas, key issues to consider are: social and community aspects, privacy and ethics, exclusion, social change, and work and organizations.

Note

1. As part of the review, The Digital Humanities Institute at the University of Sheffield applied concept modelling techniques to a curated corpus of 1,900 journal articles from the period 1968 to 2017. Concept modelling is a computational linguistic process that involves identifying the emergence of concepts, or key ideas, via lexical relationships. For the purposes of the review, lexical relationships were limited to high-frequency co-occurrences of terms as pairs and trios. The process is entirely data driven and resulted in 2 million rows of data. The website https://www.dhi.ac.uk/waysofbeingdigital/ provides access to the top 50 most frequently occurring pairs and trios through a series of data visualizations. Click on View Data Visualizations at the top, then check/submit which of the seven ESRC domains you are interested in (including all), and then choose the visualization. These show configurations across selected time frames. Choose bubble chart, tree map, zoomable pack layout, or network diagram, by individual subject or by all seven subjects combined, by document or concept frequency. You can similarly search the analyzed documents (all, by subject, author, concept, concept trio, and year) by clicking on Browse Articles at the top. Also, see https://waysofbeingdigital.com/literature-analysis-interactive-results/ for interactive visualizations with mouse-overs of the main clusters of concepts within each Domain, and the relative frequency of concepts associated with each cluster.
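The concept modelling described in this note boils down to counting which terms co-occur as pairs and trios across a corpus. As a rough illustration only (the Digital Humanities Institute's actual pipeline, tokenization, and thresholds are not documented here, so the window size, token pattern, and function name below are assumptions), the counting step might look like this in Python:

    from collections import Counter
    from itertools import combinations
    import re

    def concept_pairs_and_trios(documents, window=25, top_n=50):
        """Count co-occurring term pairs and trios across a corpus of article texts.

        documents: iterable of plain-text strings (assumed, one per article).
        window: number of tokens treated as one co-occurrence context (illustrative).
        A real pipeline would also remove stop words and lemmatize before counting.
        """
        pairs, trios = Counter(), Counter()
        for text in documents:
            tokens = re.findall(r"[a-z]{3,}", text.lower())
            # Non-overlapping windows keep the number of trio combinations tractable.
            for i in range(0, len(tokens), window):
                span = sorted(set(tokens[i:i + window]))
                pairs.update(combinations(span, 2))
                trios.update(combinations(span, 3))
        return pairs.most_common(top_n), trios.most_common(top_n)

Applied to a curated corpus, the ranked pairs and trios returned by such a routine are the kind of lexical relationships that the visualizations on the project website display.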


Chapter 9

Media Mastery by College Students: A Typology and Review

Ronald E. Rice, Nicole Zamanzadeh, and Ingunn Hagen

Introduction

Digital media, from the early Arpanet and email through to current developments such as social media and the Internet of Things, have changed our everyday lives and social relationships.1 But every aspect of society has also become more and more dependent on that technology. We use digital media for information, education, entertainment, interaction, and consumption, and for managing countless aspects of our lives. Yet digital media also compete for people's attention, energy, time, identity, and relationships in a way that can be challenging, risky, and harmful for individuals, groups, and society (Xu, Wang, & David, 2016). Our concern, then, is the tension between the process of trying to master digital media, and the process of being mastered by them (Rice, Hagen, & Zamanzadeh, 2018).

The purpose of this chapter is to synthesize and understand the uses and effects of digital media among college students through the framework of media mastery, pervasive but latent in the current literature. We do so by reviewing examples of media mastery factors associated with social and individual contexts. Different age cohorts and life span periods are associated with different exposure to and use of information and communication technologies (ICTs), experiences of positive and negative aspects of ICTs, and cognitive and emotional abilities used to manage ICTs (Reinecke et al., 2017). We focus on college students, because they: have grown up with an increasingly diverse array of new media, are usually experiencing significant transitions from their family and high school friends, are engaging in a wide variety of new social interactions and contexts, and are required to take personal control over

their tasks, schedules, and relationships (DeAndrea, Ellison, LaRose, Steinfield, & Fiore, 2012; Manago, Taylor, & Greenfield, 2012; Turkle, 2011). They are also prodigious users of digital media and experience a wide array of positive and negative uses and outcomes. We note just a few examples.

Amount of use is considerable. In a 24-hour tracking study of the mobile phone use of 793 university students in four countries, Mihailidis (2014, p. 58) found that 31% of the participants logged into social networking apps more than 13 times in a 24-hour period, clearly demonstrating the centrality of mobile Internet use for the "tethered generation." In Moreno et al.'s (2012) experience sampling study with 189 undergraduate students, participants multitasked 56.5% of the time they were online.

Types of use are diverse. Analysis of a week's worth of social media usage by college students documented content-sharing, text-based entertainment/discussion, relationships, and video consumption as the main clusters of activity (Wang, Niiya, Mark, Reich, & Warschauer, 2015).

Negative implications are extensive. A panel study among 484 undergraduate students from the United States found negative effects of perceived "cyber-based overload" (e.g., e-mail volume, pressure to respond, perceived pressure to post content on social media, etc.) on perceived stress and overall health status (Misra & Stokols, 2012, p. 740). In a survey study with 600 student participants, LaRose, Connolly, Lee, Li, and Hales (2014) explored the effects of "connection overload" (p. 59) arising from the communication demands resulting from social media and e-mail use.

The Concept of Media Mastery

Definition

Media mastery is the more or less conscious and more or less successful ongoing process of how people understand, manage, make sense of, cope with, and use one or more new media in their everyday lives, as well as how media in turn come to manage, control, or affect individuals and their social relations. Media mastery includes the choices, engagement, habits, and patterns people engage in and develop in their lives regarding the use of media, its content, and its social connections (see also Picone, 2017). Our concept of media mastery entails four main arguments.

1. Media mastery invokes the reciprocality among structure, actors, and technology of structurational theory (Jones & Karsten, 2008) and adaptive structuration (DeSanctis & Poole, 1994), in the context of individuals, groups, social contexts, and new media. Thus, we apply the concept of media mastery in two ways. The first is how we master the balance and use of one or more media in different

contexts. The second is the more subtle issue of the ways in and extent to which these media master us—as our activities, concerns, and relationships are being shaped through, facilitated and constrained by, and dependent upon, the use of these media. For example, users may learn about themselves, and benefit from, online identities, but managing and repairing those requires constant connectedness, awareness, revising, and tending (boyd, 2015). This awareness of the dual nature of media (or for that matter, any technology) is not new: Postman (1996), in the context of television and computers, claimed that more important than learning how to use media is learning how they use us.

2. Crucial to the media mastery concept is the awareness, interpretation, and management of both (often simultaneously) positive and negative aspects and implications of media use. Katz and Rice (2002) applied a syntopian approach to the study of Internet use specifically to reject either a utopian or dystopian perspective. Smith (2015), summarizing a U.S. Pew survey, noted that while from 70% to over 90% reported positive benefits relative to disadvantages of their smartphone use, younger users were more likely to report both positive as well as negative emotions about their use. Best, Manktelow, and Taylor's (2014) review of a decade's worth of studies on online communication, social media, and adolescent wellbeing found both positive implications (self-esteem, perceived social support, increased social capital, safe identity experimentation, and increased opportunity for self-disclosure) and negative effects (exposure to harm, social isolation, depression, and cyber-bullying). Much research and popular literature underscores the potential for various media dependencies, problematic use, and addiction (David, Kim, Brickman, Ran, & Curtis, 2015).

3. Tensions, contradictions, and paradoxes arise in and from media experiences. Research on new media in general, and the Internet and mobile phones in particular, has identified tensions, contradictions, and paradoxes in their use, social construction, and implications, though with varying definitions and foci. For example, Rice, Hagen, and Zamanzadeh (2018) identified a variety of paradoxes associated with college students' use of new media, such as being both stimulating and exhausting, and both flexible and uncontrollable. Jarvenpaa and Lang's (2005) analysis of urban mobile device users in Helsinki, Tokyo, Hong Kong, and Austin grouped an initial set of 23 paradoxes into eight: empowerment/enslavement, independence/dependence, fulfills needs/creates needs, competence/incompetence, planning/improvisation, engaging/disengaging, public/private, and illusion/disillusion.

4. Media mastery is highly contextual, shaped by the user's own and their social groups' values and attitudes towards media, their motivations for using media, and the characteristics, capabilities, convergence, affordances, mobility, and personalization of media. Thus the balance between mastery of media by users, or of users by media, shifts across media, contexts, and time. Using media for some goals in some contexts may have different and even opposed interpretations or outcomes, or activate or even preclude other goals, in other contexts.


Related Concepts

We distinguish the concept of media mastery from a range of established as well as recent terms, ranging from the more individual to the more societal.

Some approaches focus on individuals' self-regulation and attention, emphasizing both psychological and cognitive aspects. For example, Wu (2015) identified four dimensions of an online learning motivated attention and regulatory strategies scale, comprising perceived attention discontinuity and social media notifications (constituting knowledge of attention), and behavioral strategies and mental strategies (constituting regulation of attention). Thus Wu highlights the important role of meta-attention or motivated attention. Integrating those with a variety of other measures, Wu clustered users into five categories: (1) the motivated strategic, (2) the unaware, (3) the hanging on, (4) the non-responsive, and (5) the self-disciplined. Media mastery involves self-regulation, but that is only one component of an individual's experience of mastery or of being mastered.

A related approach applies the concept and practice of mindfulness (Schonert-Reichl & Roeser, 2016)—which highlights the importance of paying attention (in a non-judgmental way) to what you are paying attention to, and of avoiding being distracted—to the use of media (Hadar & Ergas, 2018; Johnson, 2015; Levy, 2017). Levy's exercises help students become more mindful and reflective about their technology use, to reshape use and social interactions. Johnson suggests thinking about media use as an information diet, leading to "conscious consumption."

Media literacy is more general and more cognitively oriented, emphasizing the awareness of media practices and the development of media-related skills (O'Neill & Hagen, 2009). Rheingold (2012) integrates mindfulness with media literacy, also underscoring the importance of being aware of how we think about our media use. He proposes five central digital literacies: conscious attention and intention, critical evaluation of content, participation and managing your presentation, collaboration and sharing, and developing networks and social capital. Literacy bolsters users' awareness of various aspects of media, but media mastery requires individuals to personalize this information to their capacities, desires, and social surroundings.

James (2014) identifies three value-oriented ways of thinking about media use in relation to others, manifesting different levels of conscientious connectivity, "the use of ethical thinking skills, a sensitivity to the moral and ethical dimensions of online situation, and a motivation to reflect on and wrestle with the associated dilemmas" (James, 2014, p. 109). This varies both individually and across online communities. Thus, she asks, what are young people thinking when they use new media? The vastly increased ability to interact with (knowingly or not) diverse others across time and space deepens the gaps between (1) consequence thinking (concerned with implications of a specific action for oneself), (2) moral thinking (an application of principles with known individuals or a group), and (3) ethical thinking (an other-focused consideration of the implications for a broader community or public, concerned with roles and responsibilities thinking, complex perspective taking, and community thinking; James, 2014, pp. 5–7).
A 2008–2012 study by James and colleagues identified five ethically related themes concerning the use of social network sites, blogs, content-sharing sites, and gaming communities by youth and young adults: "online identity, credibility, privacy, property, and participation" (p. 18). The concept of conscientious connectivity is about when and where youth thinking is sensitive to moral or ethical issues, and where there are blind spots (favoring self-interest over others' interests, and where other concerns diminish ethical concerns, but also including blind spots about technical aspects of new media, such as the extent to which postings can be viewed by the general public) and disconnects (more conscious and intentional dismissal of, or indifference to, others' interests in favor of self-interests).

Domestication theory explains how new media, through adoption, integration, and conversion, become embedded into daily practices (initially in the home, but then applied in wider contexts), and blur traditional home/work/life boundaries (Haddon, 2003; Silverstone & Hirsch, 1992). What is new eventually becomes an artifact (Rice, 1999). Taken-for-grantedness is the social condition whereby a medium has become fully integrated into society, embedding expectations, interdependencies, and social practices (Ling, 2012), or, in diffusion of innovations terms, structured and routinized (Rogers, 2003). This concept overlaps with the more individual behavior of media use habit, or habitual media use, which itself overlaps with dependency, addiction, and general problematic use (Wilmer & Chein, 2016). Domestication and routines play a role at individual and social levels of media mastery. Over time, as individuals and societies adapt to, and adopt, the tools they've created, used, learned, applied, and become familiar with or dependent on, the skills for managing media and their positive or negative outcomes will change and may improve.

Given the expanding realm of media choices, the concept of polymedia emphasizes that understanding, choice, and use of a medium is relative to comparisons with other available media (Madianou & Miller, 2013). Rainie and Wellman (2012) and others have discussed the growth of this multiple media environment. Experiences, from small to large, now involve multiple, multitasking, interdependent, layered, and blended media (Hilbert, Vásquez, Halpern, Valenzuela, & Arriagada, 2017). Helles (2013) characterizes this new environment with the term intermediality, especially as, with the widespread adoption and constant evolution of the mobile phone, "the user becomes a mobile terminus for mediated communicative interaction across the various contexts of daily life" (p. 14). Formerly distinct, independent, or location-specific features and content are now available through smartphones, laptops, and tablet computers. Thus digitization, mobility, and networking create convergence across content and media, and allow or require comparisons across media choices (Jensen, 2010). Burchell (2017, p. 409) highlights that "the individual's perception of [the] environment of increasingly differentiated communication possibilities becomes a site for managing and partially negotiating the limits, form and organization of one's social world." A related concept is Couldry's (2012) media manifold, where activities are embedded in a pervasive environment of networked media. Other conceptualizations such as mediapolis (Silverstone, 2007) and medialife (Deuze, 2012) refer to the increasing embeddedness, interrelatedness, and invisibility of media, creating a pervasive social, sensory, and cognitive experience (Miller, 2014).
Gershon (2010) discussed media ideologies, which shape perceptions of media practice norms. Mediatization focuses more on how media are at the center of

significant cultural, political, and social developments, and become embedded and hidden (Deacon & Stanyer, 2014; Hjarvard, 2009; Livingstone, 2009; Miller, 2014).

Media mastery takes a more micro focus (individuals and their social relations) than do social construction of technology or social shaping of technology approaches. The social construction of technology (Klein & Kleinman, 2002; Pinch & Bijker, 1987) centers around five major components: interpretive flexibility (social circumstances and intergroup negotiations affect interpretation and meaning of a technology, and thus varying final designs); multiple relevant social groups (shared and competing interpretations and meanings within and across groups affect technology development and outcome); closure and stabilization (moving through and negotiating conflicting interpretations to resolution, closure, and a stable artifact); the wider context (society, culture, politics, power); and the technological frame (the cognitive frame of a relevant group, with shared goals, problems, theories, procedures, and exemplars). The social shaping of technology approach(es) places more emphasis on the social, economic, and policy, in addition to the technical, aspects of innovation processes and technology form. Social, cultural, economic, and institutional forces affect each (conscious and unconscious) choice among technical options, often exhibiting path dependence and varying levels of lock-in or closure, with subsequently different innovation trajectories and social implications (MacKenzie & Wajcman, 1985; Williams & Edge, 1996). Thus media mastery does not explicitly consider the origin, development, and design of technological innovations; rather, it is about the construction and shaping by (mastering), and of (being mastered), individuals in their social settings of the meanings, choices, uses, and consequences of, and by, new media already available to them.

Development of the Concept

Our initial interest in college students' use of digital media arose from our observations of the way computers and mobile phones seemingly were already central technologies in their daily lives in the early 2000s. Thus, we initiated the Media Mastery Project, where the focus is on exploring the way college students attempt to use and master digital (especially multiple) media. We first conducted a literature review and analyzed focus group interviews with students at two universities in the U.S. and Norway in 2005/2006 (Rice & Hagen, 2010). Based on those results and an updated literature review, we iteratively developed and refined a detailed Media Mastery typology. We used that to code another round of similar focus groups in 2016 (Rice, Hagen, & Zamanzadeh, 2018). For example, we discovered that students experienced attempts (conscious or not) to master media through their experiencing of paradoxes, contradictions, and tensions, while also being themselves somewhat mastered (conscious or not) by these media. Based on those results, we extended and further refined the typology to use in coding the current set of articles.

Essentially, we followed Chaffee's (1991) claim that "In practice the scholar begins reading prior studies, moves to various steps in the explication process, refines the preliminary definition, and then returns to the literature search with a sharpened definition" (p. 22). Our approach expands beyond an emergent-only or solely grounded-theory approach, which would ignore a vast existing

set of concepts and literature, as well as a solely a priori approach, which would exclude insights beyond the initial framework. Rather, it takes what Boell and Cecez-Kecmanovic (2014) call a hermeneutic approach, by engaging in iterations between (cycles of) search and acquisition, and (cycles of) analysis and interpretation. But it takes that approach even further, by including content and thematic analyses from a set of focus groups a decade apart. Both sources provided some concepts not found in the other, and revealed some different insights in different time periods. Thus the current review synthesizes how the concept of media mastery illuminates the research literature about college students' experiencing of digital media, within social and individual contexts.

Materials and Coding

Scope of the Literature

The initial literature review was based on Proquest Social Sciences databases, Google Scholar, and other publications we were aware of, as well as foundational publications from the 2010 literature review. New concepts or issues arising from the focus groups led us to seek additional relevant publications. Once the typology was fully developed, we then conducted two literature searches. Both were for the period January 1, 2010, to January 1, 2018, for full-text articles in scholarly peer-reviewed journals, or book chapters. Search terms were (student* AND (college OR university)) AND (digital OR social media OR laptop OR mobile phone OR smartphone OR personal computer OR tablet computer OR IPad OR Internet OR World Wide Web). We first searched in abstracts in Proquest (ERIC, PsycARTICLES, PsycINFO, Sociological Abstracts), retrieving 65 publications, of which 7 were relevant. Then we searched in the title or abstract in the Social Sciences Citation Index, which returned 3,896 publications; these were sorted by relevance (using the SSCI feature), and the titles and abstracts of the top 10% were read for relevance. Publications about "young adults" were included if they specifically indicated college ages. Publications were not included about: use of media for campus campaigns, interventions, or activism; studies of technology for pedagogy or educational policy, or evaluation of digital media use in the classroom on performance, unless from the students' perspective; and samples of college students without explicit focus on media use. Finally, several recent, highly relevant books and book chapters were added.

From all these sources, we identified 218 publications. Thus our review is extensive and well-grounded and -developed, but is neither comprehensive nor statistically representative. Where possible, we obtained the full publication (.pdf, .html, .docx); 26 were not available, so we used the title and abstract. These were imported into NVIVO 11, along with the full coding typology (component code, subcodes, and subsubcodes, each of which can be aggregated to its higher level of analysis). We prepared a spreadsheet with the reference and abstract for each publication. The articles and spreadsheet were separated into three sets, grouped alphabetically, one for each author.
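The inclusion logic above (a student/college filter, an OR-list of digital media terms, and a 2010–2018 window) can be expressed compactly. A minimal sketch in Python, assuming records exported as simple dictionaries with title, abstract, and year fields (the field names and helper function are illustrative, not the databases' own interfaces):

    import re

    MEDIA_TERMS = [
        "digital", "social media", "laptop", "mobile phone", "smartphone",
        "personal computer", "tablet computer", "ipad", "internet", "world wide web",
    ]

    def matches_inclusion(record):
        """Apply the chapter's stated search logic to one exported record (assumed dict)."""
        text = (record.get("title", "") + " " + record.get("abstract", "")).lower()
        student = bool(re.search(r"student\w*", text)) and ("college" in text or "university" in text)
        media = any(term in text for term in MEDIA_TERMS)
        in_window = 2010 <= int(record.get("year", 0)) <= 2017  # Jan 2010 - Jan 2018, approximated by year
        return student and media and in_window

    # e.g., screened = [r for r in exported_records if matches_inclusion(r)]

The further exclusions listed above (campus campaigns, pedagogy-only evaluations, samples without an explicit media focus) are the kind of judgment that screening by reading, rather than a keyword filter like this, has to supply.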


The Media Mastery Typology

The media mastery typology includes three sets of contextual factors or occasions for media mastery (Technology, Social Aspects, and Individual Aspects), and a set of Media Mastery factors. These media mastery factors are ways in which media can more or less "master" the user, and ways in which users can attempt to more or less "master" media. Table 9.1 summarizes and briefly defines these factors (codes) and their levels of subcodes. The NVIVO project also included a Context category (location of media use by the respondents, the type of respondents, and country of the study), a new emergent category of Theories and Frameworks (explicit naming of a theory or a model, as well as of primary concepts, used to frame or motivate the study), and a working category for emergent New Codes for later discussion, relabeling, and integration into the appropriate subcodes.

Table 9.1 Media Mastery Typology Codes and Sublevels

Each typology code is listed with its subcodes (and their brief definitions) and the range of subsubcodes (first and last) under each subcode.

Technology (refers to the technology—devices & sites, features, and uses)
• Devices, services, sites (explicit mention of devices, services, sites): alarm clock to YouTube
• Features (mention of attributes, affordances, features, abilities of the technology (device, service, etc.)): accessories to technical aspects
• Uses (ways, purposes, or activities for which respondents use the technology; also extent or type of use): achievement/productivity/completion to writing

Social Aspects (emphasizing the social and relational aspects and contexts—relations, influence, and self-presentation)
• Social relations (bonds, relationships, interactions, social use contexts): affection to social ties-network
• Social influence (process, concern, behavior related to influence of one's social context): co-dependence to traditional social values
• Self-presentation (issues of and representation of self in social contexts): authenticity to superficial

Individual Aspects (individual aspects involved in or arising from or associated with use—problematic use, health, individual traits, individual cognition)
• Problematic use (questionable or harmful use, whether to self or others): addiction-hooked to withdrawal
• Health (individual psychological, physical, spiritual health issues, needs, concerns): adjustment to symptoms
• Traits (individual personality or psychological traits): disinhibition to self-esteem/self-worth
• Cognition (rational, mental information processing and outcomes (attention, learning, recall, etc.)): academic performance to recall

Media Mastery (aspects related to use, management, and implications of the technology, including contradictions, obstacles, using the content, access, boundaries, and awareness of that use)
• Access (access to or accessing the device, information, self and others): access to social coordination ability
• Boundaries (when or where tech use crosses boundaries; where the user becomes involved across system or social boundaries; the interface between tech and social): accountability-responsibility to work-nonwork
• Constraints (contradictory, paradoxical, unintended, positive and negative uses or consequences): ambivalence to unintended consequences
• Managing content (using the tech to create, process, use, obtain content, including about self): ambiguity-uncertainty to temporary or ephemeral
• Obstacles (difficulties in using technology): access (technical difficulties) to viruses-malware
• Use awareness (level and type of user awareness, intention, consciousness, self-reflexivity, decision making about their use): attitudes about one's use to use of multiple media

Note: Only the first and last subsubcodes for each subcode are listed here. See Table 9.2 for a list of each subsubcode for the Media Mastery subcodes. The full list of codes, subcodes, subsubcodes, and subsubsubcodes, with short operational definitions, is available in the supplemental codebook.
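For readers who prefer to see the hierarchy explicitly, the typology of Table 9.1 can be thought of as a nested mapping from codes to subcodes to subsubcodes. A minimal, abbreviated sketch in Python (the names are taken from the table, but the structure itself is our illustration, not the NVIVO codebook format, and only the first and last subsubcodes are shown):

    # Abbreviated slice of the typology: code -> subcode -> example subsubcodes.
    # The "..." strings stand in for the many subsubcodes omitted here.
    MEDIA_MASTERY_TYPOLOGY = {
        "Technology": {
            "Devices, services, sites": ["alarm clock", "...", "YouTube"],
            "Features": ["accessories", "...", "technical aspects"],
            "Uses": ["achievement/productivity/completion", "...", "writing"],
        },
        "Social Aspects": {
            "Social relations": ["affection", "...", "social ties-network"],
            "Social influence": ["co-dependence", "...", "traditional social values"],
            "Self-presentation": ["authenticity", "...", "superficial"],
        },
        "Individual Aspects": {
            "Problematic use": ["addiction-hooked", "...", "withdrawal"],
            "Health": ["adjustment", "...", "symptoms"],
            "Traits": ["disinhibition", "...", "self-esteem/self-worth"],
            "Cognition": ["academic performance", "...", "recall"],
        },
        "Media Mastery": {
            "Access": ["access", "...", "social coordination ability"],
            "Boundaries": ["accountability-responsibility", "...", "work-nonwork"],
            "Constraints": ["ambivalence", "...", "unintended consequences"],
            "Managing content": ["ambiguity-uncertainty", "...", "temporary or ephemeral"],
            "Obstacles": ["access (technical difficulties)", "...", "viruses-malware"],
            "Use awareness": ["attitudes about one's use", "...", "use of multiple media"],
        },
    }

Because each subsubcode sits under exactly one subcode and code, paragraph-level codings can be aggregated upward simply by dropping the lower levels.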

The Coding Process

Before coding, the three authors carefully reviewed and discussed every code in the typology. We next read our spreadsheet's set of titles and abstracts to get an overview of the range of topics and terms. For coding, we first read each of our publications (excluding abstract and references) to determine and code for (1) the motivating theory, model, or concept, and (2) the population and country context. For each publication, we then checked each paragraph (excluding abstract and references) for (3) any indicators of media mastery codes. If so, we coded that paragraph for (4) any and all specific subsubcodes of the media mastery component, (5) any instance of a specific technology/device/site subsubcode, (6) any instance of a subcode in the other two Technology subcodes, or of a Social or Individual subcode, and (7) any emergent codes into Theories and Frameworks, or into New Codes, for later discussion, grouping, and inclusion. Finally, we each also maintained, and later discussed, a journal within NVIVO to document any questions or suggestions.

We first each coded several of the same articles, and met to discuss ambiguities or additions. From then on, each author worked on their own one-third of the articles, grouped alphabetically. In the next week we separately coded 10 articles each, met to discuss the

codings and clarifications, and documented any changes or additional codes. The following week we repeated the coding and discussion process with the next 10 articles each. Finding few additional codes by that time, we then proceeded to code and then discuss the next 20 articles each from our separate sets, and repeated that process until all articles were coded. At each meeting we discussed any new Theories and Frameworks codes or any New Codes, and then each updated our coding file with those so all coders had access to any new codes. Because both the list of codes and the coding process evolved most during the early stages, once all articles were coded we removed the codes from each of our first 10 articles and recoded them using the full coding set and coding procedures. When all articles were coded, we then met to discuss all the new Theories and Frameworks, and the New Codes, and grouped similar ones, especially those with few instances. For example, under Theories and Frameworks, bridging and bonding components were grouped with the general theory of social capital; under New Codes, managing and expressing emotions were grouped under emotions. Finally, we decided where to move each of the New Codes into the prior codes. The final typology reflects this extensive, iterative, multi-year, multi-study, and multi-data process. The typology and coding operationalizations, as well as the full list of analyzed references, are available from the first author.
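The workflow just described produces, for each coder's share of the articles, a set of paragraph-level code assignments. A minimal sketch of pooling them, assuming each coder's output is a list of (article, paragraph, component, subcode, subsubcode) tuples rather than anything specific to NVIVO:

    from collections import defaultdict

    def merge_codings(*coder_outputs):
        """Pool paragraph-level codings from one or more coders into one index.

        Each coder_output is assumed to be an iterable of
        (article_id, paragraph_id, component, subcode, subsubcode) tuples.
        Returns {(article_id, paragraph_id): set of (component, subcode, subsubcode)}.
        """
        by_paragraph = defaultdict(set)
        for codings in coder_outputs:
            for article, paragraph, component, subcode, subsubcode in codings:
                by_paragraph[(article, paragraph)].add((component, subcode, subsubcode))
        return by_paragraph

An index like this is all that the co-occurrence counts reported in Table 9.2 require (see the sketch following the table).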

Description of the Sample

Technologies, Context, and Theories were not the focus of the study, were coded for occurrence anywhere in the article, and thus were not specifically related to text indicating the media mastery components. Therefore, they are not included in the co-occurrence review later in the chapter.

A wide range of technologies appeared throughout the retrieved literature. These include 58 devices (from alarm apps to YouTube), representing 182 articles and 380 codings; 9 features (from accessories to personalizing settings), representing 12 articles and 20 codings; and 51 uses (from achievement to writing), involving 30 articles and 60 codings. Nearly all (202) of the 242 population subsubcodes were of college students, with a few samples consisting of adolescents or young adults that included college students. There were 209 country subsubcodes, the most referring to the United States (88), China (32), Turkey (14), and Taiwan (11), with at least one coding for another 36 countries (from Argentina to the United Kingdom). Of the 218 articles, 133 mentioned a total of 131 theories, models, frameworks, or primary concepts, with 239 coding instances, ranging from the accessibility hypothesis (Yang et al., 2017) to vertical discourse (Bennett & Maton, 2010). The theories and frameworks mentioned in the most articles included social capital (14 articles), the uses-and-gratifications perspective (12), addiction (10), internet addiction (9), digital natives and social cognitive theory (6 each), attachment theory and problem behavior theory (4), and cognitive-behavioral theory, cyberbullying, diffusion of innovations, life satisfaction, personality, smart phone addiction, and subjective well-being (3).

Table 9.2 Co-occurrences of Media Mastery Subcodes with Social and Individual Aspects Subcodes

For each Media Mastery subsubcode, the row gives the number of co-occurrence codings with the Social Aspects subcodes and then the Individual Aspects subcodes, in this order: Social relations (SC:24, A:105, R:307); Social influence (SC:21, A:36, R:82); Self-presentation (SC:17, A:49, R:114); Problematic use (SC:22, A:86, R:265); Health (SC:25, A:117, R:309); Traits (SC:9, A:58, R:109); Cognition (SC:6, A:69, R:217).

Access (A:158, R:750)
1. access: 6, 0, 0, 9, 17, 1, 6
2. access (to a medium): 0, 0, 0, 0, 2, 1, 3
3. access to others: 0, 1, 0, 0, 0, 0, 0
4. accessibility—content & people: 105, 28, 26, 74, 64, 29, 38
5. availability of self and others through media: 75, 8, 10, 20, 25, 20, 13
6. collaboration through media: 13, 6, 0, 4, 4, 2, 11
7. convenience—ease: 17, 2, 4, 17, 16, 4, 9
8. notifications: 1, 0, 0, 0, 0, 0, 0
9. passive—low effort: 4, 2, 0, 6, 3, 2, 3
10. social coordination ability: 8, 1, 1, 0, 6, 1, 3

Boundaries (A:161, R:894)
1. accountability—responsibility: 11, 5, 5, 9, 3, 6, 0
2. anonymity: 7, 1, 3, 9, 4, 7, 0
3. audience: 8, 5, 12, 5, 3, 3, 2
4. balancing online and offline self: 1, 0, 2, 2, 1, 1, 0
5. balancing online and offline social networks: 2, 1, 1, 0, 1, 0, 0
6. barriers to or facilitators of integration across boundaries: 0, 0, 0, 0, 0, 0, 0
7. blending—blurring: 12, 5, 4, 4, 5, 3, 17
8. constant connection: 35, 9, 8, 28, 31, 9, 14
9. context collapse: 1, 0, 5, 1, 2, 0, 0
10. continuous co-presence: 3, 2, 0, 0, 1, 0, 3
11. divides: 12, 0, 4, 7, 8, 4, 13
12. identity disjuncture: 0, 0, 0, 0, 2, 0, 0
13. parental access: 1, 1, 0, 1, 1, 0, 0
14. permanence: 3, 3, 4, 3, 1, 0, 1
15. perpetual—persistent—contact (subset): 9, 1, 2, 4, 3, 0, 3
16. personal space: 0, 0, 0, 0, 1, 0, 0
17. pervasive awareness: 4, 2, 0, 2, 3, 2, 0
18. privacy: 9, 4, 12, 17, 6, 2, 4
19. public: 5, 4, 5, 3, 2, 0, 2
20. safety: 8, 2, 6, 6, 8, 1, 0
21. self-broadcasting: 22, 10, 41, 5, 6, 9, 2
22. self-editing or self-censorship: 0, 0, 0, 0, 0, 0, 0
23. surveillance: 5, 3, 1, 4, 3, 3, 1
24. transitions: 25, 2, 4, 3, 18, 3, 7
25. trust: 5, 1, 3, 0, 2, 1, 0
26. ubiquity: 3, 1, 2, 4, 2, 1, 5
27. visibility—transparency: 13, 5, 10, 8, 8, 5, 3
28. vulnerability: 14, 3, 4, 27, 36, 14, 3
29. watchfulness: 5, 4, 0, 4, 2, 2, 5
30. work—nonwork: 0, 0, 0, 1, 0, 0, 4

Constraints (A:101, R:291)
1. ambivalence: 2, 2, 1, 4, 2, 1, 1
2. contradiction—paradox—tension: 30, 12, 9, 35, 30, 8, 26
3. double-standards: 0, 1, 0, 0, 1, 0, 0
4. interpretive flexibility (contrasts): 0, 0, 0, 0, 0, 0, 0
5. irony: 1, 3, 0, 1, 2, 1, 1
6. loss or change of some traditional skills or activities or relations: 17, 2, 3, 9, 7, 5, 7
7. negotiating: 2, 0, 0, 0, 0, 0, 1
8. unintended consequence: 7, 4, 1, 14, 14, 3, 6

Managing Content (A:129, R:551)
1. ambiguity—uncertainty: 1, 0, 1, 1, 0, 0, 1
2. awareness: 3, 2, 3, 7, 2, 1, 2
3. commodification: 5, 0, 3, 2, 0, 0, 0
4. consumption: 7, 4, 3, 7, 6, 0, 1
5. control over own content (5.4 managing content): 0, 0, 0, 0, 1, 0, 0
6. gratifying—satisfying: 26, 8, 12, 16, 13, 10, 14
7. media literacy (learning how to use media): 4, 0, 1, 5, 5, 1, 5
8. media multitasking: 19, 10, 4, 16, 24, 7, 82
9. personal info: 23, 7, 22, 17, 7, 4, 1
10. produsers—to share or post: 10, 7, 10, 6, 2, 4, 4
11. temporary or ephemeral: 0, 0, 0, 0, 0, 0, 1

Obstacles (A:49, R:213)
1. access: 4, 0, 1, 4, 3, 1, 6
2. battery—no elec outlet: 0, 0, 0, 0, 0, 0, 0
3. break drop lose phone or computer: 0, 0, 0, 0, 0, 0, 0
4. change in technology; updating or upgrading: 0, 0, 0, 0, 0, 0, 1
5. compatibility: 0, 0, 0, 0, 0, 0, 0
6. complexity: 0, 0, 0, 0, 0, 0, 1
7. connections: 3, 1, 0, 0, 1, 0, 5
8. costs (financial, time, psych): 4, 0, 2, 6, 5, 0, 11
9. distracting: 6, 2, 0, 16, 8, 1, 32
10. frustration: 1, 2, 0, 1, 2, 0, 0
11. info overload: 1, 0, 0, 1, 5, 1, 5
12. interference: 4, 0, 0, 3, 4, 1, 7
13. interruptions: 2, 3, 0, 2, 2, 1, 5
14. passwords: 0, 0, 0, 0, 0, 0, 0
15. spam: 0, 0, 0, 0, 0, 0, 0
16. tech problems: 1, 0, 0, 1, 1, 0, 1
17. techno-stress: 1, 2, 0, 2, 2, 0, 0
18. time zones: 0, 0, 0, 0, 1, 0, 0
19. viruses—malware: 0, 0, 0, 0, 0, 0, 0

Use Awareness (A:139, R:630)
1. attitudes about one's use: 16, 6, 4, 16, 12, 11, 10
2. balance of active sharing or just viewing: 0, 0, 2, 0, 0, 0, 0
3. balancing self and group needs: 8, 5, 3, 1, 2, 1, 2
4. choices—how when use: 24, 9, 14, 16, 17, 8, 9
5. expertise: 7, 2, 3, 6, 4, 8, 21
6. filtering: 3, 1, 1, 2, 1, 0, 0
7. media comparisons: 9, 8, 3, 2, 0, 1, 2
8. media convergence: 1, 0, 0, 0, 0, 2, 0
9. media habit: 0, 0, 0, 0, 0, 0, 0
10. meta-attention: 0, 0, 0, 1, 0, 0, 0
11. monitoring or checking frequently: 0, 1, 0, 1, 0, 0, 1
12. multiple conversations: 1, 0, 0, 0, 0, 0, 0
13. preparing responses: 5, 0, 4, 1, 3, 2, 0
14. self-regulation: 4, 0, 0, 19, 9, 3, 7
15. strategizing media use for coordination: 0, 0, 0, 0, 0, 0, 1
16. taken-for-grantedness: 0, 0, 0, 7, 3, 0, 6
17. techno-resistance: 1, 1, 2, 0, 0, 0, 1
18. tool awareness: 4, 2, 2, 9, 1, 1, 5
19. use of multiple media: 1, 1, 1, 4, 3, 4, 2

Note: SC = number of subsubcodes; A = number of unique articles coded; R = number of times the subcode was used. As explained in the text, three components are not analyzed here: Technology: devices, services, sites (SC:58, A:182, R:380), features (SC:9, A:12, R:20), and uses (SC:51, A:30, R:60); Context: location of the respondents (SC:8, A:8, R:10), population (respondent type) (SC:12, A:202, R:242), and country of the sample (SC:37, A:195, R:209); and Theories and Frameworks (SC:131, A:133, R:239).

Table 9.2 lists each of the Social, Individual, and Media Mastery codes and subcodes, and, in the case of Media Mastery, the subsubcodes, along with the number of subsubcodes, the number of articles jointly coded for a specific Media Mastery subsubcode and either a Social or an Individual subcode, and the number of times each code was used, all provided by NVIVO. For the following review, we then retrieved co-occurrences in NVIVO of media mastery subsubcodes with each of the social aspects and individual aspects subcodes. Here, each of the authors coded two different sets of the subsubcodes, so that coders and materials were crossed between coding and reviewing. Only a few examples from the most frequent and/or most interesting or unique co-occurrences are used. (All citations with three or more authors are referred to as "et al.")
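The co-occurrence retrieval itself is a straightforward cross-tabulation over the paragraph-level index: count the coded segments in which a Media Mastery subsubcode appears together with a Social or Individual subcode. A minimal sketch, continuing the illustrative record structure used earlier (NVIVO reports both article counts and coding counts; this simplified version counts coded paragraphs only):

    from collections import Counter

    MASTERY = "Media Mastery"
    CONTEXT = {"Social Aspects", "Individual Aspects"}

    def cooccurrence_counts(by_paragraph):
        """Cross-tabulate Media Mastery subsubcodes with Social/Individual subcodes.

        by_paragraph maps (article_id, paragraph_id) to a set of
        (component, subcode, subsubcode) tuples, as in the earlier sketch.
        """
        counts = Counter()
        for codes in by_paragraph.values():
            mastery_subsubs = {subsub for comp, _sub, subsub in codes if comp == MASTERY}
            context_subs = {(comp, sub) for comp, sub, _subsub in codes if comp in CONTEXT}
            for subsub in mastery_subsubs:
                for context_sub in context_subs:
                    counts[(subsub, context_sub)] += 1
        return counts

Sorting the resulting counts by value reproduces, in spirit, the rankings discussed below (e.g., accessibility of content and people with social relations at the top of the Access component).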

Review: Co-occurrences of Media Mastery Components with Social and Individual Aspects

Access

The most frequent association of the subsubcodes for media access is accessibility of content and people, with social relations. The second most frequent is availability of self and others through media, also as it relates to social relations. The third most frequent is accessibility of content and people, in relation to problematic use for individuals. Accessibility of content and people in association with the health of individuals is also frequent.

Accessibility of content and people, and social relations. Research on college students' use of communication technologies such as cell phones and internet platforms suggests that the students are "power users" of such communication technologies (Abeele & Roe, 2011). These communication tools are significant because they enable students to stay in touch with family and friends at home, and also with new college friends, as well as to search for information and enjoy online leisure activities. Aharony (2017) focused on the functions provided by mobile phones, and how these relate to students' personality characteristics and motivation. Openness to experience and self-disclosure were important personality characteristics associated with mobile phone use. College students are particularly engaged in use of social networking sites (SNSs), which provide them with access to more information and more experiences than they would have in a more closed university environment (Chen & Marcus, 2012). If properly framed, these expanded exposures through SNSs can facilitate learning for university students, through the use of new social connections to share ideas, build their new student identities, and develop their own learning paths. Thus, SNSs seem to facilitate students' transition to their new environment, both in terms of socialization and a sense of connection to their institution (DeAndrea et al., 2012; Gray et al., 2013).

In their recent book Technology and engagement, Rowan-Kenyon and Alemán (2018) use the term "ecology of transition" to describe how social media were important in making it meaningful for new students to be in college, as well as in assisting their integration into the university setting. However, the use of social media and being online can also be a double-edged sword for students:

"Well, I think that it's really bad if I'm not on top of my e-mail, and it's really bad if I'm not up to date on the class stuff that I have to do on the Internet. I also definitely value keeping in touch with friends. So all those are things that I really like to be able to do. But it's tough, because it becomes a real time tradeoff. Often when I do those things, I end up going to other places on the Internet that aren't so valuable to me." (Davis, 2011, p. 1972)

Accessibility of content and people, and problematic use. In addition to being communication tools, mobile phones provide access to the Internet and are digital environments where students can seek entertainment, shop, and manage finances (Gökçearslan et al., 2016). However, overuse creates new social problems (Bian & Leung, 2015). The dramatic increase in smartphone use worldwide has resulted in problematic use related to accessibility of content and people. Smartphone addiction is a concern in many countries (Hong et al., 2012). According to Demirci et al. (2015), smartphone addiction can be defined as the overuse of smartphones to the extent that it disturbs users' daily lives. These authors, like Aker et al. (2017), find that psychological problems such as depression and anxiety, and challenges such as insomnia and lack of family social support, can predict smartphone addiction. Chen et al. (2016) find that both internet and mobile phone addiction are closely related to interpersonal problems. According to Aladwani and Almarzoug (2016), low self-esteem also correlates with compulsive use of social media. More generally, social media and easy access to the internet seem to make students more vulnerable to compulsive media use. However, for people with interpersonal problems like social anxiety, face-to-face interaction can be challenging, so having contact with people on Facebook and via their smartphone could be easier (Clayton et al., 2013).

Several researchers are also concerned with how overuse of smartphones can have damaging effects on students' academic performance (Aljomaa et al., 2016). Students may use the phone and be inattentive during lectures, or they may disturb others by sharing content like new tones, songs, and YouTube videos with classmates and fellow students. However, as these authors emphasize, one should not forget the potential positive effects of smartphones in facilitating communication and in sharing information among teachers and students. Also, many students develop various strategies to manage the distractions of smartphones and laptops (Ames, 2013). Still, controlling intrusions can be challenging because both work and distractions are present on the same devices. Thus, multi-tasking is a constant temptation for a number of students (Flanagin & Babchuk, 2015). This is the reason why Chen et al. (2016) characterize mobile phones as a "double-edged sword" for young adults. Thus, Flanagin and Babchuk (2015) characterize

social media as "academic quicksand"; when you get in, it is hard to escape, even though students describe how they try hard to manage social media. A problematic aspect of the increased accessibility, from the viewpoint of the film and music industries, is increasing digital piracy (Duarte et al., 2016). Even though the internet has increased possibilities for the distribution of their products, digital piracy is a daily worry for these companies. Another problematic aspect of the increasing accessibility of people via new technologies is the growing phenomenon of cyberbullying (Crosslin & Golman, 2014). Experiencing cyberbullying can be so detrimental to the victims that some of them—like the media-exposed cases of Tyler Clementi and Jessica Logan—have committed suicide.

Availability of self and others through media, and social relations. Smartphones and social media are central in making self and others available, and thus impact people's lives and especially their social relations (Amankwah & Ha, 2015). These authors discuss smartphones as providing great self-broadcasting power, often through the use of SNSs. However, while smartphones demand attention from users, many students would emphasize that they would not interrupt an F2F interaction with a phone call (Ames, 2013). A number of students explained that they avoided checking messages or even having their phone out when with other people or in special social situations (like a date or a dinner party). Others went out of their way not to be too accessible by phone. Students, like other young people, are increasingly dependent on social networking sites for their socialization, information-seeking, and self-broadcasting. Fang and Ha (2015) claim that students' SNS consumption is positively associated with social capital and social support, especially for individuals with low psychological resources. However, Manago et al. (2012) ask whether there is a trade-off between having a large network on SNSs like Facebook and being able to develop intimacy and social support among fellow emerging adults. Their results confirm that Facebook mainly facilitates more distant kinds of relationships, like acquaintances and activity-based connections, while also reinforcing and expanding the number of close relationships. According to Manago and co-authors, the major function of people's status updates was emotional disclosure, which plays an important role in developing intimacy. These results indicate that the nature of intimacy is being transformed, and that large networks were related to higher levels of life satisfaction, and also of perceived social support.

Boundaries

The most frequent media boundaries subsubcode co-occurrences are self-broadcasting as it relates to self-presentation; vulnerability with both health and problematic use; constant connection in association with social relations, health, and problematic use; and transitions with social relations.

Constant connection, with social relations. Constant connection easily creates tension, due to the social effect of mediated communication, multi-tasking, and having constant

technology access, especially facilitated through smartphones. Ames (2013) notes that so-called digital natives negotiate with (both embracing and rejecting) the social expectations enabled by new technologies. Marlowe et al. (2017) interviewed students in Auckland from five ethnic minority groups to examine the role of social media in their social interactions. New media especially influence their daily lives, affecting friendship and family networks, providing access to community engagement, and helping foster a sense of belonging in their diverse society. In recent decades, new web-based technologies and social media sites have also increasingly been integrated into learning contexts as well as daily life, becoming "inevitable" (Pilli, 2015, p. 345).

Constant connection, with health. Internet addiction is increasingly a risk area among college students and, along with drug use, has been identified as a risk factor for youth suicide (Aricak et al., 2015). Generally, it seems that smartphone addiction co-occurs with other social, domestic, and academic problems among students. Hassell and Sukalich (2016) note studies finding that higher levels of internet or social media use are negatively associated with life satisfaction. Turkle (2011, p. 276) underscores that "The time-consuming constant demands for attention and performance becomes stressful and distracting, limiting creativity and reflection." Hong et al. (2012) write about how mobile phone addiction relates to anxiety and self-esteem, claiming that mobile phone addiction "has an indirect effect upon the relationship between anxiety and mobile phone usage behavior and between self-esteem and mobile phone usage behavior" (p. 2158). Similarly, Hussain et al. (2017) report that smartphone dependency, mediated by constant use, can lead to anxiety when the phone is not available.

Constant connection, with problematic use. A number of students report how the smartphone allows for constant connection, being in contact with friends, wasting time, and playing games. Ames (2013), for example, reports how some students felt that they had lost their independence by being constantly connected and needing to check on and with people. They also expressed that, being constantly connected, they were hardly fully present anywhere. A number of students also expressed resistance towards these constant connection norms and habits, and noted that multi-tasking was a constant temptation or threat. Students depend on their technological devices to the extent that they feel anxious and tense when the technology is not readily available or when their attention is drawn towards what the technology has to offer (Bicen & Arnavut, 2015).

Self-broadcasting, with self-presentation. According to Aricak et al. (2015), social networking sites increasingly serve as mandatory experiences for young people's identity construction. Self-presentation could be a central part of this process. Chen and Marcus (2012) notice how SNSs provide new arenas for individuals to present themselves, access and broadcast information, nurture their social networks, and establish and maintain connections with others. Amankwah and Ha (2015) conducted a study of smartphones and self-broadcasting among college students via social media. As much as 85.2% of college students self-broadcast at least once a month by updating their status on SNSs.
Network size, years of experience using social media, and the time spent on social media predicted frequency of self-broadcasting (which occurs mainly within one’s network).

While most students set their profile as private or semi-private, that did not affect self-broadcast frequency. Students spend much of their lives in an electronic world, which continuously demands their attention. According to Dalton and Crosby (2013), social media have the most seductive influence on college students’ attention. Because social media have become so engaging for young students, students develop a digital identity, which is “the composite of images that individuals present, share, and promote for themselves in the digital domain” (p. 1). Kim and Lee (2011) distinguish between positive and more honest self-presentation on Facebook. With honest self-disclosure one is more likely to receive support from Facebook friends, which can be beneficial to students’ social well-being and happiness. Sponcil and Gitimu (2013) similarly suggest that intimate self-disclosures help produce greater intimacy in computer-mediated communication. Punyanunt-Carter and her colleagues (2017) discuss self-presentation on Snapchat, which is geared towards already solidified interpersonal relationships, like close friends and family. The information disclosed is high in intimacy, and often mundane. Thus they report that on Snapchat “the ‘true’ self can be represented rather than the ‘best’ self as is the norm with other social media sites” (p. 891).

Transitions, with social relations. The transition between high school and college is a major one, where students often move to a new location and have to establish new social and academic networks, while trying to maintain old networks with family and friends. Abeele and Roe (2011, p. 237) find that for new American students, “the transition to college involves the task of building up a new social network, and communication technologies play an important role in supporting this process.” Many students need support to cope with problems they might face in order to adjust to transitions (Fang & Ha, 2015). These authors argue for using the concept of self-efficacy to understand how young people continuously use information from both offline and online environments to reevaluate themselves. DeAndrea et al. (2012) discuss a social media intervention intended to increase incoming students’ feelings of connectedness to the university, reduce uncertainty about college, and influence positive expectancies, as understood within a social capital framework, to foster a healthy college transition.

Vulnerability, with health and problematic use. Internet addiction disorder has become a clinical concept over the last two decades. Numerous studies find that smartphone addiction is related to a number of psychological and behavioral problems. Kuang-Tsan and Fu-Yuan (2017) argue that smartphone addiction may also be related to life stress for university students. Floros et al. (2014) suggest that college students are particularly vulnerable to internet addiction disorder “due to the particular psychological and developmental characteristics of late adolescence/young adulthood, ready access to the Internet, and an expectation of computer/Internet use during studies” (p. 672). Students who scored high on internet addiction disorder were higher on psychopathology and distress; they were lonelier, had lower self-esteem, and reported more anxiety and depression than others. Moreno et al. (2015) found increased risk for problematic internet use and addiction among those with the most severe depression.

According to Polo et al. (2017), the mobile phone has caused traditional socialization spaces to be replaced by virtual ones. Mobile phone use can be risky for young people, especially because they use mobile phones almost constantly. This could be detrimental to young people’s psychological and social functioning. Heavy users are also more likely to become problematic or addicted users. Polo et al.’s results indicate that “age, field of knowledge, victim/aggressor profile, and hours of mobile phone use are crucial variables in the communication and emotional conflicts arising from the misuse of mobile” (p. 245). The large presence of self-injury sites on social media and YouTube is seen as alarming, as self-injury exposure is feared to be socially contagious, inspiring vulnerable individuals to experiment with self-harm (Jarvi et al., 2017). Some vulnerable individuals, such as those with a tendency to seek novel and intense sensations and experiences, may use dating apps to look for drugs or sexual intercourse (Choi et al., 2017). Cyberbullying is a very serious outcome of inappropriate use of technology, which has resulted in mental health problems among victims and even suicide (Crosslin & Golman, 2014). Moreover, cyberbullies themselves are negatively affected by their own bullying behaviors.

Constraints

The four most frequent co-occurrences involved contradictions/paradoxes/tensions (at the heart of media mastery) with social relations, problematic use, health, and cognition. These were followed by a loss or change of some traditional skills, activities, or social relations, and by unintended consequences related both to problematic use and to health. Here, we focus on four of these topics that are not already covered much in other sections.

Contradictions, and social influence. Social media help users keep in touch with peers and groups, a source of social influence, and enable access to social support. But social media can also create pervasive anxiety, a “fear of missing out” (FOMO) on salient activities and discussions experienced by those others (Alt, 2015). Indeed, a central contradiction of these new media is that while connectivity is generally positive, the need to constantly monitor others’ communication, and expectations from others for constant connectivity, create stress and excessive use. Many students are aware of, and concerned about, this contradiction (Ames, 2013). Also, in attempts to strengthen one’s group identity and gain status, users may post messages and photos of activities that are quite harmful (self-injury, for example; Jarvi et al., 2017).

Contradictions, and cognition. While thoughtful social media use can improve learning experiences (Castillo-Manzano et al., 2017), compulsive use can degrade academic performance (Aladwani & Almarzouq, 2016), due to factors such as studying for shorter periods and even being more susceptible to becoming victims of crime (Aljomaa et al., 2016). Some students do attempt to engage in what Ames (2013) calls “techno-resistance” to expectations for constant connectivity, by establishing boundaries or even disconnecting from their devices, aiming to reduce the negative cognitive implications of multitasking. Some studies help explain contradictions in results by showing that use and knowledge of different Facebook features or activities differentially affect academic outcomes (Wohn & LaRose, 2014).

Loss or change, with social relations. Many authors have argued that new media, as with prior communication technologies when they were new (Jensen, 1990; Marvin, 1990), are profoundly affecting individual and social relationships and norms (boyd, 2015; Turkle, 2011). For example, social networks may develop and endure based on members’ using similar technology and apps, in order to avoid inconveniences in communicating with everyone (Bicen & Arnavut, 2015). Or, because online social interactions are so common and normative, factors such as introversion and extraversion may be far less influential on people’s experienced lives (Yao et al., 2014), and users may be more aware of, and able to participate in, a more diverse array of identities and types of relationships (Yang, 2014). More subtly, people are more likely to communicate with multiple others online while with others offline, and even just stay “connected” while not actually exchanging messages (Vorderer et al., 2016).

Unintended consequences, with health. The very utility and attractiveness of smartphones, social media, and other digital devices lead to a variety of unintended consequences. Primary among these is internet/smartphone addiction or problematic internet use, associated with a wide variety of health dysfunctions, from depression to obesity (Li et al., 2015). Sleep deprivation may be a consequence of excessive device use (Demirci et al., 2015), with attendant schoolwork procrastination, and then students staying up late to (ineffectively) rush through their work (Li et al., 2015). As college Facebook posts and profiles present more images of alcohol use and drinking parties, from a peer norms theoretical perspective, viewers (incorrectly) perceive higher alcohol use as descriptively normative (Clayton et al., 2013), leading to more positive attitudes toward, and behaviors of, excessive drinking. The pervasiveness of and dependence on digital devices for schoolwork as well as social relations and entertainment may also be associated with musculoskeletal symptoms (Dockrell, Bennett, & Culleton-Quinn, 2015).

Managing Content

The most frequent co-occurrences with managing content (including people) involved having a gratifying-satisfying experience in relation to social relations as well as problematic use; media multitasking in association with social relations, health, and cognition; and personal information in the context of self-presentation and problematic use.

Gratifying and satisfying, and social relations. While using media for gratification can become problematic, the social component of media mastery provides an explanation for why media are gratifying. Social connection and social information are inherently embedded in media, especially social media, which have increased the availability and diversity of the connections and information available. In particular, the need for social support and connection motivates a great deal of media use. For example, New Zealand college students with ethnic minority or migrant identities used social media not only to establish and maintain intimacies, but also to exchange information about themselves and others to determine who will be admitted into existing friendship networks (Marlowe, Bartley, & Collins, 2017). Though the social support received through media use can improve well-being, the need for social interaction and new social information often becomes habitual (Meier et al., 2016). This suggests that media’s role in sharing social information can become a source of tension, as it both contributes to well-being and potentially detracts from it. For example, Meier et al. (2016) find that the constant checking that becomes habitual leads to usage conflicts, which can ultimately reduce well-being and task performance.

Gratifying and satisfying, and problematic media use. Mastering media includes using media for one’s own needs. People must manage content in ways that fulfill their own desires. However, there is evidence in the literature that even when people can use media to gratify their needs, that use can become problematic. Research on addiction to mobile devices and the internet finds that the reward experienced from media use can become dysfunctional and excessive. Meier et al. (2016), for instance, showed that students use media to provide relief, reward, and relaxation from work and from negative experiences. However, in pursuit of these rewards, students report that they begin to procrastinate, and this procrastination on academic work can become detrimental. Likewise, Chiu (2014) found that when experiencing life stressors, young adults use mobile phones to alleviate their negative emotions. The gratification and stimulation experienced from using media, however, led to addiction for those who were not capable of self-regulation. While all college students may experience the gratifications of using media, the motivations for, media choices in, and outcomes of that use may reflect individual differences.

Media multitasking, and cognition. Within the media mastery framework, media multitasking, or the splitting of attention between media and other tasks, is a method of managing the various content available. Thus far, cognition, or the ability to focus and learn, is the most commonly studied aspect of individual differences in the media multitasking literature. Media multitasking appears to be not only a method for managing content but also one for managing focus and learning. Some master this management; others do not. Ames (2013) reported that while some students report frequent media multitasking, the majority of students expressed that they had set rules in order to reduce the negative cognitive effects of media multitasking. This implies that young adults are sensitive and strategic in managing their media and media multitasking habits to avoid cognitive harm. In addition to frequency, college students differ in the types of tasks with which they media multitask. Fan et al. (2017) noted that those who display higher metacognition engaged in less irrelevant media multitasking during difficult learning tasks. This suggests that managing content via media multitasking involves managing the cognitive load and effects of the media used.

Media multitasking, and social influence. However, managing media via media multitasking behaviors is not driven only by cognitive preferences and capacities. Rather, Ames (2013) concluded that the social pressure to be available both to immediate surroundings and to extended networks formed a double standard that created pressure to media multitask. Students reported that media multitasking is coupled with constant guilt, both for not being fully present in their offline reality and for not being fully present in their online reality.

Personal information, and self-presentation. Within the construct of managing content, people’s desire to connect with others via the Internet requires them to manage and interpret the information they share with and receive from their social network. In our coding we referred to this subcomponent as personal information, and defined it as involving personal self-disclosure and information available about others. Personal information was commonly cross-coded with the individual component of self-presentation. The findings demonstrate the tension involved in mastering the sharing of personal information through media. Moreno et al. (2011) showed that young adults frequently expressed that they knew that people exaggerate and even misrepresent themselves on social networking sites. However, the students still found this information valuable and used it to form first impressions of others. They shared that sometimes the personal information they found about others was even accurate; they had friends whose SNS profiles reflect them well. The credibility of online information, even on social networks, was understood as flawed yet useful. However, loss of control or management over one’s personal information and images can severely affect one’s self-presentation, leading to cyberbullying, “revenge porn,” and even suicide (Virden, Trujillo, & Predeger, 2014).

Personal information, and health. Though the personal information shared online has social value, it can cause harm to young adults’ well-being. For instance, Tandoc et al. (2015) explain how the information about others found online is also used to inform the user about what is attractive, and how people’s feedback can be used to identify how attractive one is to one’s social network, leading to social comparison. Tandoc et al. (2015) provide an example of how young adults use this information to identify their social rank. They contend that if students find themselves unattractive, they often feel envious and depressed. Thus, though the information may be useful for navigating one’s social network, it also fuels comparison, which can be detrimental.

Obstacles

Physical and technical obstacles to college students’ use of media do not appear much in research publications, though they were mentioned in the focus groups. The most frequent co-occurrences were distracting, with problematic use, health, and cognition; costs, with problematic use and cognition; interference, with cognition; and access, with cognition.

Access, with cognition. It is obvious that not having access to relevant new media constitutes a grave challenge to students (Goode, 2010). While many students use mobile devices for academic practices, Fasae and Adegbilero-Iwari (2015) reported that many students are challenged by obstacles of low-quality internet connections and high data subscription costs. Ironically, some students are concerned that pervasive access to social media (diverting attention, energy, and time from academic work) may lead to obstacles to success later on (Flanagin & Babchuk, 2015). Further, the use of social media for online content creation is affected by more than just traditional digital literacy—it also depends on the kinds of peer support, practices, and technologies that university students have access to, and bring with them, in the first place (Brown et al., 2016), which also extends the concept of the digital divide. In turn, experience in digital creation can provide advantages in the global society, possibly widening certain kinds of disparities in access and use.

Costs, with problematic use. Intriguingly, smartphone costs are not only a form of obstacle to access, but also an aspect of problematic use, as over-dependence on smartphones can foster excessive spending on accessories, upgrades, apps, and data (Aljomaa et al., 2016). Indeed, some studies note that students want to have the most recent device or product regardless of price (Bicen & Arnavut, 2015).

Distracting, with health. Many studies refer to the “double-edged sword” nature of the mobile phone, which can provide personal, social, and business benefits as well as disadvantages and harm. For example, many refer to the distractions from one’s own use and the use by others, reducing focus and attention on activities and social relationships, and creating physical and mental health problems (Chen et al., 2016). People may turn to smartphone or internet over-dependence as a distraction from other health or life stress issues (Chiu, 2014; Kuang-Tsan & Fu-Yuan, 2017). Impaired inhibition and attention deficit hyperactivity disorder (ADHD) are associated with increased risk of Internet addiction (Dalbudak et al., 2015). The increased need to maintain constant connectivity, and engage in multitasking, can harm mental and emotional development and create ongoing distractions from relationships and self-reflection (Davis, 2011).

Distracting, with cognition. Digital device use during class creates distractions for the user, surrounding students, and even the instructor (Aljomaa et al., 2016; Jacobsen & Forste, 2011), negatively affecting the user’s academic performance (related to issues such as reduced details in class note-taking, less cognitive processing of the content, and poorer recall; Kuznekoff & Titsworth, 2013). Multitasking in general is frequent in class, and typically negatively affects students’ ability to learn content (Judd, 2014; Junco, 2012). Similar issues arise, but with much graver potential consequences, for students who are distracted by their devices while walking or driving (Kim & Kim, 2017).

Use Awareness

Here, the most frequent co-occurrences involved attitudes about one’s use, with social relations and problematic use; choices as to how and when to use a medium, with social relations, self-presentation, problematic use, and health; digital expertise, with cognitions; and self-regulation, in association with problematic use.

Attitudes about one’s use, and social relations. One aspect of media mastery becoming increasingly important with the ever-growing popularity of social media is people’s (especially young adults’) perceptions of media as a means of and context for social connection. Pilli (2015) found that students’ perceptions that Facebook was useful for their social adjustment and relationship maintenance explained why socially competent Facebook users exhibited better psychosocial well-being. This result highlights that individual dispositions play a role in students’ likelihood of having positive attitudes towards their media use.

Attitudes about use, and problematic media use. While problematic use includes addiction and dependency, it also includes other dangers such as cyberbullying and revenge porn. We found frequent co-occurrences between problematic use and the attitudes people have about their use of media. Virden et al. (2014) highlight how, especially among young adults, perceptions of the use of media (for instance, to explore their sexuality via sexting) affect their likelihood of using media in ways that put them at risk. They found that few young adults recognize the risk of engaging in these online sexual behaviors. Their interpretation of the risk of using media affects their vulnerability to that risk. Some studies (and our focus groups) find that students have generally positive attitudes about their digital media but are aware of wasting time, fear of missing out, over-dependence, and other problematic uses and effects; they feel they must continue to use their media, and are even resigned to doing so, on both psychological and pragmatic grounds (Turkle, 2011).

Choices, and self-presentation. Choices about how to use media are both individually and socially motivated as people manage their online identities in the face of potential context collapse (Thomas et al., 2017) and privacy issues. Hoy and Milne (2010) detailed, for instance, how privacy protection practices varied from lying to post-hoc changes and image management. These practices also varied by gender: men and women differed in their concerns about self-presentation as a privacy issue and their strategies to cope with potential problems, and those choices in turn were related to various media effects. With an understanding of which practices are least to most successful, examining the choices users make in self-presentation could lead to more targeted and effective interventions.

Choices, and health. Scholars are discovering ways to identify the profiles of usage that are more likely related to diminished well-being. For example, Park et al. (2013) found a relationship between depressive symptoms and uses of Facebook such that a user’s activeness and uses of features could be associated with specific symptoms, highlighting that the ways people use media can reveal and reflect the state of their mental and emotional health. The researchers explore the possibility of using these profiles of how and when people use media to improve or increase diagnostic capacity for depression.

Expertise, and self-presentation. In the articles discussing these two concepts, an important and interesting set of tensions arises. Axelsson (2010) argues that young adulthood is a special developmental period in which the need to express oneself becomes increasingly important. However, the ability to do so competently can rely on technological skills, including one’s ability to use, and understanding of, the internet. He contends that a lack of understanding and competence regarding the internet can translate into a lack of competence in developmentally integral capacities, including self-expression. Ishii et al. (2017) argue that young adults with greater communication competence prefer face-to-face communication for self-disclosure in order to most benefit from the increased cues. Their perspective implies that while all young adults need self-expression, the choice not to express oneself online might be due not to a lack of internet skills but rather to stronger traditional communication skills.

Expertise, and traits. Expertise, or the ability to use media with skill, is perhaps one of the most obvious forms of media mastery within use awareness. Chang et al. (2014) documented that traits such as internet self-efficacy not only increased confidence in an online course but also affected perceptions of the online course as relevant and were associated with better course performance. There were many similar findings across the literature.

Media comparison, and social influence. Group-level differences such as shared norms also contribute to one’s preferences, comparisons, interpretations, and uses of media. Cultural differences provide one example of such social influence. Ishii et al. (2017) concluded that US and Japanese college students perceived text messaging differently as a form of communication. The US students perceived text messaging as having more media richness, reduced cues, quickness, ubiquity of the sender and the receiver, satisfaction, effectiveness, and level of comfort than did the Japanese students. The authors proposed that these perceptions follow cultural norms of communication; for example, that Americans prefer direct communication. Students interviewed by Ames (2013) said that pressures such as being constantly available did lead them to alter the particular medium they used in order to meet both their own contexts and the expected contexts of their communication partners.

Self-regulation, and problematic media use. Self-regulation represents the capacity to control and manage one’s behaviors. It occurs frequently in the literature, and is an essential aspect of use awareness within the media mastery typology. Wu (2015) validated four dimensions of motivated attention and regulatory strategies by students using social media: perceived attention discontinuity, behavioral strategies, mental strategies, and social media notifications. Integrating those with a variety of related measures (from internet self-efficacy to academic achievement), Wu identified five categories of students with respect to attention and regulation: the motivated strategic, the unaware, the hanging-on, the non-responsive, and the self-disciplined. Self-regulation and problematic media use (from cyberbullying to cyberstalking to dependency and addiction) are frequently co-occurring concepts in the literature. For instance, Gökçearslan et al. (2016) explain that those who are low on self-regulation tend to experience more ego depletion and less focus, and therefore are more likely to engage in cyberloafing, where they do not contribute to or benefit from the group. Similarly, Jiang and Shi (2016) indicated that people with diminished self-control or trait-like self-regulation engage in problematic media use to alleviate negative emotions. Their engagement in problematic media use can also be diminished through interventions targeting self-control, which is the stable trait form of self-regulation. Thus self-regulation is one way through which the potentially mastered try to develop media mastery.

Conclusion

In general, this review shows how the concept, and very detailed typology, of media mastery pervades the more familiar contexts, analyses, and results of research on college students’ use of new media. Each subsection of the review can generate one or more implications of the media mastery framework. For example, mastery of media is a subjective experience that involves believing in one’s expertise and capacity to use media. Masters of media can engage in beneficial media multitasking, while those mastered by media might engage in harmful media multitasking. Media mastery may occur through individual interpretation but can be heavily influenced by the practices and expectations of one’s social network. The masters and the mastered may seek and react to the potential gratifications from media differently. The tensions in media mastery demonstrate how young adults attempt, sometimes successfully, sometimes not, to navigate the complexities of media use.

The media mastery framework allows a more analytical approach, by integrating a variety of prior and new perspectives, highlighting the relevance of many diverse concepts, and revealing associations among many otherwise disjointed or typically unlinked concepts. As a complement to the theoretical perspectives appearing in most of the articles, media mastery would have provided an especially relevant framework for some of the studies. Media mastery could identify and describe a phenomenon heretofore with no name or with a variety of unintegrated names (as summarized in the Related Concepts section), especially the simultaneous two-way mastery of and by media. Thus the media mastery perspective provides a lens into, and allows for nuance in, how users (here, college students) are potentially mastered by new media but also attempt to master those media. As just one example, this perspective on the diverse research on college students’ digital media use highlights the pervasive paradoxes and contradictions as manifestations of the tensions between attempts to master media and the ways in which media master us.

The media mastery framework illuminates the double-edged nature of media technology, and especially social media, in the lives of students as well as other young people. Moreover, the notion of media mastery may also capture the contradictory and mixed feelings (ranging from pleasure to guilt) that young students and others experience in their daily use of contemporary media technologies. The vast range of ways mastery or being mastered occurs—in association with social and individual aspects, among others—may also be a reflection of the complex, interdependent, and contextual nature of digital media. The detailed and extensively developed media mastery framework may help researchers think in new ways about what questions their work attempts to answer—i.e., what aspects of media mastery does their work highlight or extend?

Note

1. We acknowledge support for data collection and software through Dr. Ronald E. Rice’s endowed Arthur N. Rupe Professorship in the Social Effects of Mass Communication in the Department of Communication, UC Santa Barbara.

References from Introductory Material Best, P., Manktelow, R., & Taylor, B. (2014). Online communication, social media and adolescent wellbeing: A systematic narrative review. Children and Youth Services Review, 41(June), 27–36. Boell, S. K. & Cecez-Kecmanovic, D. (2014). A hermeneutic approach for conducting literature reviews and literature searches. Communications of the Association for Information Systems, 34(1), 257–286. boyd, d. (2015). It’s complicated: The social lives of networked teens. New Haven, CT: Yale University Press. Burchell, K. (2017). Everyday communication management and perceptions of use: How media users limit and shape their social world. Convergence, 23(4), 409–424. Chaffee, S. H. (1991). Explication. Newbury Park, CA: Sage. Couldry, N. (2012). Media society world. Cambridge, UK: Polity. David, P., Kim, J.-H., Brickman, J. S., Ran, W., & Curtis, C. M. (2015). Mobile phone distraction while studying. New Media & Society, 17(10), 661–679. Deacon, D., & Stanyer, J. (2014). Mediatization: Key concept or conceptual bandwagon? Media, Culture & Society, 36(7), 1032–1044. DeAndrea, D. C., Ellison, N. B., LaRose, R., Steinfield, C., & Fiore, A. (2012). Serious social media: On the use of social media for improving students’ adjustment to college. The Internet and Higher Education, 15(1), 15–23. DeSanctis, G. & Poole, M. S. (1994). Capturing the complexity in advanced technology use: Adaptive structuration theory. Organization Science, 5(2), 121–147. Deuze, M. (2012). MediaLife. Malden, MA: Polity. Gershon, I. (2010). Media ideologies: An introduction. Journal of Linguistic Anthropology, 20(2), 283–293. Hadar, L. L. & Ergas, O. (2018). Cultivating mindfulness through technology in higher education: A Buberian perspective. AI and Society, 1–9. https://link.springer.com/article/10.1007/ s00146-018-0794-z Haddon, L. (2003). Domestication and mobile telephony. In J. E. Katz (Ed.), Machines that become us (pp. 43–56). New Brunswick, NJ: Transaction Press. Helles, R. (2013). Mobile communication and intermediality. Mobile Media & Communication, 1(1), 14–19. Hilbert, M., Vásquez, J., Halpern, D., Valenzuela, S., & Arriagada, E. (2017). One step, two step, network step? Complementary perspectives on communication flows in twittered citizen protests. Social Science Computer Review, 35(4), 444–461.

280   Ronald E. Rice ET AL. Hjarvard, S. (2009). Soft individualism: Media and the changing social character. In K. Lundby (Ed.), Mediatization: Concept, challenges, consequences (pp. 159–177). New York, NY: Peter Lang. James, C. (2014). Disconnected: Youth, new media, and the ethics gap. (The John  D.  and Catherine T. MacArthur Foundation Series on Digital Media and Learning). Cambridge, MA: The MIT Press. Jarvenpaa, S.  L. & Lang, K.  R. (2005). Managing the paradoxes of mobile technology. Information Systems Management, 22(4), 7–23. Jensen, J. (1990). Redeeming modernity: Contradictions in media criticism. Newbury Park, CA: Sage. Jensen, K. B. (2010). Media convergence: The three degrees of network, mass, and interpersonal communication. London, UK & New York, NY: Routledge. Jones, M. R. & Karsten, H. (2008). Giddens’s structuration theory and information systems research. MIS Quarterly, 32(1), 127–157. Katz, J. E. & Rice, R. E. (2002). Social consequences of Internet use: Access, involvement and interaction. Cambridge, MA: The MIT Press. Klein, H. K. & Kleinman, D. L. (2002). The social construction of technology: Structural considerations. Science, Technology, & Human Values, 27(1), 28–52. LaRose, R., Connolly, R., Lee, H., Li, K., & Hales, K. D. (2014). Connection overload? A cross cultural study of the consequences of social media connection. Information Systems Management, 31, 59–73. doi:10.1080/10580530.2014.854097 Levy, D. M. (2017). Mindful tech: How to bring balance to our digital lives. New Haven, CN: Yale University Press. Ling, R. (2012). Taken for grantedness: The embedding of mobile communication into society. Cambridge, MA: The MIT Press. Livingstone, S. (2009). On the mediation of everything: ICA presidential address 2008. Journal of Communication, 59(1), 1–18. MacKenzie, D. & Wajcman, J. (Eds.) (1985). The social shaping of technology: How the refrigerator got its hum. London, UK: Open University Press. Madianou, M. & Miller, D. (2013). Polymedia: Towards a new theory of digital media in interpersonal communication. International Journal of Cultural Studies, 16(2), 169–187. Manago, A. M., Taylor, T., & Greenfield, P. M. (2012). Me and my 400 friends: The anatomy of college students’ Facebook networks, their communication patterns, and well-being. Developmental Psychology, 48, 369–380. doi:10.1037/a0026338 Marvin, C. (1990). When old technologies were new: Thinking about electric communication in the late nineteenth century. New York: Oxford University Press. Mihailidis, P. (2014). A tethered generation: Exploring the role of mobile phones in the daily life of young people. Mobile Media & Communication, 2, 58–72. doi:10.1177/2050157913505558 Miller, J. (2014). Intensifying mediatization: Everyware media. In A. Hepp & F. Krotz (Eds.), Mediatized worlds: Culture and society in a media age (pp. 107–122). London, UK: Palgrave Macmillan UK. Misra, S. & Stokols, D. (2012). Psychological and health outcomes of perceived information overload. Environment and Behavior, 44, 737–759. doi:10.1177/0013916511404408 Moreno, M. A., Jelenchick, L., Koff, R., Eikoff, J., Diermyer, C., & Christakis, D. A. (2012). Internet use and multitasking among older adolescents: An experience sampling approach. Computers in Human Behavior, 28, 1097–1102. doi:10.1016/j.chb.2012.01.016 O’Neill, B. & Hagen, I. (2009). Media literacy. Chapter 18. In S.  Livingstone & L.  Hadden (Eds.), Kids online: Opportunities and risks for children. Bristol, UK: Policy Press.

Media Mastery by College Students   281 Pinch, T. & Bijker, W. (1987). The social construction of facts and artifacts: Or how the sociology of science and the sociology of technology might benefit each other. In W.  Bijker, T. Hughes, & T. Pinch (Eds.), The social construction of technological systems: New directions in the sociology and history of technology (pp. 17–50). Cambridge, MA: The MIT Press. Postman, N. (1996). The end of education: Redefining the value of school. New York, NY: Vintage. Rainie, L., & Wellman, B. (2012). Networked. The new social operating system. Cambridge, MA: The Massachusetts Institute of Technology. Reinecke, L., Aufenanger, S., Beutel, M. E., Dreier, M., Quiring, O., Stark, B., Wölfling, K. & Müller, K. W. (2017). Digital stress over the life span: The effects of communication load and Internet multitasking on perceived stress and psychological health impairments in a German probability sample. Media Psychology, 20(1), 90–115. https://doi.org/10.1080/1521326 9.2015.1121832 Rheingold, H. (2012). Net smart: How to thrive online. Cambridge, MA: The MIT Press. Rice, R. E. (1999). Artifacts and paradoxes in new media. New Media and Society, 1(1), 24–32. Rice, R. E. & Hagen, I. (2010). Young adults’ perpetual contact, social connectivity, and social control through the Internet and mobile phones. In C. Salmon (Ed.), Communication yearbook, 34 (pp. 2–39). London: Routledge. Rice, R. E., Hagen, I., & Zamanzadeh, N. (2018). College students’ media mastery: Paradoxes in using computers and mobile phones. American Behavioral Scientist, 62(9), 1229–1250. Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York: Free Press. Schonert-Reichl, K. A. & Roeser, R. W. (Eds.) (2016). Handbook of mindfulness in education: Integrating theory and research into practice. Berlin, Germany: Springer. Silverstone, R. & Hirsch, E. (1992). Consuming technologies: Media and information in domestic spaces. London: Routledge. Silverstone, R. (2007). Media and morality: On the rise of the mediapolis. Cambridge, UK: Polity Press. Smith, A. (April 1, 2015). U.S.  smartphone use in 2015. Pew Research Center. http://www. pewinternet.org/2015/04/01/us-smartphone-use-in-2015/ Turkle, S. (2011). Alone together: Why we expect more from technology and less from each other. New York: Basic Books. Wang, Y., Niiya, M., Mark, G., Reich, S., & Warschauer, M. (2015). Coming of age (digitally): An ecological view of social media use among college students. Proceedings of the 18th ACM conference on computer supported cooperative work & social computing (pp. 571–582). New York: ACM. doi:10.1145/2675133.2675271 Williams, R. & Edge, D. (1996). The social shaping of technology. Research Policy, 25(6), 865–899. Wilmer, H. H. & Chein, J. M. (2016). Mobile technology habits: Patterns of association among device usage, intertemporal preference, impulse control, and reward sensitivity. Psychonomic Bulletin & Review, 23(5), 1607–1614. Wu, J. Y. (2015). University students’ motivated attention and use of regulation strategies on social media. Computers & Education, 89, 75–90. Xu, S., Wang, Z. J., & David, P. (2016). Media multitasking and well-being of university students. Computers in Human Behavior, 55, 242–250.

Cited Analyzed References

Abeele, M. V. & Roe, K. (2011). New life, old friends: A cross-cultural comparison of the use of communication technologies in the social life of college freshmen. Young, 19(2), 219–240.

282   Ronald E. Rice ET AL. Aharony, N. (2017). Factors affecting LIS Israeli students’ mobile phone use: An exploratory study. The Electronic Library, 35(6), 1098–1121. Aker, S., Sahin, M. K., Sezgin, S., & Oguz, G. (2017). Psychosocial factors affecting smartphone addiction in university students. Journal of Addictions Nursing, 28(4), 215–219. Aladwani, A. M. & Almarzouq, M. (2016). Understanding compulsive social media use: The premise of complementing self-conceptions mismatch with technology. Computers in Human Behavior, 60, 575–581. Aljomaa, S.  S., Al Qudah, M.  F., Albursan, I.  S., Bakhiet, S.  F., & Abduljabbar, A.  S. (2016). Smartphone addiction among university students in the light of some variables. Computers in Human Behavior, 61, 155–164. doi:http://dx.doi.org/10.1016/j.chb.2016.03.041 Alt, D. (2015). College students’ academic motivation, media engagement and fear of missing out. Computers in Human Behavior, 49, 111–119. Ames, M. G. (2013). Managing mobile multitasking: The culture of iPhones on Stanford campus. Proceedings of the 2013 conference on computer supported cooperative work (pp. 1487–1498). doi:10.1145/2441776.2441945 Arıcak, O. T., Dündar, Ş., & Saldaña, M. (2015). Mediating effect of self-acceptance between values and offline/online identity expressions among college students. Computers in Human Behavior, 49, 362–374. Axelsson, A. S. (2010). Perpetual and personal: Swedish young adults and their use of mobile phones. New Media & Society, 12(1), 35–54. Bian, M. & Leung, L. (2015). Linking loneliness, shyness, smartphone addiction symptoms, and patterns of smartphone use to social capital. Social Science Computer Review, 33(1), 61–79. Bicen, H. & Arnavut, A. (2015). Determining the effects of technological tool use habits on social lives. Computers in Human Behavior, 48, 457–462. doi:http://doi.org/10.1016/j. chb.2015.02.012 boyd, d. (2015). It’s complicated: The social lives of networked teens. New Haven, CT: Yale University Press. Brown, C., Czerniewicz, L., & Noakes, T. (2016). Online content creation: Looking at students’ social media practices through a Connected Learning lens. Learning, Media and Technology, 41(1), 140–159. Castillo-Manzano, J. I., Castro-Nuño, M., L¢pez-Valpuesta, L., Sanz-D¡az, M. T., & Yñiguez, R. (2017). To take or not to take the laptop or tablet to classes, that is the question. Computers in Human Behavior, 68, 326–333. Chen, B. & Marcus, J. (2012). Students’ self-presentation on Facebook: An examination of personality and self-construal factors. Computers in Human Behavior, 28(6), 2091–2099. Chen, L., Yan, Z., Tang, W., Yang, F., Xie, X., & He, J. (2016). Mobile phone addiction levels and negative emotions among Chinese young adults: The mediating role of interpersonal problems. Computers in Human Behavior, 55, 856–866. Chiu, S. I. (2014). The relationship between life stress and smartphone addiction on Taiwanese university student: A mediation model of learning self-efficacy and social self-efficacy. Computers in Human Behavior, 34, 49–57. Choi, E. P., Wong, J. Y., Lo, H. H., Wong, W., Chio, J. H., & Fong, D. Y. (2017). Association between using smartphone dating applications and alcohol and recreational drug use in conjunction with sexual activities in college students. Substance Use & Misuse, 52(4), 422–428. Crosslin, K. & Golman, M. (2014). “Maybe you don’t want to face it”: College students’ perspectives on cyberbullying. Computers in Human Behavior, 41, 14–20.

Media Mastery by College Students   283 Dalbudak, E., Evren, C., Aldemir, S., Taymur, I., Evren, B., & Topcu, M. (2015). The impact of sensation seeking on the relationship between attention deficit/hyperactivity symptoms and severity of Internet addiction risk. Psychiatry Research, 228(1), 156–161. Dalton, J. C. & Crosby, P. C. (2013). Digital identity: How social media are influencing student learning and development in college. Journal of College and Character, 14(1), 1–4. doi:10.1515/ jcc-2013–0001 Davis, K. (2011). A life in bits and bytes: A portrait of a college student and her life with digital media. Teachers College Record, 113(9), 1960–1982. DeAndrea, D. C., Ellison, N. B., LaRose, R., Steinfield, C. & Fiore, A. (2012). Serious social media: On the use of social media for improving students’ adjustment to college. The Internet and Higher Education, 15(1), 15–23. Demirci, K., Akgönül, M., & Akpinar, A. (2015). Relationship of smartphone use severity with sleep quality, depression, and anxiety in university students. Journal of Behavioral Addictions, 4(2), 85–92. Dockrell, S., Bennett, K., & Culleton-Quinn, E. (2015). Computer use and musculoskeletal symptoms among undergraduate university students. Computers & Education, 85, 102–109. Duarte-Hueros, J., Duarte-Hueros, A., & Ruano-Lopez, S. (2016). The audiovisual content downloads among university students. Comunicar, 48. Fan, Y., Gong, S., Wang, Y., & Wang, Z. (2017). The effects of learning task and learners’ metacognition on college students’ media multitasking behaviors. Advances in Psychology, 7(7), 895–902. Fasae, J. K. & Adegbilero-Iwari, I. (2015). Mobile devices for academic practices by students of college of sciences in selected Nigerian private universities. The Electronic Library, 33(4), 749–759. Flanagin, A. E. & Babchuk, W. A. (2015). Social media as academic quicksand: A phenomenological study of student experiences in and out of the classroom. Learning and Individual Differences, 44, 40–45. doi:http://dx.doi.org/10.1016/j.lindif.2015.11.003 Floros, G., Siomos, K., Stogiannidou, A., Giouzepas, I., & Garyfallos, G. (2014). The relationship between personality, defense styles, internet addiction disorder, and psychopathology in college students. Cyberpsychology, Behavior, and Social Networking, 17(10), 672–676. Gökçearslan, Ş., Mumcu, F. K., Haşlaman, T., & Çevik, Y. D. (2016). Modelling smartphone addiction: The role of smartphone usage, self-regulation, general self-efficacy and cyberloafing in university students. Computers in Human Behavior, 63, 639–649. doi:http://dx. doi.org/10.1016/j.chb.2016.05.091 Goode, J. (2010). The digital identity divide: How technology knowledge impacts college students. New Media & Society, 12(3), 497–513. Gray, R., Vitak, J., Easton, E. W., & Ellison, N. B. (2013). Examining social adjustment to college in the age of social media: Factors influencing successful transitions and persistence. Computers & Education, 67, 193–207. Hassell, M. D. & Sukalich, M. F. (2016). A deeper look into the complex relationship between social media use and academic outcomes and attitudes. Information Research: An International Electronic Journal, 21(4), n4. Hong, F. Y., Chiu, S. I., & Huang, D. H. (2012). A model of the relationship between psychological characteristics, mobile phone addiction and use of mobile phones by Taiwanese university female students. Computers in Human Behavior, 28(6), 2152–2159.

284   Ronald E. Rice ET AL. Hoy, M.  G. & Milne, G. (2010). Gender differences in privacy-related measures for young adult Facebook users. Journal of Interactive Advertising, 10(2), 28–45. doi:10.1080/15252019. 2010.10722168 Hussain, Z., Griffiths, M. D., & Sheffield, D. (2017). An investigation into problematic smartphone use: The role of narcissism, anxiety, and personality factors. Journal of Behavioral Addictions, 6(3), 378–386. Ishii, K., Rife, T. S., & Kagawa, N. (2017). Technology-driven gratifications sought through text-messaging among college students in the US and Japan. Computers in Human Behavior, 69, 396–404. Jacobsen, W. C. & Forste, R. (2011). The wired generation: Academic and social outcomes of electronic media use among university students. Cyberpsychology, Behavior, and Social Networking, 14(5), 275–280. Jarvi, S. M., Swenson, L. P., & Batejan, K. L. (2017). Motivation for and use of social networking sites: Comparisons among college students with and without histories of non-suicidal self-injury. Journal of American College Health, 65(5), 306–312. Jiang, Z., & Shi, M. (2016). Prevalence and co-occurrence of compulsive buying, problematic Internet and mobile phone use in college students in Yantai, China: Relevance of self-traits. BMC Public Health, 16(1), 1211. Johnson, C. A. (2015). The information diet: A case for conscious consumption. Sebastopol, CA: O’Reilly Media, Inc. Judd, T. (2014). Making sense of multitasking: The role of Facebook. Computers & Education, 70, 194–202. Junco, R. (2012). In-class multitasking and academic performance. Computers in Human Behavior, 28(6), 2236–2243. doi:http://dx.doi.org/10.1016/j.chb.2012.06.031 Kim, B. & Kim, Y. (2017). College students’ social media use and communication network heterogeneity: Implications for social capital and subjective well-being. Computers in Human Behavior, 73, 620–628. Kim, J. & Lee, J.-E.  R. (2011). The Facebook paths to happiness: Effects of the number of Facebook friends and self-presentation on subjective well-being. Cyberpsychology, Behavior, and Social Networking, 14(6), 359–364. doi: 10.1089/cyber.2010.0374 Kuang-Tsan, C. & Fu-Yuan, H. (2017). Study on relationship among university students’ life stress, smart mobile phone addiction, and life satisfaction. Journal of Adult Development, 24(2), 109–118. Kuznekoff, J. & Titsworth, S. (2013). The impact of mobile phone usage on student learning. Communication Education, 62(3), 233–252. Li, W., O’Brien, J. E., Snyder, S. M., & Howard, M. O. (2015). Characteristics of internet addiction/pathological internet use in US university students: A qualitative-method investigation. PloS One, 10(2), e0117372. Manago, A. M., Taylor, T., & Greenfield, P. M. (2012). Me and my 400 friends: The anatomy of college students’ Facebook networks, their communication patterns, and well-being. Developmental Psychology, 48, 369–380. doi: 10.1037/a0026338 Marlowe, J. M., Bartley, A., & Collins, F. (2017). Digital belongings: The intersections of social cohesion, connectivity and digital media. Ethnicities, 17(1), 85–102. Meier, A., Reinecke, L., & Meltzer, C.  E. (2016). “Facebocrastination”? Predictors of using Facebook for procrastination and its effects on students’ well-being. Computers in Human Behavior, 64, 65–76. doi:http://dx.doi.org/10.1016/j.chb.2016.06.011

Media Mastery by College Students   285 Moreno, M. A., Swanson, M. J., Royer, H., & Roberts, L. J. (2011). Sexpectations: Male college students’ views about displayed sexual references on females’ social networking web sites. Journal of Pediatric and Adolescent Gynecology, 24(2), 85–89. Park, S., Lee, S. W., Kwak, J., Cha, M., & Jeong, B. (2013). Activities on Facebook reveal the depressive state of users. Journal of Medical Internet Research, 15(10). Picone, I. (2017). Conceptualizing media users across media: The case for ‘media user/use’ as analytical concepts. Convergence, 23(4), 378–390. Pilli, O. (2015). The changes in social media usage: Students’ perspective. The Anthropologist, 22(2), 345–354. Polo, M. I., Lázaro, S. M., del Barco, B. L., & Castaño, E. F. (2017). Mobile abuse in university students and profiles of victimization and aggression. Adicciones, 29(4), 245–255. [Full article in Spanish] Punyanunt-Carter, N. M., De La Cruz, J. J., & Wrench, J. S. (2017). Investigating the relationships among college students’ satisfaction, addiction, needs, communication apprehension, motives, and uses & gratifications with Snapchat. Computers in Human Behavior, 75, 870–875. Rowan-Kenyon, H. T, & Alemán, A. M. M. (2018). Technology and engagement: Making technology work for first generation college students. New Brunswick, NJ: Rutgers University Press. Sponcil, M. & Gitimu, P. (2013). Use of social media by college students: Relationship to communication and self-concept. Journal of Technology Research, 4, 1–13. Tandoc, E. C., Ferrucci, P. & Duffy, M. (2015). Facebook use, envy, and depression among college students: Is facebooking depressing? Computers in Human Behavior, 43, 139–146. Thomas, L., Briggs, P., Hart, A., & Kerrigan, F. (2017). Understanding social media and identity work in young people transitioning to university. Computers in Human Behavior, 76, 541–553. Turkle, S. (2011). Alone together: Why we expect more from technology and less from each other. New York, NY: Basic Books. Virden, A. L., Trujillo, A., & Predeger, E. (2014). Young adult females’ perceptions of high-risk social media behaviors: A focus-group approach. Journal of Community Health Nursing, 31(3), 133–144. Vorderer, P., Krömer, N., & Schneider, F. M. (2016). Permanently online—Permanently connected: Explorations into university students’ use of social media and mobile smart devices. Computers in Human Behavior, 63, 694–703. Wohn, D. Y. & LaRose, R. (2014). Effects of loneliness and differential usage of Facebook on college adjustment of first-year students. Computers & Education, 76, 158–167. Wu, J. Y. (2015). University students’ motivated attention and use of regulation strategies on social media. Computers & Education, 89, 75–90. Yang, H. C. (2014). Young people’s friendship and love relationships and technology: New practices of intimacy and rethinking feminism. Asian Journal of Women’s Studies, 20(1), 93–124. Yang, T., Yu, L., Oliffe, J.  L., Jiang, S., & Si, Q. (2017). Regional contextual determinants of internet addiction among college students: A representative nationwide study of China. The European Journal of Public Health, 27(6), 1032–1037. Yao, M. Z., He, J., Ko, D. M., & Pang, K. (2014). The influence of personality, parental behaviors, and self-esteem on Internet addiction: A study of Chinese college students. Cyberpsychology, Behavior, and Social Networking, 17(2), 104–110.


Chapter 10

Boundary Management and Communication Technologies

Marta E. Cecchinato and Anna L. Cox

Introduction

The growing number of mobile communication technologies and computer-mediated communication (CMC) platforms has brought numerous benefits and enrichments to the way we work and socialize. However, these technologies also create the challenge of being always connected, which can be a source of stress (Barley, Meyerson, & Grodal, 2011). The extent to which, and the ways in which, digital technologies foster stress, especially in relation to work-home boundary management, has been of particular interest in occupational psychology and, to a lesser extent, in human-computer interaction. Understanding this relationship has important implications for improving workplace well-being: work-related stress is a major occupational health problem, costing over £5 billion a year in the United Kingdom alone (HSE, 2017). Most work on how workers are affected by this constant connection to platforms and devices belongs to the field of occupational psychology, where efforts have been directed towards understanding work-home boundary management practices and developing boundary theory (e.g., Ashforth, Kreiner, & Fugate, 2000; Kossek, Ruderman, Braddy, & Hannum, 2012; Rice, 2017). In contrast, work investigating the ways in which ubiquitous technology is changing how we work comes primarily from the field of human-computer interaction (HCI), where there is a large body of work on computer-mediated work and availability management (Cecchinato, Cox, & Bird, 2017; Mazmanian & Erickson, 2014). While the two fields complement each other nicely, there is still little research that focuses on how the two overlap. In this chapter, we present a critical review (Grant & Booth, 2009) of the literature from occupational psychology and HCI to create an up-to-date understanding of how communication technologies are affecting work-home boundary management.

The social construction of technology (SCOT) approach guides our review in explaining how users experience work-home boundaries and their use of technology as the result of interactions that define those experiences. Rather than relying on a false dichotomy in which technology is either good or bad, SCOT researchers have shown that there is a two-way relationship through which technology influences society and vice versa, otherwise known as "interpretive flexibility" (Klein & Kleinman, 2002). Kalman (2016, p. 9) explains that "[i]t is not the use of ICTs that blurs the boundaries between work and home, but rather the managers, colleagues or clients who expect work to be carried out at home (or family and friends who expect employees to divert attention to them during the workday)." Similarly, individuals co-construct, manage, and negotiate boundaries around their roles through social interactions. As Kreiner, Hollensbe, and Sheep (2009) point out, boundary theory offers an ideal lens for studying work-home boundaries within the social-constructionist approach.

Our review focuses on the role that communication technologies play in shaping boundary management. We emphasize how the relationship between individuals and technology can bring both enrichment and challenges, and in doing so we unearth strategies that can help individuals and organizations manage boundaries. We start by covering how boundary research has evolved over time and in response to changes in technology, and then explain how mobile devices have shifted work outside the office (and family issues inside the office) through a proliferation of devices and CMC platforms that can both support and challenge boundary management, negotiation, and availability. With particular focus on knowledge workers (Pyöriä, 2005) who have flexible working practices, we first review boundary theory to explain work-home conflict and enrichment. We then reflect on how these have changed as new communication technologies have been introduced. Finally, we conclude with an assessment of boundary management strategies that can be applied top-down and bottom-up to show how policy makers, practitioners, and individuals can support better boundary management. Understanding these aspects can help researchers guide future work on boundary management in the digital age and help practitioners know where to focus their efforts when designing interactions with communication technologies.

Terminology

As this chapter brings together research from different fields, we clarify key terms in our review, namely what we mean by "boundary management" and "communication technologies." We refer to boundary management as any practice that an individual puts in place when creating, negotiating, and maintaining boundaries around work and home.

Boundary theory literature does not agree on the terms used to describe the domains around the boundaries: as Allen (2013) points out, the terms work vs. home, work vs. family, and work vs. life are often used interchangeably to cover the variety of life roles. Here, we choose to use the umbrella expression work-home boundaries and to juxtapose work vs. personal to broadly differentiate between life roles and domains that are not necessarily confined to a specific time or space. For a deeper discussion of this terminology, see Moen (2011). From an HCI perspective, we use communication technologies as an umbrella term that encompasses both communication platforms (e.g., email, WhatsApp) and communication devices (e.g., smartphones, laptops). To simplify, the former refers to the software or applications, while the latter represents the hardware through which we communicate. In particular, rather than "channel," which can refer to either the device or the medium in communication research, we use the term "platform" to refer to the means by which one accesses content or communication.

Work-Home Boundaries

Over the past 20 years, popular media have reported the growing interest in "work-life balance," or the ideal equilibrium of well-being in all aspects of one's life (Kreiner et al., 2009), as the outcome of a more complex process of work-home boundary management. The idea of balance is rooted in balance theory, as first described by Fritz Heider (1946). When people perceive important aspects of their life as being part of a system, they are inclined to maintain a state of balance among these elements, often through "a juggling act," where "some balls (roles) are larger (more demanding), some weigh more than others" (Roche, 2015, p. 18). How we juggle all these roles depends on many factors, some of which can have a positive impact on work-life balance (e.g., job satisfaction, telework), while others can impact it negatively (e.g., work overload and job demands).

Boundary Theory

In general, boundaries are delimitations of an area, which can refer to a physical space (e.g., a country, a home) or a more abstract domain (e.g., a role). When applied to work and personal domains, boundaries have been classified as physical, temporal, or psychological (Clark, 2000). Physical boundaries, for example, can be the walls of an office or a dedicated desk in the home of a telecommuter. Temporal boundaries refer to strict schedules, like a nine-to-five job, and/or explicit transitions between working time and family time, such as using the commute to detach from one role and shift to another. Finally, psychological boundaries are the self-created rules that establish which behaviors and attitudes belong to which domain, as well as preferences for the balance among the domains.

Boundaries can be conceptualized along an integration-segmentation continuum (Ashforth et al., 2000). At one end of the continuum are individuals who tend to have work and home domains fully integrated, where "home" and "work" are "one giant category of social existence, for no conceptual boundary separates its contents or meaning" (Nippert-Eng, 1996, p. 567). At the other end are those for whom work and home are perceived as two completely separate worlds. Kossek, Ruderman, Braddy, and Hannum (2012) describe three main boundary styles that extend the integration-segmentation continuum paradigm: separators, volleyers, and integrators. While integrators and separators reflect behaviors of those at the two extremes of Ashforth's continuum, volleyers are people who rely on both strategies and switch between them depending on job structure and family situation (Kossek, Baltes, & Matthews, 2011).

On a daily basis, individuals can experience repeated shifts between the different roles in different domains, each having different responsibilities and resources (e.g., employee and parent). These shifts are known as "micro-role transitions" (Ashforth et al., 2000) and happen, for example, when a parent receives a phone call or email from their child's school while at work. Ashforth, Kreiner, and Fugate (2000) distinguish them from "macro-role transitions," which are less frequent and occur more generally within the same domain, from an old role to a new role that comes with new responsibilities and resources (e.g., moving from being a PhD student to becoming a faculty member).

The nature of the boundaries (physical, temporal, or psychological) and the degree to which they are permeable to cross-overs (or micro-role transitions) have been attributed to three factors: (1) identity centrality, (2) perceived sense of control (Kossek et al., 2012), and (3) the importance of work norms (Park, Fritz, & Jex, 2011). These, along with boundary strategies (which will be discussed at the end of the chapter), make up one's boundary management style (Kossek et al., 2012).

Identity centrality. Grounded in identity theory, identity or role centrality is an indication of the value that an individual puts on each of his or her roles and reflects the time and energy invested in a role. Identity centrality can be of four types: work, family, dual, or other (e.g., where priority is given to hobbies).

Perceived boundary control. This refers to a sense of control over how permeable boundaries are; it is a psychological interpretation rather than a personal trait. Perceived boundary control can be high or low. People with high boundary control feel they are in control of when, how often, and in which direction boundary crossings occur, based on their role demands and centrality. Conversely, people with lower boundary control perceive less agency around boundary spill-overs and are more likely to experience work-family conflict. Kossek et al. (2012) found that boundary control is negatively correlated with role conflict and stress, and suggested that regardless of one's preference for integration or segmentation, what makes the difference in boundary management satisfaction is a sense of boundary control.

Work norms. Because of its basis in social constructionism (i.e., the idea that boundaries are constructed in relation to others), an individual's integration-segmentation behavior has been found to be consistent with segmenting norms in their workplace (Park et al., 2011). That is to say, if a person experiences high segmentation in their organization, he or she will be more likely to adopt a more segmented boundary style, for example by not checking work emails outside of working hours. Similarly, there may be a certain expectation of how one might integrate or segment, sometimes accompanied by company policies or guidelines.
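To show how these constructs fit together, the following minimal Python sketch encodes a boundary management profile and maps it onto the three styles described above. It is purely illustrative: the attribute names, the 0-1 scales, and the cut-off values in classify_style are our own assumptions, not a measurement instrument proposed by Kossek et al. (2012) or Ashforth et al. (2000).

```python
from dataclasses import dataclass

@dataclass
class BoundaryProfile:
    """Illustrative profile combining the factors discussed above.

    integration: hypothetical 0-1 position on the integration-segmentation
        continuum (0 = fully segmented, 1 = fully integrated).
    identity_centrality: which role(s) the person values most.
    perceived_control: hypothetical 0-1 sense of control over boundary crossings.
    workplace_segmentation_norm: hypothetical 0-1 strength of segmenting norms
        in the person's organization.
    """
    integration: float
    identity_centrality: str          # e.g., "work", "family", "dual", "other"
    perceived_control: float
    workplace_segmentation_norm: float

def classify_style(profile: BoundaryProfile) -> str:
    """Map a continuum position onto the three broad styles (Kossek et al., 2012).

    The cut-off values are arbitrary illustrations, not validated thresholds.
    """
    if profile.integration >= 0.7:
        return "integrator"
    if profile.integration <= 0.3:
        return "separator"
    return "volleyer"  # switches between strategies depending on circumstances

# Example: a parent who mostly segments, in a workplace with strong segmenting norms
example = BoundaryProfile(integration=0.25, identity_centrality="dual",
                          perceived_control=0.8, workplace_segmentation_norm=0.9)
print(classify_style(example))  # -> "separator"
```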

Work-Home Conflict

Each role of an individual comes with its own expectations of time, attention, and resources. However, these many roles may often conflict with each other. "Work-life conflict occurs when the role demands in one domain interfere with meeting the demands of a role in another domain" (Olson-Buchanan & Boswell, 2006, p. 436). Such conflict has been linked to several undesirable outcomes, such as burnout, absenteeism, and stress (Amstad, Meier, Fasel, Elfering, & Semmer, 2011; Greenhaus & Powell, 2006; Kreiner et al., 2009). Just as different roles have different expectations, different environments like work and home also have strong (but often contrasting) expectations around rules, behaviors, and attitudes (Clark, 2000). The tensions, interactions, and management strategies created around the role/environment border remain an interesting and still underexplored area of investigation.

Researchers (Ashforth et al., 2000; Hall & Richter, 1988) have suggested that more integration of work and home can lead to negative consequences. For example, the permeability of an integrated role allows interruptions, which in turn lead to increased confusion as to what role to adopt at that moment. This implies that individuals with higher integration have more difficulty disengaging from different roles when in a specific domain, causing negative affect and less task enjoyment (Williams, Suls, Alliger, Learner, & Wan, 1991). This is especially true if we think about ubiquitous technology that, for example, allows work communication to interrupt family time on a Sunday evening, or, vice versa, personal emails to be sent to a work account while in the office. Those who sit on the integration end of the continuum might be more likely to respond to a work email received out of office hours, interrupting their personal life; and the opposite scenario is just as likely (Ashforth et al., 2000).

Interruptions can also challenge those who have more segmented boundaries and roles. As Olson-Buchanan and Boswell (2006) point out, those with segmented roles have a more negative reaction, feel more strained, and experience more inter-role conflict when an interruption occurs, compared to individuals with more integrated roles. Take the example of receiving a work email outside of working hours: for those who prefer to integrate, it can help them keep on top of work, while for those who prefer to segment work-home boundaries it can be a source of stress, because they find it harder to ignore the work message during non-work time (Cecchinato, Cox, & Bird, 2015b; Pielot, Church, & de Oliveira, 2014). This role-referencing can result in mental preoccupation with another role, leading to strain-based work-home conflict (Olson-Buchanan & Boswell, 2006). While it is important to understand that cross-role interruptions and spill-overs can occur for both integrators and separators, it is even more important to remember that these conflicts have a bi-directional nature, meaning work can interrupt non-work and non-work can equally disrupt work, depending on which role one chooses to engage in (Greenhaus & Powell, 2006; Kossek et al., 2012; Kreiner et al., 2009).

Work-Home Enrichment

Not all role-referencing and spill-overs have negative effects. Greenhaus and Powell (2006) propose a model of Work-Family Enrichment in which work and family are allies, and the enrichment comes from "the extent to which experiences in one role improve the quality of life in the other role" (Greenhaus & Powell, 2006, p. 73). The authors offer an extensive review of prior work measuring work-home enrichment and identify: (1) five resources that can promote work-family enrichment (skills and perspectives, psychological and physical resources, social-capital resources, flexibility, and material resources); (2) two mechanisms through which resources promote enrichment (performance and affect); and (3) several moderators that determine the conditions under which resources in one role enrich another role (salience of the role, perceived relevance of resources, and consistency of resources with norms and requirements). As with work-home conflict, work-home enrichment also has a bi-directional nature. One of the five resources, flexibility, is of particular relevance: flexibility is the ability to determine the location, timing, and pace with which role requirements are met. Communication technology enables us to achieve such flexibility, but it may also impose too much flexibility and too many obligations. In the next section we analyze how affordances and features of communication technologies are affecting work-home boundaries.
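Before moving on, a short sketch may help keep the parts of Greenhaus and Powell's (2006) model distinct. The Python below simply records the resources, mechanisms, and moderators listed above as data and enumerates resource-mechanism-direction combinations; the variable names and the full cross-product are our own illustrative shorthand, not part of the original model.

```python
# Components of the Work-Family Enrichment model (Greenhaus & Powell, 2006),
# encoded as plain data for reference. Names are our own shorthand.
RESOURCES = [
    "skills and perspectives",
    "psychological and physical resources",
    "social-capital resources",
    "flexibility",
    "material resources",
]

MECHANISMS = ["performance", "affect"]  # paths through which resources promote enrichment

MODERATORS = [
    "salience of the role",
    "perceived relevance of resources",
    "consistency of resources with norms and requirements",
]

def enrichment_paths():
    """Yield every (resource, mechanism, direction) combination, reflecting the
    model's bi-directional nature (work-to-home and home-to-work); whether a
    given path operates in practice depends on the moderators above."""
    for direction in ("work-to-home", "home-to-work"):
        for resource in RESOURCES:
            for mechanism in MECHANISMS:
                yield resource, mechanism, direction

print(sum(1 for _ in enrichment_paths()))  # 5 resources x 2 mechanisms x 2 directions = 20
```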

How Aspects of Communication Technologies Affect Work-Home Boundaries Digital technologies increase flexibility by enabling employees to access their work and their personal life anytime and in any place (Allen, 2013; Boswell & Olson-Buchanan, 2007). They do so by shifting where we can work, which in turn defines how multi-device ecologies are used; by shaping what work or personal role we convey through ecologies of communication platforms; and finally by challenging the expectations and awareness of one’s availability around work or personal domains.

Work Place and Space Shift Mobility as an affordance of communication technology (Axtell, Hislop, & Whittaker, 2008; Rice et al., 2017) can shape and determine how, for example, teleworkers do their job (Brown & O’Hara, 2003). There are several interpretations around what constitutes mobility, ranging from contrasting the static or mobile location of where a computer can be situated in physical space (e.g., a desktop PC can only be on a table, whereas a mobile phone can be carried in a pocket everywhere; Oulasvirta, Petit, Raento, & Sauli, 2007), to more abstract interpretations that refer to mobility as the ability to move across space and time through a mobile device for work or personal reasons (Cousins & Robey, 2015). Today’s workspace is distributed across multiple artefacts and locations, which gives rise to trends in device specialization, parallelism, and fragmentation (Santosa & Wigdor, 2013). This device specialization is not just limited to work spaces but also involves the home. Devices are used differently depending on where they are used and for what reason. Kawsar and Brush (2013), looking at how multiple devices were used in the home, identified spatial and temporal habits of common Internet activities. In terms of location, more personal activities (e.g., social networking) took place in private spaces (e.g., the bathroom) where interruptions are less acceptable and less likely to happen. More public and shared spaces (e.g., the kitchen) were instead used for work purposes as well as personal reasons. To make sense of mobility and what it means for work-home boundaries, it is useful to rely on Harrison and Dourish’s (1996) distinction between space and place, where the former is defined as a physical location and the latter prescribes behaviors for a specific space. More simply, spaces become places through the social interactions that happen in them. For workers with flexible working patterns, communication technology has made it more complicated to distinguish between different places. Thinking about Kawsar and Brush’s (2013) findings, the same space or locale (e.g., a kitchen) becomes populated with different places (e.g., an office space to work, but also an eating area for the family). Such places have temporal properties: “the same space can be different places at different times” (Harrison & Dourish, 1996, p. 7). What happens when those different times overlap or are not clearly defined? More than twenty years later, we question what happens when digital spaces and physical spaces are collocated, and particularly when they define incongruent work and personal places with overlapping temporal properties. Multi-device interaction has in fact created distributed workspaces, defined as “virtual areas spanning multiple devices across all physical working locations” (Santosa & Wigdor, 2013, p. 63). Kawsar and Brush’s insights are interesting and novel but not as discerning as they could have been had they, for example, drawn on Dourish’s (2006) distinction between space and place. For example, what happens when digital and physical places overlap and create work-home conflict? How can interactions with technology, especially for distributed workspaces, be better designed to avoid such conflicts?

Using Multi-device Ecologies around Work-Home Spaces Before mobile technologies were introduced in our everyday lives, boundaries between work and home were more defined. Today, 77% of Americans own a smartphone, 51% own a tablet (Pew Research, 2018), 5% own a smartwatch (The NPD Group, 2017), and these numbers are growing. Thus, understanding user interactions across multiple devices has become an active area of research, especially in more recent years. Bødker and Klokmose talk about all the devices “that a person owns, has access to, and uses” as “device ecologies” (Bødker & Klokmose, 2012, p. 448), and argue that these are constantly changing and adapting to the environment and the user. How combinations of devices are chosen and used for specific purposes needs to be understood, especially if this is different for work and for personal reasons, thus affecting boundary management. When BlackBerrys became widespread, work-related emails got pushed to recipients’ pockets, rather than being stored for later retrieval, providing an always-online experience and contributing to the addictive effect email can have on mobile devices (Mazmanian, Yates, & Orlikowski, 2006; Turel & Serenko, 2010). Once smartphones, like the iPhone, became popular, Dery, Kolb and MacCormick (2014) noticed that people used mobile phones mostly for personal use and associated BlackBerrys instead only with work. This meant that many users relied on two devices to keep boundaries separate between home and work, as Cousins and Robey also identified (2015). This is one strategy that people may adopt to disconnect from work outside the office. However, mobile devices (laptops, smartphones, and tablets) also constitute a bridge across work and personal boundaries, as Dearman and Pierce (2008) and Fleck, Cox and Robison (2015) found. Karlson, Meyers, Jacobs, Johns and Kane (2009) looked specifically at multi-device use and the impact on boundaries and working time. Their data logs and follow-up interviews showed that participants accessed work email outside working hours and relied on their phone whenever they did not have access to a PC. They found that people in their sample preferred to be constantly connected with work and life domains through their mobile phones and emails, as this connectedness gave participants a stronger sense of perceived control. These findings support Greenhaus and Powell’s (2006) idea of work-family enrichment, and the importance of perceived boundary control supported by Kossek et al. (2012). More recently, device ecologies have also started to include wearable technology, such as smartwatches. In contrast to mobile devices that can be placed in pockets and bags, wrist-worn devices are always in contact with their user and as such can be more discreet, allowing minimal interference between the user and the task; but at the same time they can also be more disruptive, as they are both “always on” in function as well as “always on the user.” Therefore, smart wrist-worn devices introduce the opportunity to explore new research areas of mobile user experience in relation to boundary management. So far, very little work has looked at this, with one exception. Cecchinato and colleagues (Cecchinato & Cox, 2017; Cecchinato, Cox, & Bird, 2015a; Cecchinato et al., 2017) analyzed how smartwatches are used within device ecologies and how they impact boundary management. We found that smartwatches are used strategically to better manage notifications and filter important messages to the wrist, as well as to help individuals manage their availability to others, by leveraging the limited functionalities and the material properties of the watch. For example, users would rely on the act of taking off the watch at home or at the end of the day as a ritual to help them disconnect from their work day. The next section moves from devices to platforms to analyze how they impact work-home boundaries, particularly when it comes to portraying ourselves and our availability through CMC.

Using Multi-platform Ecologies for Work and Personal Roles The typical individual enacts several roles throughout the day, such as parent, colleague, friend, employee, etc., none of which exists in a vacuum. How we choose to use CMC platforms says something about how we decide to portray ourselves to others and could help inform how technology helps co-construct and negotiate work-home boundaries (Diaz, Chiaburu, Zimmerman, & Boswell, 2012). Goffman’s (1959) dramaturgical approach offers a lens to understand how users might decide to portray themselves. Goffman builds on the idea that most behaviors are bounded in space and time, and guided by specific norms belonging to the context. Farnham and Churchill apply this to the digital and physical self and call this “faceted identity,” where “different aspects of identity are performed depending on context, and expect that identity faceting will vary depending on the individual” (2011, p. 2). Similarly, when discussing what constitutes “work” and “home,” Nippert-Eng described how self and identity are negotiated around time and space: we each make “some sort of distinction between who we are when we are ‘at work’ and ‘at home’. This distinction may be quite remarkable for some (currently segmenting) people, hardly noticeable for other (extremely integrating) ones. However different our home and work selves are, though, boundary work supports these variations in who we are” (Nippert-Eng, 1996, p. 569). Market predictors have seen, and expect, strong growth in the use of mobile devices for any form of communication. Use of email on phones used to be an exception, done primarily when fully-featured computers were not available, but as work becomes more flexible, its use “on-the-go” is more popular and accepted: approximately 50% of email users access it on their mobile devices (Radicati Group, 2015b; Specht, 2018). Instant messaging accounts, which today number over 3.2 billion, are expected to grow at a 4% rate until 2019, particularly for business use compared to personal use (Radicati Group, 2015a). CMC platforms can be used for work purposes, personal reasons, or both, and the reason may be influenced by the device they are accessed on. The ways in which people use communication technologies are rapidly evolving (Dery et al., 2014), and communication in the workplace has become particularly challenging compared to personal communication because it is becoming “more acceptable to have informal, non-informative, and non-work related (e.g., personal) conversations via an instant message service or with mobile devices in the workplace” (Cho, Ramgolam, Schaefer, & Sandlin, 2011, p. 40; Fortunati, 2002). In fact, while work communication is fragmented across different devices, it is now also distributed across a growing number of platforms, which go beyond just email and include, for example, instant messaging (e.g., Skype, Slack, WhatsApp) and social media (e.g., Facebook and Workplace by Facebook). As a result, some argue that we live in a world of communication overload (Cho et al., 2011), where the rate and quantity of messages sent and received over a growing number of devices and platforms can make it harder for individuals to process them. Despite the fragmentation of platforms, previous work has identified a trend for strong curation of communication around different platforms, in order to keep work and personal exchanges separate. This behavior can be associated with a desire to better manage work and personal boundaries, as well as to improve retrieval of information (Cecchinato, Sellen, Shokouhi, & Smyth, 2016). Recently, Nouwens, Griggio, and Mackay (2017) suggested that users may create idiosyncratic communication “places” within the “space” of the same app (using Harrison and Dourish’s (1996) definitions), adjusting rules based on the person they are communicating with. In other words, each platform (i.e., space) is associated with different rules (i.e., making it a particular place) and these rules are personal to the users, rather than inherent in the communication app. For example, the authors report the case of a participant who is friends with a colleague, so whenever he wants to contact the colleague for non-work purposes he will use Facebook Messenger, but he would not use WhatsApp because he sees it as too personal a platform. Other researchers have specifically compared different communication platforms such as Facebook vs. Gmail (Shen, Brdiczka, & Ruan, 2013) or WhatsApp vs. SMS (Church & de Oliveira, 2013) and found similar results. When the rules for a particular platform are not respected by others, work-home boundary conflict can occur. Cecchinato, Cox and Bird (2015b) report the case of a participant whose friends and family would email her on her work account when she is in the office instead of using what she would consider a personal platform (e.g., her personal account) because they know she is more likely to see the message in a timely manner. These examples emphasize how the context of a communication platform, the personal preference for its use, as well as how it is used in relation to others, can affect work-home boundary management. Given this fragmentation of platforms used for work and personal communications and the risks that might arise for work-home boundary management, users have developed or socially constructed new habits across devices and channels, and it is through these new habits that work-home boundary management can be further challenged or enriched. For example, Matthews, Pierce, and Tang (2009) found that users preferred to use their phone to triage messages in the inbox because they could easily swipe to delete or archive emails, while fully featured computers were used for reading and replying to emails, especially work ones. They also observed that smartphones were used to maintain awareness of information while away from a computer, e.g., by checking emails from remote collaborators.
Other researchers have found that these checking or monitoring activities happen primarily outside of working hours, such as early morning or evening, or at weekends (Kawsar & Brush, 2013), reinforcing the notion that smartphones have the ability to blur work-home boundaries, depending on how available one decides to be.

Expectations of Work and Personal Availability The use of communication technologies can have a positive effect, increasing work satisfaction (Diaz et al., 2012) and empowering users to work where and when they feel it is best, for example shifting an activity to a “dead time” to relieve the pressure of availability (Bittman, Brown, & Wajcman, 2009). However, it also facilitates the blurring of boundaries (Boswell & Olson-Buchanan, 2007), increasing vulnerability to work-home conflict. As a result, it is more difficult for employees to distance themselves from work during non-working time (Park et al., 2011); that is, work-family boundary management tends to be asymmetric (Rice, 2017). It is worth noting that unlike personal life, which is not necessarily bound in time, work life is generally confined within certain hours, even if these are flexible and fragmented throughout the day. As a result, the challenges of constant availability for work can have worse implications compared to personal life, and indeed have been associated with stress and burnout (e.g., Amstad et al., 2011; Kossek et al., 2012). However, while this phenomenon is more salient in the work context, it also applies to personal life, whereby friends and family still expect timely responses (O’Hara, Massimi, Harper, Rubens, & Morris, 2014), causing role conflict in the individual, who has to complete micro-role transitions more frequently. In this section we will first analyse the challenges created by expectations of availability, before we move on to how curating others’ awareness of one’s availability and unavailability can help regain control over work-home boundary cross-overs. Managing availability after working hours can be so challenging that some refer to it as “the new night shift,” where employees log back into work platforms (or never log out) to respond to messages (Boswell, Olson-Buchanan, Butts, & Becker, 2016). When this constant connectivity is not motivated by the individual’s gains, it is generally the result of social expectations and work pressures (Barley et al., 2011). As a result, an individual may feel expected to be more attentive and responsive to incoming messages. Motivated by the desire to understand temporal patterns of responses in asynchronous CMC, Kalman and Rafaeli (2005) analysed chronemics (i.e., the role of time in communication) in three existing datasets of communication exchanges and found that people either reply relatively quickly or they do not reply at all. Although digital communication has the benefit of being asynchronous, people feel the need to reply quickly or be apologetic if their answer is delayed (Mazmanian, Orlikowski, & Yates, 2005; Mazmanian et al., 2006). That is because quick responses give non-verbal cues of immediacy and presence, i.e., being constantly available (Kalman, Ravid, Raban, & Rafaeli, 2006). If we are not constantly available, we feel we need to justify ourselves. In addition, aggravating this problem, some companies are even selling their employees’ rapid and sometimes constant (“24/7”) availability as part of the company’s services (Mazmanian & Erickson, 2014). Given these more or less explicit expectations of quick replies at any time, users are often expected to pay attention to their devices and any incoming notifications. Dingler and Pielot (2015) quantified attentiveness towards mobile messaging, analysing logs of mobile messaging notifications and user attentiveness for 42 participants over the course of two weeks. They found that people are attentive to messages for approximately 12.1 hours of the day, with higher peaks during weekdays and evenings. Taking this back to work-home boundary management, the pressures of having to constantly pay attention to work and/or personal communication, even when not currently embodying that role, can become overwhelming (Barley et al., 2011; Kossek et al., 2012; Mazmanian et al., 2006).
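As a rough illustration of what such log-based analyses involve, the short Python sketch below computes reply latencies and the share of quickly answered messages from a message log. It is our simplified example of a chronemic analysis, not the procedure used in the studies cited above; the log structure and timestamps are invented for illustration.

from datetime import datetime, timedelta

# Hypothetical message log: (received_at, replied_at or None if never answered).
log = [
    (datetime(2018, 2, 1, 9, 5), datetime(2018, 2, 1, 9, 7)),
    (datetime(2018, 2, 1, 13, 40), datetime(2018, 2, 1, 18, 2)),
    (datetime(2018, 2, 1, 21, 15), None),
]

# Reply latencies for answered messages only, sorted for a simple median.
latencies = sorted(replied - received for received, replied in log if replied)
answered_within_hour = sum(1 for d in latencies if d <= timedelta(hours=1))

print(f"{len(latencies)} of {len(log)} messages were answered")
print(f"{answered_within_hour} were answered within an hour")
print("median latency:", latencies[len(latencies) // 2])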

Awareness of Work and Personal Availability If availability is something that belongs to oneself, the awareness of that availability is instead obtained by those we interact with. Understanding how to manage the two sides of this coin is crucial when taking a social-constructionist approach to boundary management. Awareness of one’s availability can be gained in a number of ways: by explicitly asking/being told, by assumption, or by taking notice of the other person’s habits. Of particular relevance, communication platforms include specific features that are used to infer one’s availability and attentiveness to messages: these are referred to as awareness cues (Oulasvirta et al., 2007; Rice et al., 2017) and include, for example, read receipts, notifications, and online statuses that allow users to infer other people’s activity (O’Hara et al., 2014). Understanding how people make use of awareness cues is important for work-home boundary management, because it can help identify where conflicts might arise and how to reduce them. Knowing when and how to communicate availability or unavailability can be a useful strategy to help shift between work and personal roles. The first research to provide an in-depth analysis of the issues around awareness cues in mobile devices was conducted by Oulasvirta et al. (2007), who found that participants were able to infer someone’s activity (e.g., sleeping), someone’s potential availability to engage in some sort of communication (e.g., based on when they were last online), or even social situation (e.g., if two people were in the same location). A substantial body of work has looked at how these cues are used particularly in the work context to infer response times and one’s availability (e.g., Avrahami, Fussell, & Hudson, 2008; Birnholtz, Bi, & Fussell, 2012; Dourish & Bellotti, 1992). Since the initial work was published in 2007, the inclusion of awareness cues in instant messaging tools has become widespread. WhatsApp in particular allows users to be notified when a message is sent and delivered, with the use of two separate ticks next to each message. In addition, it displays the last time a user was online, a feature that can be disabled. Since that study was published, WhatsApp has added an additional feature: a change in color (from grey to blue) in the two ticks to notify when a message has been read. In their study of 20 WhatsApp users, O’Hara et al. (2014) uncovered “doings,” i.e., ways of engaging with relationships through IM-like applications. For example, the authors talk about “plausible deniability” and “plausible accounting” when discussing awareness features (i.e., “last seen online” and receipt ticks). They claim that these awareness features are not necessarily perceived as a precursor of interaction and communication—as Nardi, Whittaker and Bradner (2000) argue when discussing the use of IM in the office—but are instead messages per se, which they define as an “encounter of knowing” (i.e., the user gains insight about the interlocutor without having to communicate with him or her) as opposed to an “encounter of communication.” These awareness features add temporal properties to a communication, which need to be interpreted based on the interlocutor’s habits (e.g., how quickly they are likely to reply). O’Hara and colleagues (2014) explain how, when the communication happens between friends or family, these temporal patterns can be easily explained, but issues of social pressure to respond arise with particular, less intimate, relationships, such as with acquaintances or work colleagues. In these circumstances, knowing that someone has received and read a message can lead to an expectation that a reply will be sent immediately. As a consequence, user behavior has evolved as people have become more aware of how their behavior can trigger these cues to be sent to others. Users therefore adopt strategies to avoid triggering such cues. For example, we found that one of the ways in which smartwatches help people to manage their availability, and therefore their work-home boundaries, is that they enable users to read the text of any incoming message without any awareness cues being sent (Cecchinato & Cox, 2017; Cecchinato et al., 2017). Thus, curating others’ awareness of one’s availability can help regain control over boundary cross-overs. Awareness cues are also used by receivers of a message as a way of communicating unavailability, and therefore protecting their personal (or work) time. For example, Birnholtz and colleagues (Birnholtz, Guillory, Hancock, & Bazarova, 2010; Birnholtz, Hancock, Smith, & Reynolds, 2012), and Patterson et al. (2008) report strategies to avoid being constantly connected and to create boundaries between devices and work and personal roles, for example by marking oneself as “away” or “invisible” on messaging platforms, despite being at their computer. Birnholtz et al. (2010) call these “butler lies,” but focused in particular on explicitly verbalized lies or linguistic solutions used by teenagers to overcome technology design limitations (e.g., saying “sorry I just saw your text” when actually it was seen straight away). The authors highlight the importance of being able to manage and coordinate one’s unavailability, especially in our always-connected society. These “lies provide a useful window into the broader sociotechnical problem of unavailability and inattention management” (Birnholtz, Hancock et al., 2012, p. 35). Unfortunately, technology can give away the truth without the user necessarily realizing it. For example, automatic read receipts can uncover whether someone has really just read a message or indeed had delayed a reply and verbally lied about it. Ultimately, this emphasizes how managing work-home boundaries through communication technologies requires a multi-pronged effort from individuals and those they interact with (based on what strategies they use and how these are interpreted), organizations (what guidelines and training they put in place) and interaction designers (how they design technology to support users’ boundary preferences).
We discuss these aspects in the following final section.

Managing Boundaries in the Digital Age We started this chapter by looking at how work-home boundaries can be challenged or crossed in either direction and how this can result in either conflicts or enrichment. We then moved on to explore how mobile technology, especially when used within multi-device ecologies, can challenge boundary management, before discussing how communication technologies are used and adopted to manage one’s availability and work-home boundaries. Together, this paints a picture of all the complex work required to create, maintain and manage these boundaries, for which support and guidance are often lacking. Communication and mobile technologies have made it easier to stay connected and thus facilitate an integration between work and personal life. However, Kossek, Lautsch and Eaton (2006) found that segmentation is a strong predictor of well-being, consistent with Ashforth et al. (2000), and Hall and Richter (1988), who point out that integration can lead to negative consequences. While creating a sense of detachment from work can help recovery from work stress (Park et al., 2011), segmenting can also be more demanding from a psychological point of view (Ashforth et al., 2000): it is not always as easy to stop thinking or worrying about a personal matter or a work-related issue as it is to disable notifications. Building on initial work looking at physical boundary artefacts (Nippert-Eng, 1996), more attention is being given to the role technology plays in boundary management. Most of these efforts fall under top-down policies and guidelines (e.g., Kossek et al., 2011), but more recently researchers have started to uncover bottom-up strategies that individuals can adopt (e.g., Cousins & Robey, 2015).

Top-down Boundary Strategies One of the ways companies can influence people’s boundary strategies is through their own policies. In the past few years, several policies and government precautions have been put in place in an attempt to help workers better manage work-home boundaries (Cecchinato, Fleck, Brid, & Cox, 2015). These build upon family-friendly programs (e.g., shared parental leave) and reflect an acknowledgement on the institutions’ side of the value of personal life, helping to lessen the effects of role conflict. Additionally, companies that pay for employees’ devices or have Bring Your Own Device (BYOD) policies are implicitly (or even explicitly in some cases) suggesting who is in control of boundary permeation (Grevet, 2014). In the first case (buying devices for employees), an employee may feel he or she is expected to be available around the clock; in the second case (adopting BYOD policies), workers might feel legitimized to take personal communications while at work (Fleck et al., 2015; Grevet, 2014). More recently, Boswell and collaborators have proposed a series of recommendations for organizations that want to help their employees manage after-hours work communications (Boswell et al., 2016).

A variety of organizations have adopted policies with the aim of supporting work-home boundary segmentation. For example, in April 2014 officials from the Swedish city of Gothenburg launched a trial policy by adopting six-hour working days, expecting the mental and physical state of their employees to improve and their productivity to increase (Crew, 2015; Gee, 2014). The experiment lasted two years and ended in early 2017: the positive results of employees feeling healthier and more productive, however, were met with some scalability concerns by the government (Alderman, 2017). Other European countries have considered similar measures. For example, Germany’s labor minister has been considering an “anti-stress” law as a measure to reduce mental health issues connected to the constantly available paradigm (i.e., checking emails after working hours) and commissioned an investigation to determine binding thresholds (Stuart, 2014). More recently, the French government introduced a law on the “right to disconnect” at the beginning of 2017, whereby employers should negotiate with employees how to reduce work intruding in their personal life, sanctioning companies that fail to clearly state what is expected of employees out of hours (Agence France-Presse, 2016). All these examples assume a one-size-fits-all solution. However, how one manages work-home boundaries depends on several factors. To this end, we compared different professional groups within the same university and found that how email is managed across accounts and devices varies greatly based on personal preference, but also on professional differences between staff in different roles (Cecchinato, Cox, et al., 2015b). As mentioned previously, each role comes with certain expectations and resources, and rather than suggesting that all employees should stop checking emails after a certain hour, researchers have suggested offering training for employees to manage resources and expectations more consciously and effectively (Jahn, Klesel, Lemmer, & Weigel, 2016).

Bottom-up Boundary Strategies Depending on one’s boundary preference for integration or segmentation, different boundary strategies may be adopted. However, individualized strategies are crafted in a dynamic and flexible way (Sturges, 2012), making it hard to know which ones to adopt. As emphasized by Chen and Karahanna (2014, p. 31), “given that cross-domain technology-mediated interruptions are unavoidable for today’s knowledge workers, a concerted effort is needed by technology designers, organizations, and knowledge workers to provide tools and techniques to alleviate negative effects.” Some researchers have started to at least identify types of boundary strategies and provide some actionable knowledge for individuals. Christena Nippert-Eng (1996) identified interesting behaviors and artefacts used for managing boundaries, like having separate calendars or key chains for work and personal reasons. Kreiner et al. (2009) identified boundary work tactics pertinent to behavioral, temporal, physical, and communicative aspects. Of particular interest are the communicative tactics, identified as “setting expectations” and “confronting violations.” Despite this work being published in 2009, when mobile technology was already mainstream, there is very little mention of the role technology plays in creating and using these boundary tactics. The authors labelled a type of behavioral tactic as “leveraging on technology” but did not provide detailed examples of how their participants actually leveraged technology, other than relying on caller ID and voicemail. Olson-Buchanan and Boswell (2006) discuss how technology can be used to set appropriate boundaries. They found that when fewer boundaries around the use of communication technology during non-work time are set, more work interference with non-work occurs compared to when boundaries are put in place. Golden and Geisler (2007) were among the first to study the use of a device as a boundary management strategy. They interviewed 42 users about their use of a PDA (Personal Digital Assistant) and found that participants used their devices to support their boundary style preference—whether integrating, segregating, or transcending boundaries between work and home. More recently, Cousins and Robey (2015) identified a series of tactics that can be put in place to manage psychological boundaries, including (1) designating certain rules for technology (e.g., having one phone for personal use and one for work use), (2) setting permeating rules (e.g., logging out of IM platforms when switching domain), or (3) creating connection/disconnection rules (e.g., turning off devices after a certain hour). While the strategies presented in the previous paragraph offer a classification of boundary management behaviors, they do not provide actionable strategies that other users can pick up and use. To this end, Köffer, Anlauf, Ortbach and Niehaves (2015, p. 1) identified three strategies for boundary integration, and three for segmentation. These primarily refer to the use of company devices for only work or for both work and personal reasons, and similarly the use of personal devices just for personal use or also for work purposes. The authors emphasized the number of issues that users still encountered in fulfilling their boundary preferences, and in particular how those who tended to integrate work and personal life also included users who would prefer to segment the two domains but were not successful because they were unable to manage their technology. Other actionable strategies come from Jahn et al. (2016), who classified IT-related tactics based on how these tactics are put in place using technology: they can be automated (e.g., allowing automatic push notifications) or implemented manually (e.g., pulling information as a result of disabled notifications). Finally, Cecchinato and colleagues found that communication technology can be used to create microboundaries, i.e., strategies “to limit the impact of micro-role transitions caused by cross-domain technology mediated interruptions” (Cecchinato, Cox et al., 2015b, p. 3997). These strategies can be used to set social microboundaries (e.g., disabling notifications when out for dinner); temporal microboundaries (e.g., setting restrictions on when certain apps or websites can be accessed); digital microboundaries (e.g., using separate applications to check work and personal emails); and physical microboundaries (e.g., taking off a smartwatch as a symbol of disconnecting from work; Cecchinato et al., 2017).
Ultimately, microboundaries can be used by interaction designers and individuals as a way to introduce designed friction when interacting with technology. These frictions could be as simple as a pop-up notification reminding a user of their intention not to check work emails at certain times or in certain locations. Rather than encouraging seamless interactions, we have proposed that interactions should introduce small hurdles (or designed frictions) that can help users stop and reflect on what they are doing. In turn, this can help foster more mindful interactions with technology (Cox, Gould, Cecchinato, Iacovides, & Renfree, 2016). As technologies become more ubiquitous, we call for more work to explore how technology should be designed to support individuals’ boundary management practices.
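To make the idea of a temporal microboundary concrete, the sketch below checks whether a work app is being opened outside self-defined working hours and, if so, triggers a reflective prompt. This is only an illustration under assumed preferences: the app names, hours, and the needs_friction function are hypothetical, not a published implementation.

from datetime import datetime

# Assumed (hypothetical) user preferences: working hours and work-related apps.
WORK_HOURS = range(9, 18)          # 09:00-17:59
WORK_APPS = {"work_mail", "slack"}

def needs_friction(app: str, now: datetime) -> bool:
    """Return True if opening `app` outside working hours should trigger a prompt."""
    return app in WORK_APPS and now.hour not in WORK_HOURS

if needs_friction("work_mail", datetime(2018, 2, 4, 21, 30)):
    print("You planned not to check work email tonight. Open it anyway? [y/N]")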

Conclusion Communication technologies have increased the ease, frequency, and number of boundary transitions that can occur on a daily basis between one’s several life roles. This can be problematic because disconnection from work is important for recovery from work-related stress. Similarly, to ensure focus and productivity, it is important to ensure some separation from personal matters while at work. Given the large number of people suffering from work-related stress, it is crucial to understand how individuals, policy-makers, and practitioners can help support better boundary management practices and thus better recovery. In this chapter, we have reviewed a large body of research, pointing out new trends that bring together two fields—occupational psychology and HCI—by combining literature on boundary theory, multi-device implications, and computer-mediated communication use. The two fields offer complementary views on the use of technology and its impact on our daily lives. By taking a social-constructionist view of technology, and particularly relying on one of the four concepts of SCOT—interpretive flexibility—we have emphasized how communication technologies can both support as well as challenge home-work boundary management. This approach has allowed us to identify strategies that individuals and organizations can rely on when socially constructing the boundaries between work and home domains.

References Agence France—Presse. (2016). French workers win legal right to avoid checking work email out-of-hours. The Guardian. Retrieved February 18, 2018 from https://www.theguardian. com/money/2016/dec/31/french-workers-win-legal-right-to-avoid-checking-workemail-out-of-hours Alderman, L. (2017). In Sweden, happiness in a shorter workday can’t overcome the cost. The New York Times. Retrieved February 18, 2018 from https://www.nytimes.com/2017/01/06/business/ sweden-work-employment-productivity-happiness.html?mtrref=query.nytimes.com Allen, T. D. (2013). The work–family role interface: A synthesis of the research from industrial and organizational psychology. In I.  B.  Weiner (Ed.), Handbook of psychology (2nd ed.) (pp. 698–718). Hoboken, NJ: Wiley & Sons, Inc.

316   Marta E. Cecchinato and Anna L. Cox Amstad, F. T., Meier, L. L., Fasel, U., Elfering, A., & Semmer, N. K. (2011). A meta-analysis of work–family conflict and various outcomes with a special emphasis on cross-domain versus matching-domain relations. Journal of Occupational Health Psychology, 16(2), 151–169. Ashforth, B., Kreiner, G. E., & Fugate, M. (2000). All in a day’s work: boundaries and microrole transitions. Academy of Management Review, 25(3), 472–491. Avrahami, D., Fussell, S. R., & Hudson, S. E. (2008). IM waiting: Timing and responsiveness in semi-synchronous communication. In Proceedings of the 2008 ACM conference on computer supported cooperative work (pp. 285–294). Florence: ACM. Axtell, C., Hislop, D., & Whittaker, S. (2008). Mobile technologies in mobile spaces: Findings from the context of train travel. International Journal of Human-Computer Studies, 66(12), 902–915. https://doi.org/10.1016/j.ijhcs.2008.07.001 Barley, S. R., Meyerson, D. E., & Grodal, S. (2011). E-mail as a source and symbol of stress. Organization Science, 22(4), 887–906. https://doi.org/10.1287/orsc.1100.0573 Birnholtz, J., Bi, N., & Fussell, S. (2012). Do you see that I see? Effects of perceived visibility on awareness checking behavior. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 1765–1774). Austin, TX: ACM. Birnholtz, J., Guillory, J., Hancock, J., & Bazarova, N. (2010). “On my way ”: Deceptive texting and interpersonal awareness narratives. In Proceedings of the 2010 ACM conference on computer supported cooperative work (pp. 1–4). Savannah, GA: ACM. Birnholtz, J., Hancock, J., Smith, M., & Reynolds, L. (2012). Understanding unavailability in a world of constant connection. Interactions, 19(5), 32. Bittman, M., Brown, J. E., & Wajcman, J. (2009). The mobile phone, perpetual contact and time pressure. Work, Employment and Society, 23(4), 673–691. Bødker, S. & Klokmose, C. N. (2012). Dynamics in artifact ecologies. In Proceedings of the 7th Nordic conference on human-computer interaction: Making sense through design (pp. 448–457). Copenhagen: ACM. Boswell, W.  R. & Olson-Buchanan, J.  B. (2007). The use of communication technologies after hours: The role of work attitudes and work-life conflict. Journal of Management, 33(4), 592–610. Boswell, W. R., Olson-Buchanan, J. B., Butts, M. M., & Becker, W. J. (2016). Managing “after hours” electronic work communication. Organizational Dynamics, 45(4), 291–297. Brown, B. & O’Hara, K. (2003). Place as a practical concern of mobile workers. Environment and Planning A, 35(9), 1565–1587. Cecchinato, M. E. & Cox, A. L. (2017). Smartwatches: Digital handcuffs or magic bracelets? Computer, 50(4), 106–109. Cecchinato, M. E., Cox, A. L., & Bird, J. (2015a). Smartwatches: The good, the bad and the ugly? In Proceedings of the 33rd annual ACM conference extended abstracts on human factors in computing systems (pp. 2133–2138). Seoul: ACM. Cecchinato, M.  E., Cox, A.  L., & Bird, J. (2015b). Working 9–5? Professional differences in email and boundary management practices. In Proceedings of the SIGCHI conference on human factors in computing system (pp. 3989–3998). Seoul: ACM. Cecchinato, M. E., Cox, A. L., & Bird, J. (2017). Always on(line)? User experience of smartwatches and their role within multi-device ecologies. In Proceedings of the 2017 CHI conference on human factors in computing systems (pp. 3557–3568). Denver, CO: ACM. Cecchinato, M. E., Fleck, R., Brid, J., & Cox, A. L. (2015). Online vs. 
Offline: Implications for Work Identity. In CHI 2015 workshop: Between the lines: Reevaluating the online/offline binary (pp. 1–6). Seoul: ACM.

Boundary Management   317 Cecchinato, M. E., Sellen, A., Shokouhi, M., & Smyth, G. (2016). Finding email in a multiaccount, multi-device world. In Proceedings of the ACM conference on human factors in computing systems (pp. 1200–1210). San Jose, CA: ACM. Chen, A. & Karahanna, E. (2014). Boundaryless technology: Understanding the effects of technology-mediated interruptions across the boundaries between work and personal life. AIS Transactions on Human-Computer Interaction, 6(2), 16–36. Cho, J., Ramgolam, D. I., Schaefer, K. M., & Sandlin, A. N. (2011). The rate and delay in overload: An investigation of communication overload and channel synchronicity on identification and job satisfaction. Journal of Applied Communication Research, 39(1), 38–54. Church, K. & de Oliveira, R. (2013). What’s up with WhatsApp? Comparing mobile instant messaging behaviors with traditional SMS. 15th international conference on humancomputer interaction with mobile devices and services (pp. 352–361). Munich: ACM. Clark, S. C. (2000). Work/family border theory: A new theory of work/family balance. Human Relations, 53(6), 747–770. Cousins, K. C. & Robey, D. (2015). Managing work-life boundaries with mobile technologies. Information Technology & People, 28(1), 34–71. Cox, A. L., Gould, S., Cecchinato, M. E., Iacovides, I., & Renfree, I. (2016). Design frictions for mindful interactions: The case for microboundaries. In CHI extended abstracts on human factors in computing systems (pp. 1389–1397). San Jose, CA: ACM. Crew, B. (2015). Sweden is shifting to a 6-hour work day. Retrieved February 18, 2018 from https://www.sciencealert.com/sweden-is-shifting-to-a-6-hour-workday Dearman, D. & Pierce, J.  S. (2008). It’s on my other Computer! Computing with multiple devices. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 767–776). Florence: ACM. Dery, K., Kolb, D., & MacCormick, J. (2014). Working with connective flow: How smartphone use is evolving in practice. European Journal of Information Systems, 23(5), 558–570. Diaz, I., Chiaburu, D. S., Zimmerman, R. D., & Boswell, W. R. (2012). Communication technology: Pros and cons of constant connection to work. Journal of Vocational Behavior, 80(2), 500–508. Dingler, T. & Pielot, M. (2015). I’ll be there for you: Quantifying attentiveness towards mobile messaging. In Proceedings of the international conference on human-computer interaction with mobile devices and services (pp. 1–5). Copenhagen: ACM. Dourish, P. (2006). Re-Space-ing place: “Place” and “Space” ten years on. In Proceedings of the 2006 20th anniversary conference on computer supported cooperative work (pp. 299–308). Banff, AB: ACM. Dourish, P. & Bellotti, V. (1992). Awareness and coordination in shared workspaces. In Proceedings of the ACM conference on computer-supported cooperative work (pp. 107–114). Toronto: ACM. Farnham, S. D. & Churchill, E. F. (2011). Faceted identity, faceted lives: Social and technical issues with being yourself online. In Proceedings of the ACM conference on computersupported cooperative work (pp. 359–368). Hangzhou, China: ACM. Fleck, R., Cox, A. L., & Robison, R. A. V. (2015). Balancing boundaries: Using multiple devices to manage work-life balance. In Proceedings of the ACM conference on human factors in computing systems (pp. 3985–3988). Seoul: ACM. Fortunati, L. (2002). The mobile phone: Towards new categories and social relations. Information, Communication & Society, 5(4), 513–528.

318   Marta E. Cecchinato and Anna L. Cox Gee, O. (2014). Swedes to give six-hour workday a go. The Local. Retrieved February 18, 2018 from https://www.thelocal.se/20140408/swedish-workers-to-test-six-hour-work-days Goffman, E. (1959). The presentation of self in everyday life. London: Penguin. Golden, A. G. & Geisler, C. (2007). Work-life boundary management and the personal digital assistant. Human Relations, 60(3), 519–551. Grant, M. J. & Booth, A. (2009). A typology of reviews: An analysis of 14 review types and associated methodologies. Health Information and Libraries Journal, 26(2), 91–108. Greenhaus, J. & Powell, G. N. (2006). When work and family are allies: A theory of workfamily enrichment. Academy of Management Review, 31(1), 77–92. Grevet, C. (2014). Managing the work-home boundary in a BYOD (Bring Your Own Device) culture. In MobileHCI’14 workshop on socio-technical systems and work-home boundaries (pp. 1–4). Toronto: ACM. Hall, D. T. & Richter, J. (1988). Balancing work life and home life: What can organizations do to help? Academy of Management Executive, 2(3), 213–223. Harrison, S. & Dourish, P. (1996). Re-place-ing space. In Proceedings of the ACM conference on computer supported cooperative work (pp. 67–76). Boston: ACM. Heider, F. (1946). Attitudes and cognitive organization. The Journal of Psychology, 21(1), 107–112. https://doi.org/10.1080/00223980.1946.9917275 HSE (2017). Tackling work-related stress using the Management Standards approach (Workbook). Health and Safety Executive. Retrieved February 18, 2018 from http://www. hse.gov.uk/pubns/wbk01.htm Jahn, K., Klesel, M., Lemmer, K., & Weigel, A. (2016). Individual boundary management: An empirical investigation on technology-related tactics. In PACIS Proceedings (pp. 268–280). Taiwan: AIS. Kalman, Y.  M. (2016). Why do we blame information for our overload? In D.  I.  Ballard & M. S. McGlone (Eds.), Work pressures: New agendas in communication (pp. 45–62). New York: Routledge. Kalman, Y.  M. & Rafaeli, S. (2005). Email chronemics : Unobtrusive profiling of response times. In Proceedings of the 38th annual Hawaii international conference on system sciences (pp. 1–10). Big Island, HI: IEEE. Kalman, Y. M., Ravid, G., Raban, D. R., & Rafaeli, S. (2006). Pauses and response latencies: A chronemic analysis of asynchronous CMC. Journal of Computer-Mediated Communication, 12(1), 1–23. Karlson, A.  K., Meyers, B.  R., Jacobs, A., Johns, P., & Kane, S. (2009). Working overtime: Patterns of smartphone and PC usage in the day of an information worker. In International conference on pervasive computing (pp. 398–405). Orlando, FL: ACM. Kawsar, F. & Brush, A. (2013). Home computing unplugged: Why, where and when people use different connected devices at home. In Proceedings of the 2013 ACM international joint conference on pervasive and ubiquitous computing (pp. 627–636). Zurich: ACM. Klein, H. K. & Kleinman, D. L. (2002). The social construction of technology: Structural considerations. Science, Technology, & Human Values, 27(1), 28–52. Köffer, S., Anlauf, L., Ortbach, K., & Niehaves, B. (2015). The intensified blurring of boundaries between work and private life through IT consumerization. In Proceedings of the 23rd European conference on information systems (ECIS 2015) (pp. 1–17). Münster: AIS. Kossek, E. E., Baltes, B. B., & Matthews, R. A. (2011). How work-family research can finally have an impact in organizations. Industrial and Organizational Psychology, 4(03), 352–369. Kossek, E. E., Lautsch, B. A., & Eaton, S. C. (2006). 
Telecommuting, control, and boundary management: Correlates of policy use and practice, job control, and work–family effectiveness. Journal of Vocational Behavior, 68(2), 347–367.

Boundary Management   319 Kossek, E.  E., Ruderman, M.  N., Braddy, P.  W., & Hannum, K.  M. (2012). Work-nonwork boundary management profiles: A person-centered approach. Journal of Vocational Behavior, 81(1), 112–128. Kreiner, G.  E., Hollensbe, E.  C., & Sheep, M.  L. (2009). Balancing borders and bridges: Negotiating the work-home interface via boundary work tactics. Academy of Management Journal, 52(4), 704–730. Matthews, T., Pierce, J., Road, H., Jose, S., & Tang, J. (2009). No smartphone is an island: The impact of places, situation and other device on smart phone use. IBM RJ10452., 10,452, 1–10. Mazmanian, M. & Erickson, I. (2014). The product of availability: Understanding the economic underpinnings of constant connectivity. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 763–772). Toronto: ACM. Mazmanian, M., Orlikowski, W. J., & Yates, J. (2005). CrackBerrys: Exploring the social implications of ubiquitous wireless email devices. In Designing ubiquitous information environments: Socio-technical issues and challenges (pp. 337–343). Boston, MA: Springer. Mazmanian, M., Yates, J., & Orlikowski, W. (2006). Ubiquitous email: Individual experiences and organizational consequences of Blackberry use. In Academy of Management proceedings, Meeting abstract supplement, 1, D1–D6. Briarcliff Manor, NY 10510: Academy of Management. Moen, P. (2011). From “work-family” to the “gendered life course” and “fit”: Five challenges to the field. Community, Work, & Family, 14(1), 81–96. Nardi, B. A., Whittaker, S., & Bradner, E. (2000). Interaction and outeraction: Instant messaging in action. In Proceedings of the 2000 ACM conference on computer supported cooperative work (pp. 79–88). Philadelphia, PA: ACM. Nippert-Eng, C. (1996). Calendars and keys: The classification of “home” and “work.” Sociological Forum, 11(3), 563–582. Nouwens, M., Griggio, C. F., & Mackay, W. E. (2017). WhatsApp is for family, Messenger is for friends: Communication places in app ecosystems. In Proceedings of the CHI conference on human factors in computing systems (pp. 727–735). Denver, CO: ACM. The NPD Group. (2017). Smartwatch ownership expected to increase nearly 60 percent into 2019. Retrieved February 18, 2018 from https://www.npd.com/wps/portal/npd/us/news/ press-releases/2017/us-smartwatch-ownership-expected-to-increase-nearly-60-percentinto-2019/ O’Hara, K. P., Massimi, M., Harper, R., Rubens, S., & Morris, J. (2014). Everyday dwelling with WhatsApp. In Proceedings of the 17th ACM conference on computer supported cooperative work & social computing (pp. 1131–1143). Baltimore, MA: ACM. Olson-Buchanan, J. B., & Boswell, W. R. (2006). Blurring boundaries: Correlates of integration and segmentation between work and nonwork. Journal of Vocational Behavior, 68(3), 432–445. Oulasvirta, A., Petit, R., Raento, M., & Sauli, T. (2007). Interpreting and acting on mobile awareness cues. Human-Computer Interaction, 22(October), 97–135. Park, Y., Fritz, C., & Jex, S. M. (2011). Relationships between work-home segmentation and psychological detachment from work: The role of communication technology use at home. Journal of Occupational Health Psychology. 16(4), 457–467. Patterson, D.  J., Baker, C., Ding, X., Kaufman, S.  J., Liu, K., & Zaldivar, A. (2008). Online everywhere: Evolving mobile instant messaging practices. In Proceedings of the 10th international conference on ubiquitous computing (pp. 64–73). Seoul: ACM. Pew Research. (2018). 
Demographics of mobile device ownership and adoption in the United States. Pew Research Center. Retrieved February 18, 2018 from http://www.pewinternet. org/fact-sheet/mobile/

320   Marta E. Cecchinato and Anna L. Cox Pielot, M., Church, K., & de Oliveira, R. (2014). An in-situ study of mobile phone notifications. In Proceedings of the international conference on human-computer interaction with mobile devices & services (pp. 233–242). Toronto: ACM. Pyöriä, P. (2005). The concept of knowledge work revisited. Journal of Knowledge Management, 9(3), 116–127. Radicati Group. (2015a). Instant messaging statistics report, 2015–2019. Retrieved February 18, 2018 from https://www.radicati.com/wp/wp-content/uploads/2015/03/Instant-MessagingStatistics-Report-2015-2019-Executive-Summary.pdf Radicati Group. (2015b). Mobile Statistics Report, 2015–2019. Retrieved February 18, 2018 from https://www.radicati.com/wp/wp-content/uploads/2015/02/Mobile-Statistics-Report-20152019-Executive-Summary.pdf Rice, R.  E. (2017). Flexwork, boundaries, and work-family conflicts: How ICTs and work engagement influence their relationship. In G. Hertel, D. Stone, R. D. Johnson, & J. Passmore (Eds.), Handbook of the psychology of the internet at work (pp. 175–193). London, UK: Wiley Blackwell Industrial & Organizational Psychology Series. Rice, R. E., Evans, S. K., Pearce, K. E., Sivunen, A., Vitak, J., & Treem, J. W. (2017). Organizational media affordances: Operationalization and associations with media use. Journal of Communication, 67(1), 106–130. Roche, H.  G. (2015). Managing work and life: The impact of framing. Unpublished Ph.D. Dissertation. Seattle Pacific University. http://digitalcommons.spu.edu/iop_etd/4/ Santosa, S. & Wigdor, D. (2013). A field study of multi-device workflows in distributed workspaces. In Proceedings of the 2013 ACM international joint conference on pervasive and ubiquitous computing (pp. 63–72). Zurich: ACM. Shen, J., Brdiczka, O., & Ruan, Y. (2013). A comparison study of user behavior on Facebook and Gmail. Computers in Human Behavior, 29(6), 2650–2655. Specht, B. (2018). The 2017 email client market share [Infographic]. Litmus Software, Inc. Retrieved February 19, 2018 from https://litmus.com/blog/the-2017-email-client-marketshare-infographic Stuart, K. (2014). German minister calls for anti-stress law ban on emails out of office hours|Technology. The Guardian. Retrieved February 18, 2018 from https://www.theguardian.com/technology/2014/aug/29/germany-anti-stress-law-ban-on-emails-outof-office-hours Sturges, J. (2012). Crafting a balance between work and home. Human Relations, 65(12), 1539–1559. https://doi.org/10.1177/0018726712457435 Turel, O. & Serenko, A. (2010). Is mobile email addiction overlooked? Communications of the ACM, 53(5), 41. Williams, K. J., Suls, J., Alliger, G. M., Learner, S. M., & Wan, C. K. (1991). Multiple role juggling and daily mood states in working mothers: An experience sampling study. Journal of Applied Psychology, 76(5), 664–674.

SECTION 4

ORGANIZATIONAL CONTEXTS

chapter 11

ESRC Review: Economy and Organizations
Simeon J. Yates, Paul Hepburn, Ronald E. Rice, Bridgette Wessels, and Elinor Carmi

Introduction This chapter provides an overview of the analyses of the Delphi process, literature review, and relevant workshop material for what was originally defined as the Economy and Sustainability domain. This domain proved difficult to define via the Delphi work—it both broadened out to wider economic issues and overlapped with many of the other domains. Interestingly, the interpretation of “sustainability” remained predominantly within the economic realm rather than in relation to social, environmental, or climate change realms. The chapter first explores the results of the various digital humanities analyses of the literature and the review of methods and theory. The chapter then sets out the results of the Delphi process, concluding with the key questions, topics, and challenges identified by the process. The final section presents the recommendations for areas of future study. The initial ESRC scoping questions for this domain were:
• How do we construct the digital to be open to all, sustainable, and secure?
• What impacts might the automation of the future workforce bring?

Initial Comments This domain proved the most difficult for which to collect data. Response rates to the Delphi process were low, and the data provided were more limited than in other domains. One of the major current social, political, and economic concerns for this domain is the impact of augmentation and automation, although that is notably absent from the analysis of prior literature. The potential impact of automation and augmentation was extensively addressed by two dedicated workshops jointly funded by the ESRC and the UK Defence Science and Technology Laboratory, and by the ESRC and US National Science Foundation, respectively. We speculate that a review of this domain undertaken in the coming years would see this topic emerge as a major theme. Chapter 24 details more fully the outcomes from these two workshops. In the present chapter, we report the Delphi data in full, but we caution that this is not as large or robust a data set as that provided for the other domains, so the data sets used for the consultation workshop were more limited; therefore, the workshop participants provided additional commentary. Although very useful, this makes the results here dependent on a smaller set of mainly UK expertise.

Literature Analysis

The literature analysis is designed to identify two sets of data. The first data set comprises the key topics within the existing literature, which allows comparison with the areas of importance identified by the Delphi review. The second data set is a content analysis of the literature to explore the predominance of specific theories, methods, and approaches.

Topics

Despite the lower number of Delphi responses, the recommended literature was of comparable size to the other domains. Table 11.1 lists the 10 most common (2% or more of the identified cases) concepts identified in the Round 1 literature. Table 11.2 lists concept pairs.

Table 11.1  Analysis Concepts Ranked

Concepts         Percent
Information      13.4
Knowledge        10.3
Computer         9.2
Internet         6.6
Communication    6.0
Work             5.1
Datum            4.9
Medium           3.1
Chain            2.1
Organization     2.0

Table 11.2  Concept Pairings—Main and Secondary Concepts

All the literature collected from both rounds was analyzed using Wordstat. Wordstat identified 13 topics, presented in Table 11.3. In this case the two analyses do not strongly overlap except in the areas of digital skills and product development. This may reflect substantive differences in the round 1 and round 2 data sets, but as noted in chapter 2 (Methods), these are new and to an extent experimental methods. Further research work is needed to explore the different representations that alternative concept and topic modelling tools provide. We would also note that the idea of “sustainability” was predominantly interpreted as the development of “technologies to support environmental sustainability” such as smart meters. Also, it is clear that our round 2 respondents took a broader “political economy” definition into account. Finally, a considerable number of identified texts overlapped with the Citizenship and Participation (chapter 16), Communities and Identities (chapter 14), and Governance and Security (chapter 22) literature. Looking at the underlying keywords in each analysis, seven key areas stand out (see Table 11.3):

Table 11.3  Wordstat Analysis of Topics

Social capital (eigenvalue 10.64; freq 30,941; cases 546; 96.1% of cases): SUPPORT; MEMBER; GROUP; SOCIAL; MEDIAT; COMMUN
Supply chains (eigenvalue 3.60; freq 9,442; cases 454; 79.9% of cases): SUPPLI; JURISDICT; SUPPLIER; INTANG; CUSTOM; TAXAT; CHAIN; VAT; BUSI
Smart energy (eigenvalue 3.27; freq 6,168; cases 446; 78.5% of cases): STRENGER; YOLAND; ENERGI; SMART; EVERYDAI; LIFE
Economic growth (eigenvalue 2.81; freq 15,685; cases 521; 91.7% of cases): MARKET; NATION; GROWTH; INDUSTRI; COMPETIT
Democracy and public sphere (eigenvalue 2.36; freq 17,342; cases 529; 93.1% of cases): DEMOCRACI; SPHERE; POLIT; DEMOCRAT; CIVIC; CITIZEN; PUBLIC; MEDIA
Urban migration and mobile (eigenvalue 2.13; freq 4,169; cases 401; 70.6% of cases): MIGRANT; CHINA; URBAN; MOBIL; CHINES; PHONE; CITI; CLASS; ICT; THRIFT; LEYSHON; FINANCI; GEOGRAPHI; SPACE
Facebook and Internet use (eigenvalue 1.96; freq 27,056; cases 537; 94.5% of cases): FACEBOOK; USER; ONLIN; SITE; WEB; INTERNET; GOOGL; NETWORK
Digital education and skills (eigenvalue 1.91; freq 7,928; cases 478; 84.2% of cases): EDUC; SKILL; CHILDREN; ADULT; HOUSEHOLD; LITERACI; GENDER; INTERNET; SURVEI
Marxist analysis (eigenvalue 1.75; freq 6,447; cases 388; 68.3% of cases): MARX; CAPIT; CAPITALIST; LABOUR; FUCH
Twitter (eigenvalue 1.66; freq 1,478; cases 109; 19.2% of cases): TWEET; HASHTAG; TWITTER
Product and technology development (eigenvalue 1.63; freq 69,507; cases 555; 97.7% of cases): DEVELOP; PRODUCT; TECHNOLOGI; KNOWLEDG; DESIGN; COLLABOR; AR; PRACTIC; SOFTWAR; THI
Intellectual property (eigenvalue 1.57; freq 9,190; cases 507; 89.3% of cases): PROPERTI; INTELLECTU; LAW; GOVERN; PRIVAT
Taxation (eigenvalue 1.55; freq 692; cases 47; 8.3% of cases): TAX; OECD; BEP; TAXAT; DIGIT; ECONOMI; JURISDICT; GST; VAT

1. Product and technology development
2. Social and economic capital
3. Facebook and Internet use
4. Democracy and public sphere
5. Economic growth and change
6. Intellectual property
7. Digital education and skills

Of these, Facebook use and democracy and the public sphere have been dealt with in the Communication and Relationships (chapter 8), Community and Identities (chapter 14), and Citizenship and Politics (chapter 16) domains. Looking over time at the smaller curated literature for the subject "economy," we can see an early focus in the literature (2000–2004) on information as a product: goods and production; costs; information as property and information rights; markets, transactions, strategy, law, organizations, and firms; with a small focus on technologies such as networks, websites, the web, and channels. By 2012–2016, there was almost a complete shift to emphasizing knowledge seeking, skills and experience, communication, uses (purposes such as health and support, and attitudes such as anxiety and ambiguity), and more global issues such as energy and the information society. In general, however, the pattern of concepts over time becomes quite diverse. Figures 11.1 and 11.2 display the changing nature and frequency of concept pairs in the "economy" subject for 2000–2004 and 2012–2016.1


Figure 11.1  Economy 2000–2004: Most frequent concept pairs. Note: Bubble chart showing frequency of the top 50 concept pairs, based on concept modeling (described in Chapter 2) within the Domain for 2000–2004. The diameter of each circle reflects the frequency of the concept pair, with the most frequent pair beginning in the center.


Figure 11.2  Economy 2012–2016: Most frequent concept pairs. Note: Bubble chart showing frequency of the top 50 concept pairs, based on concept modeling (described in Chapter 2) within the Domain for 2012–2016. The diameter of each circle reflects the frequency of the concept pair, with the most frequent pair beginning in the center.

In the workshops and stakeholder engagements, the impacts of automation, artificial intelligence, and augmentation on the economy and society were often highlighted (these issues are dealt with elsewhere; e.g., chapters 12 and 24). This was a strong societal and media topic at the time of this work, so the ESRC commissioned the team to run two further workshops on these topics. We therefore introduce some themes from the stakeholder workshops run before and during the project. In these, SMEs (small and medium-sized enterprises) and corporate and government stakeholders predominantly raised issues with regard to product and technology development and uptake, the use of social media and Internet platforms (Facebook, Twitter, Google), economic growth, intellectual property, and digital education and skills. Given the variations between the analyses of the literature and the additional concerns of stakeholders, picking key topics in the literature was challenging. We have therefore focused here on the topics with a strong economic aspect that point to wider social issues:


1. Digital technology uptake by both business and consumers
2. Social and economic capital of citizens
3. Digital skills
4. Economic growth and change

Digital technology uptake. In the ESRC and NSF workshop, one of the economics contributors noted that "small investments in digital innovations can lead to disproportionate accumulations in wealth." This comment reinforces one of the key features of the digital economy: the extent to which the initial development of digital systems can be relatively low cost yet yield high economic returns. This has been the underlying drive behind the substantive venture capital investment in digital start-ups. At the same time, governments have identified basic and advanced digital skills and digital infrastructure investments as key to developing national productivity (for example, the UK government's digital and industrial strategies at the time of writing). It is therefore clear that digital technologies have changed, and will continue to change and even transform, many aspects of the economy and of business practices and processes. Social research therefore needs to understand and examine the processes by which digital products are developed, deployed, and taken up by both businesses and consumers.

A key business context where uptake of digital technologies is a challenge is that of SMEs. In the UK, close to 30% of SMEs are limited users of the Internet (see chapter 13). This is a concern for many countries. For example, Ifinedo (2011) examines Canadian SMEs and the reasons behind their use (or non-use) of Internet and e-business technologies (IEBT). The study concluded that the primary factors influencing adoption of the Internet are perceived benefits (the most salient factor), management commitment/support, and external pressure. At the same time, there is a lack of awareness and knowledge of Internet services, accompanied by a lack of vendor support and of access to financial support from the Canadian authorities. The issue of governmental intervention to support the development of a "digital society" is considered in chapters 22 and 23. Within organizations, Ifinedo notes that digital leadership is key to the uptake of digital technologies: "The views of the participating SMEs seem to be indicating that top executive support is considered crucial for IEBT to be accepted in the adopting organization" (Ifinedo, 2011, p. 269).

Another key issue is how digital technologies change how organizations function, are managed, and perceive themselves. There is a large body of literature exploring the particular and general implications of specific technologies, or of digital tools in general, that we cannot explore in depth here. But as we note in the content analysis to come, no clear theoretical position or approach stood out in the literature analyzed. There are, though, examples of work that seeks to apply digital- or technology-oriented theory to these issues. As an interesting take on this issue, Flyverbom et al. (2016), in their introduction to a special issue on "visibility" in the digital age, argue that in order to understand how contemporary organizations operate in a digital context, we need to understand how they manage the "visibilities" provided by digital technologies. That is, how do they make things transparent or keep things hidden within the organization? This in part refers to the extent to which such technologies allow for the observation of work (content) and workers (connections). Flyverbom et al. identify four central affordances in digital technologies that are enacted in the contemporary workplace: visibility, persistence, editability, and association. They argue that

taking an affordance perspective on digital technology use for understanding visibility management seems appropriate in that it allows scholars not only to focus on the features of technologies that enable visibility but also to simultaneously probe how those features interact with and produce people's goals in ways that encourage them to orient toward visibilities in entirely new ways.  (Flyverbom et al., 2016)

They conclude:

Visibility is a root affordance in the digital age that helps to enable other branch affordances, including persistence, editability, association, and likely many others. In other words, these other affordances are possible because of the visibility affordance.  (Flyverbom et al., 2016)

Similarly, there is a need to understand and theorize the complex networks of people and systems created by digital technologies, within companies, among citizens, and across society and the economy. Contractor, Monge, and Leonardi (2011) provided a typology for such multidimensional networks that draws from and builds on actor-network theory and includes multiple kinds of nodes and multiple kinds of relations, involving both human and nonhuman actors. At the core of their argument is the claim that failing to conceptualize human actors and machines or technologies in complex multifaceted networks as interrelated in dynamic processes would produce incomplete and partial research or theory. They argue:

. . . making technologies endogenous to networks will offer researchers the ability to begin thinking about networks composed of different types of nodes (e.g., persons, databases, books, etc.), and about where the relationships among these varying nodes also differ (e.g., one might have a friendship relationship with another person, but an information-retrieval relationship with a database). We call these "multidimensional networks."  (Contractor, Monge, & Leonardi, 2011, p. 685)

Adoption of technologies by consumers has also been extensively studied across a range of disciplines. Such work often blends into the policy domain, as there is a focus on issues such as digital inclusion, service provision, innovation, and market development. An example is LaRose et al.'s (2012) study, which examines broadband adoption and use and addresses the design and monitoring of sustainable broadband adoption interventions. Again, this is an area where a variety of theories are employed. The authors develop a new theory that takes people's psychological considerations into account, rather than only demographic factors, when thinking about broadband adoption. Another concept considered, as an improvement on Diffusion of Innovations theory, is self-efficacy, as they argue that "habitual Internet use can be expected to provide additional opportunities to observe and directly experience the outcomes of broadband adoption while bolstering beliefs about individual abilities to use the Internet effectively" (LaRose et al., 2012). They conclude and reinforce this argument by stating that their "results suggest that demographic variables such as age, income, and race play a relatively smaller role in intentions to adopt broadband, whereas socio-cognitive theory variables such as self-efficacy and habit strength play a relatively larger role" (LaRose et al., 2012).

Social and economic capital. The other side of the discussion around the socioeconomic impact of digital media and technologies is that of growing "digital divides." Though discussion of digital inequalities can be found in all the domains reported in this book, a good number of papers address this in terms of economic, social, or cultural capital (see also chapters 5 and 15). For example, Helsper (2012) proposes a theoretical model that links social and digital exclusion. She shows how fields of social, economic, cultural, and personal resources influence digital exclusion. Helsper further argues that offline exclusion fields influence digital exclusion and are mediated by access, skills, attitude, or motivation. But Helsper also contributes to the debate by noting how digital exclusion influences social exclusion, which she categorizes in terms of four digital impact mediators: relevance (usefulness), quality of experience (ease of use), ownership (agency and empowerment), and sustainability (social and financial). Helsper argues:

The four top level fields of offline and digital exclusion relate to each other; an individual who is excluded from one is also likely to be excluded from another. Nevertheless, the fields are separate constructs addressing different (macro and micro) aspects of exclusion. These economic, cultural, social, and personal fields are operationalized through underlying specific resources that are similarly interrelated.  (Helsper, 2012, p. 417)

This is of course an issue across the globe. As an example, Cartier, Castells, and Qiu (2005) examine the category of the "information have-less," which describes the millions in China's lower-income groups, such as rural migrants, pensioners, and laid-off employees, who sit in a gray zone of China's digital divide. Because of their lack of financial resources, they have to use inexpensive ICTs such as Internet cafes, prepaid phone cards, and limited smart mobile phones. People in this category tend to use "have-less" ICT, which has three characteristics: inexpensive technologies and services; limited mobility and low functional choice (usually constrained by time and space); and limited ability to perform critical informational functions. They note:

The growth of have-less ICTs in China reflects the country's economic boom since the 1980s, which is characterized by increasing income inequality . . . Structural inequality and institutional constraints can systematically keep the have-less from accessing regular and high-end ICT services.  (Cartier, Castells, & Qiu, 2005, pp. 22–23)

Skills. Within the debate on digital inequalities, a key concern is not just material access but also skills. A considerable focus of such work is on the acquisition of digital skills relevant both to work and to aspects of digital exclusion. Van Deursen and Van Dijk (2010), building on their prior work, point to four types of digital skills:

1. Operational Internet skills
2. Formal Internet skills
3. Information Internet skills
4. Strategic Internet skills

Using two surveys of the Dutch population, Van Deursen and Van Dijk consider how the levels of such skills vary with key demographics, concluding that

the original digital divide (defined as the gap between people who have and do not have physical access to computers and the internet) has developed a second divide that includes differences in the skills to use the internet . . . In digital divide research, the conclusion that operational and formal internet skills are not sufficient for an effective use of the internet so far only received little attention. Information and strategic internet skills are also required. In contemporary (and future) information society these skills increasingly determine people's positions in the labor market and in social life. Unfortunately, these skills appear to be the most problematic and a large part of the Dutch population seems to be struggling to equip themselves with the skills they need to participate in contemporary society.  (Van Deursen & Van Dijk, 2011, p. 908)

Looking at a specific workplace context, Van Deursen and Van Dijk (2010) examine Dutch civil servants' Internet skills (operational, information, and strategic) across different types of civil servant roles (administrators, executives, and policy advisors). As with the general population, the authors found that these civil servants do not perform well when it comes to skills involving information and strategic tasks. They conclude that "the levels of operational and formal internet skills are higher than the levels of information and strategic internet skills" (Van Deursen & Van Dijk, 2010, p. 140). They also noted key variations by age:

Age and position appear most important for the civil servant's level of operational and formal internet skills. Younger civil servants performed better than their older counterparts, and the executive employees performed worse than policy advisors and administrators.  (Van Deursen & Van Dijk, 2010, pp. 140–141)

Economic change and growth. A key area of concern is, of course, the specifics of the economic change and growth that come with the use of digital technologies and media. This includes the challenges of creating global markets for digital technologies, how such technologies and their users change both industries and markets, and shifts from "material" to "digital or knowledge" products. This is an area of research that has been ongoing since the rise of ICTs, predating much of the review work reported here. A part of such work has been the understanding of how technology standards function to underpin markets for digital technologies, or, conversely, the use of digital technologies to underpin markets. For example, David and Steinmueller's (1996) article examined Global Information Infrastructures (GII) and the way that standards influence potential contributions to international trade. They point to a fundamental need to reconcile various information and communication technology standards:

Technical compatibility standards play an essential role in bringing about international convergence in the production of these investment goods, and, thereby, tend to promote competition in telecommunications equipment markets.  (David & Steinmueller, 1996, p. 821)

Especially because the GII is not under the governance of any one country, this brings new challenges whereby parties try to negotiate power and control through common agreements on standards. The authors argue that there are three economic reasons behind problems around interoperability: innovation, individuals' mass adoption of a particular technology, and the attractiveness of "super-setting" (adding features to a popular product on top of accepted standards). Somewhat supported by history, Sarkar, Butler, and Steinfield (1998) argued that intermediaries, termed cybermediaries, and multi-organization structures, which we may now consider part of the "platform economics" of organizations and platforms such as Amazon, Netflix, or Facebook, will play a key role in electronic markets:

. . . in electronic marketplaces, unique features of environment, the nature of the underlying technology, and other traditional economies of scope and scale combine to make it unlikely that the average production firm will be able to perform channel functions as efficiently as specialized cybermediaries.  (Sarkar, Butler, & Steinfield, 1998, p. 217)

The literature also examines the question of a shift from manufacturing to data- or knowledge-based economics (see also chapter 1). Steinmueller (2002) points to the transition in the structure of economic activities towards a knowledge-based economy, and its implications for social development. The special characteristics of a knowledge-based economy, in which information is the main economic product, mean that there is a need for new analyses and measures of economic growth that take their influence into account. Steinmueller provides a brief and simplified summary of the unique characteristics of information:

Information, in turn, has important economic properties not shared by other economic commodities, namely: (1) nonexcludability (i.e., an individual's possession of information does not prevent another from using it as well); (2) non-rivalry in use (providing a copy of information does not reduce information 'holding'); and (3) low marginal cost of reproduction (once the first copy of information has been produced, subsequent copies are much cheaper to reproduce).  (Steinmueller, 2002, p. 144)

Some of the problems that arise are searching for, filtering, and evaluating information as part of knowledge management. There are also regulatory issues that arise when trying to commodify information, such as intellectual property rights and competition policy. Furthermore, there is a need to re-examine individual versus collective knowledge production. The openness of the Internet and the rapid ability to exchange information therefore potentially undermine more traditional aspects of material-based markets. This tension between the ease of reproduction and market value has been played out over the last decade in a range of industries, especially media industries, which are based on the creation, distribution, commodification, and, more recently, collection of information (see Rice, 2008). Steinmueller points out that restriction of access and digital "ownership" rights are required in such markets:

Information can be transformed from a public good into an economic commodity to the extent that its reproduction can be limited. The most direct way to limit reproduction is to assign property rights in information. By creating 'legitimate owners' of information, the initial conditions are in place for the operation of a market.  (Steinmueller, 2002, p. 144)

As a counterpoint to the optimistic and marketplace orientations of much of this literature, there is also a strong element of critical assessment of the socio-economic impact of digital media, mainly focused around the work of key authors. Clearly the work of Castells (2011) falls into this category. Literature that specifically takes a critical social science view on the digital can be found in a number of works by Fuchs. For example, Fuchs (2016) provides a comparative political-economic analysis of China's social media, specifically Baidu (search engine), Weibo (micro-blogging), and Renren (social network), alongside the dominant US platforms Google, Facebook, and Twitter. One of the key differences pointed out by Fuchs is that the Chinese state owns three of the dominant platforms, while two of them use advertising, which means that commercial and profit logics are guiding the development of the Chinese Internet in a similar manner to the US case. The work also challenges the common belief that only Chinese platforms are monitored and censored; Fuchs argues that both Western platforms and Chinese sites employ Internet filtering and control mechanisms. In addition, media companies in both the West and China enjoy low or no tax regimes. Furthermore, both the US and Chinese platforms use relatively similar terms of use and privacy policies that enable them to use and commodify people's personal data for various commercial purposes. As Fuchs notes:

User data are both in China and the West's surveillance-industrial complexes first externalised and made public or semi-public on the Internet in order to enable users' communication processes, then privatised as private property by Internet platforms in order to accumulate capital and finally particularised by secret services and the police who bring massive amounts of data under their control that are made accessible and analysed with the help of profit-making security companies.  (Fuchs, 2016, pp. 30–31)

He further points out the strong links between the various elements of the economy, especially the information and finance sectors, in both China and the United States:

This circumstance is an indication that the capitalist information economy is both in China and the USA not independent from the finance industry, but dependent on its investments, support and loans, which results in an interconnection of informational capitalism and finance capitalism and a dependence of informational capital on finance capital.  (Fuchs, 2016, p. 35)

Theory, Method, and Approach

As in the other review chapters, this analysis builds on Borah (2017), though, as noted in the introduction, the data collected in this area were not as strong as in the other domains. Most of the analyzed papers (59%) were inductive, either describing findings or building theory. The remainder were deductive, undertaking theory testing or assessment (see Table 11.4). Just under a third of the papers (30%) undertook primary data collection, with 55% being discursive reviews of, or reflections on, existing research (see Table 11.5). Concerning the role of theory, only actual use of theory for the purposes of design or analysis was coded; general references to prior work and theory were not. The majority of papers (76%) did not utilize theory in the analysis of data. The main discipline from which theory was taken was sociology (72% of all theory used). There was considerable variety in the specific theories applied across disciplines and no clear preference; no one theory appeared more than three times. The main research method was the literature review (36%; Table 11.6). The majority of the empirical work focused on specific groups, with a limited number of general population studies (see Table 11.7). No papers were based on the use of big data. As noted earlier, this domain may have the least reliable Delphi data set and therefore the least explicit starting point for the literature collection, although the identified literature data set is of a similar scale to those of all the other domains.

Table 11.4  Epistemological Approach

                                           Percent
Deductive (testing of existing theory)     41.3
Inductive (conclusions driven by data)     58.6

Table 11.5  Empirical Approach

                                                    Percent
Discursive/descriptive (no new data or theory)      28.9
Primary empirical (data collected and analyzed)     30.4
Secondary empirical (analysis of existing data)     14.4
Theoretical (synthesis of current or prior work)    26.4

Table 11.6  Research Method

                                           Percent
Literature review (general or narrative)   36.2
Survey                                     11.0
Theory building                            11.0
Interview(s)                               9.2
None                                       8.3
Other                                      6.8
Ethnography                                6.1
Content analysis                           5.8
Focus groups                               4.0
Experiment                                 1.2
Social network analysis                    .3

Table 11.7  Study Population

                      Percent
Case study(ies)       1.5
General population    8.0
Specific group        34.8
No study group        56.0

The literature appears to be predominantly reflective and review-based, as opposed to being based on empirical data collection and testing. It also appears to be strongly sociological, as reflected in the strong political economy aspects of the topic analysis. Selecting areas for future work is therefore more problematic here, especially as the issue of the automation of work has been addressed separately.

Delphi Review

The following sections summarize the results of the Delphi process for the domain, covering suggested scoping or research questions, key topics to address within these questions, and key challenges to researching these questions.


Scoping Questions

The Delphi review responses indicated that the two ESRC scoping questions were deemed broadly appropriate for the domain:

• How do we construct the digital to be open to all, sustainable, and secure?
• What impacts might the automation of the future workforce bring?

Only a limited number of additional questions were provided, so they were not grouped or coded:

• How is the digital economy constructed through economic, cultural, and political processes, and how could it be constructed to enable greater participation and sustainability?
• How can all participating actors in the digital economy be guided and assisted to ensure it is open to all stakeholders, sustainable, and secure?
• How can the digital and society be shaped in order to be sustainable and participatory and to foster co-operation and inclusion?
• What interventions are feasible and desirable in order to shape the digital according to any set of preferences?
• How should those preferences be established? How should those preferences be negotiated, taking into account the global nature of digital?
• Under which conditions and in what contexts is it desirable to construct a digital world that maximizes openness, and in which contexts is it desirable to construct a relatively closed digital environment?
• What conditions and problems can hinder the establishment of a participatory, co-operative, sustainable, inclusive information society and digital society?
• In a given context, which approaches to openness are sustainable from a variety of stakeholder points of view?
• What issues of security arise in each of these contexts that then limit the openness of the digital world?

As noted previously, we have introduced some themes from the stakeholder workshops (Digital Leader Salons) run before and during the project. In these, SME, corporate, and government stakeholders predominantly raised issues with regard to product and technology development, the use of social media and Internet platforms (Facebook, Twitter, Google), economic growth, intellectual property, and digital education and skills. The confirmatory survey asked respondents to select the most important of these, presented in Table 11.8. The three most frequently mentioned scoping questions (24% each for the first two and 19% for the third) involved (1) the shaping and developing of the digital economy, especially in ways that promote participation and sustainability, (2) the shaping of the interrelations of the digital and society to improve sustainability, cooperation, and inclusion, and (3) the conditions and problems hindering such shaping.

Table 11.8  Delphi Review Scoping Questions

Question (Percent)
• How is the digital economy constructed through economic, cultural, and political processes, and how could it be constructed to enable greater participation and sustainability? (23.8)
• How can the digital and society be shaped in order to be sustainable and participatory and foster co-operation and inclusion? (23.8)
• What conditions and problems can hinder the establishment of a participatory, cooperative, sustainable, inclusive information society and digital society? (19.0)
• What interventions are feasible and desirable in order to shape the digital according to any set of preferences? How should those preferences be established? How should those preferences be negotiated, taking into account the global nature of digital? (14.3)
• Under which conditions and in what contexts is it desirable to construct a digital world that maximizes openness, and in which contexts is it desirable to construct a relatively closed digital environment? (9.5)
• In a given context, which approaches to openness are sustainable from various stakeholders' points of view? What issues of security arise in each of these contexts, which then limit the openness of the digital world? (9.5)
• How can all participating actors in the digital economy be guided and assisted to ensure that the digital economy is open to all stakeholders and is sustainable and secure? (0)

The consultation workshop noted potential gaps in the suggested scoping questions and offered the reworked question:

• How do specific digital technologies impact SMEs, entrepreneurship, business opportunities, and collaborations; labor markets, work, and productivity; the nature of employment, the gig economy, self-employment, job insecurity, and cybercrime; taxation; gig-economy platforms (Uber), Amazon, eBay, and online selling; and the rural and informal economy and regional or geographical implications (e.g., specialist regions)?

Topics

The topics identified in the Delphi review were coded into 14 categories, listed in Table 11.9. The most frequently mentioned topic was the role and impact of major corporate platforms, followed by disruptive technology, environment and sustainability, forms of digital labor, and governance. Table 11.10 presents the ranked importance of these from the confirmatory survey, which closely matches the initial Delphi list. The consultation workshop offered a number of additional topics, some of which overlap with those presented earlier; these are listed after Tables 11.9 and 11.10.

Table 11.9  Key Topics Ranked by Percent of Cases

Topic                                           Percent
Role and impact of major corporate platforms    31
Disruptive technology                           12
Environment and sustainability                  8
Forms of digital labor                          8
Governance                                      8
Digital divides                                 4
Digital literacy                                4
Finance and capital                             4
Methods                                         4
Politics                                        4
Productivity                                    4
Public vs. private                              4
Surveillance                                    4
Theory                                          4

Table 11.10  Key Topics Ranked by Importance from Delphi Survey

Topic / Very important / Important / Neutral / Unimportant / Very unimportant
Role and impact of major corporate platforms: 85.7% / 14.3% / 0.0% / 0.0% / 0.0%
Forms of digital labor: 71.4 / 28.6 / 0.0 / 0.0 / 0.0
Environment and sustainability: 71.4 / 0.0 / 28.6 / 0.0 / 0.0
Disruptive technology: 57.1 / 14.3 / 28.6 / 0.0 / 0.0
Governance of digital economy: 42.9 / 42.9 / 14.3 / 0.0 / 0.0

• Impacts of digital labor on people's life experience; impacts on firms of digital platforms
• Technology adoption in organizations
• Role of digital monopolies and large corporations; digital impacts on the state: taxation, feedback to society
• Inequality and justice, social divides, financing, investment, crowd funding, lending
• Implications of the digital for energy/resource use (e.g., increased paper consumption)
• Enabling of sustainability through digital means, such as new platforms and apps
• Regional urban/rural development

Challenges

The challenges in undertaking research in this area identified by the Delphi panel were grouped into six categories. Table 11.11 lists these categories, ranked by the number of coded items, with those deemed to be domain specific by the consultation workshop marked in bold.

Table 11.11  Challenges Ranked by Percent of Cases

Challenge                                              Percent
New methods and tools to study the digital economy    47
Access to data on the digital economy                 13
Ethics                                                13
Representativeness of data                            13
Sustainability and digital technologies               7
Understanding impact and development of algorithms    7

Note: Domain-specific challenges in bold.

Table 11.12  Challenges Ranked by Importance from Delphi Survey

Challenge / Very important / Important / Neutral / Unimportant / Very unimportant
Sustainability and digital technologies: 57.1% / 42.9% / 0.0% / 0.0% / 0.0%
Understanding the impact and development of algorithms: 42.9 / 42.9 / 14.3 / 0.0 / 0.0
Access to data on the digital economy: 42.9 / 14.3 / 42.9 / 0.0 / 0.0
Ethics: 28.6 / 28.6 / 42.9 / 0.0 / 0.0
New methods and tools to study digital economy: 14.3 / 42.9 / 28.6 / 14.3 / 0.0
Representativeness of big data on digital economy and society: 14.3 / 42.9 / 28.6 / 14.3 / 0.0

By far the highest percentage of cases involved new methods and tools to study the digital economy, followed at much lower levels by access to data on the digital economy, ethics, and representativeness of data. Table 11.12 shows their ranking by the confirmatory survey. There is an inverse relationship between these two lists, but given the low response rates, we should not infer too much from this. The consultation workshop proposed a set of further challenges, some of which overlap with those mentioned previously. A number of these are reflected in the cross-cutting challenges discussed in chapter 25. Here are the challenges, with domain-specific ones in italics:

• Social science research needs to take place within a more technology-oriented area.
• The funding landscape is inevitably shaped by the status quo and current economic modes, possibly making it harder for radically different modes to be researched.
• Concerns over the allure of "novelty," as some "older topics" may also be highly needed.
• Measuring the overall impact of a digital technology on a business is very difficult.
• Is there a bias towards quantitative data?
• Similarly, measuring the scale and scope of new ways of working and consuming is difficult.
• Fluctuating and differentiated prices make certain quantifications challenging (e.g., the consumer price index).
• Challenges around interdisciplinary and cross-sector working.
• Incorporating new forms of data, limited resources, and extracting information.

Conclusion

Given the more limited data for this domain, drawing both broad and in-depth conclusions is harder than for the other domains. The literature and the various inputs from the Delphi process, stakeholders, and review workshops tended to focus on social and organizational aspects. The theme of sustainability has not come through strongly, nor has formal economics work. This likely points to the foci of the theme questions and the limitations of the sample. But we would argue it also points to the fact that understanding the economic impact of digital systems has a strong social and sociological element that needs to be explored. Overall, further work needs to be done to explore the specifically economic disciplinary issues that digital technologies engender. Within the context of this review we would argue, caveats concerning the representativeness of the data notwithstanding, that the workshops, Delphi results, and stakeholder input have defined the following key areas for future research:

1. The role and impact of major corporate digital platforms, including impacts on firms of digital platforms and the role of digital monopolies and large corporations.
2. The uptake and impacts of digital technologies in organizations, especially the impacts of automation and augmentation on work and the economy.
3. Tied to both key areas 1 and 2, forms of digital labor, including the impacts of digital labor on people's life experience, and the gig economy (linked to platforms).

Two of the key challenges that cut across these are finding new methods and tools to study the digital economy, and gaining access to data on the digital economy.

Note

1. As part of the review, the Digital Humanities Institute at the University of Sheffield applied concept modelling techniques to a curated corpus of 1,900 journal articles from the period 1968 to 2017. Concept modelling is a computational linguistic process that involves identifying the emergence of concepts, or key ideas, via lexical relationships. For the purposes of the review, lexical relationships were limited to high-frequency co-occurrences of terms as pairs and trios. The process is entirely data driven and resulted in 2 million rows of data. The website https://www.dhi.ac.uk/waysofbeingdigital/ provides access to the top 50 most frequently occurring pairs and trios through a series of data visualizations. Click on View Data Visualizations at the top. Then check/submit which of the seven ESRC domains you are interested in (including all). Then choose the visualization. These show configurations across selected time frames. Choose bubble chart, tree map, zoomable pack layout, or network diagram, by individual subject or by all seven subjects combined, by document or concept frequency. You can similarly search the analyzed documents (all, by subject, author, concept, concept trio, and year) by clicking on Browse Articles at the top. Also, see https://waysofbeingdigital.com/literature-analysis-interactive-results/ for interactive visualizations with mouse-overs of the main clusters of concepts within each domain and the relative frequency of concepts associated with each cluster.
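
To make the pair-counting idea concrete, the following Python fragment is a minimal sketch of concept-pair co-occurrence counting, not the Digital Humanities Institute's actual pipeline: the tiny corpus, crude tokenizer, and stop-word list are placeholder assumptions, whereas the review's corpus comprised roughly 1,900 curated journal articles.

```python
# Minimal sketch of concept-pair co-occurrence counting (illustrative only;
# not the Digital Humanities Institute's actual concept-modelling pipeline).
from collections import Counter
from itertools import combinations

# Placeholder corpus standing in for the ~1,900 curated journal articles.
documents = [
    "digital skills and the labour market in the information society",
    "platform economics and the digital economy of large corporations",
    "information skills and digital inclusion in the information society",
]

STOP_WORDS = {"and", "the", "in", "of"}  # assumed minimal stop-word list

def terms(doc):
    """Very crude tokenizer: lowercase words minus stop words."""
    return {w for w in doc.lower().split() if w not in STOP_WORDS}

pair_counts = Counter()
for doc in documents:
    # Count each unordered pair of distinct terms once per document.
    for pair in combinations(sorted(terms(doc)), 2):
        pair_counts[pair] += 1

# Print the most frequent pairs; the review reported the top 50 pairs and trios.
for (a, b), n in pair_counts.most_common(5):
    print(f"{a}/{b}: {n}")
```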

References

Borah, P. (2017). Emerging communication technology research: Theoretical and methodological variables in the last 16 years and future directions. New Media & Society, 19(4), 616–636.
Cartier, C., Castells, M., & Qiu, J. L. (2005). The information have-less: Inequality, mobility, and translocal networks in Chinese cities. Studies in Comparative International Development, 40(2), 9–34.
Castells, M. (2011). The rise of the network society. New York: John Wiley & Sons.
Contractor, N., Monge, P., & Leonardi, P. M. (2011). Network theory. Multidimensional networks and the dynamics of sociomateriality: Bringing technology inside the network. International Journal of Communication, 5, 39.
David, P. A., & Steinmueller, W. E. (1996). Standards, trade and competition in the emerging global information infrastructure environment. Telecommunications Policy, 20(10), 817–830.
Flyverbom, M., Leonardi, P., Stohl, C., & Stohl, M. (2016). Digital age. The management of visibilities in the digital age—introduction. International Journal of Communication, 10, 12.
Fuchs, C. (2016). Baidu, Weibo and Renren: The global political economy of social media in China. Asian Journal of Communication, 26(1), 14–41.
Helsper, E. J. (2012). A corresponding fields model for the links between social and digital exclusion. Communication Theory, 22(4), 403–426.
Ifinedo, P. (2011). Internet/e-business technologies acceptance in Canada's SMEs: An exploratory investigation. Internet Research, 21(3), 255–281.
LaRose, R., DeMaagd, K., Chew, H. E., Tsai, H. Y. S., Steinfield, C., Wildman, S. S., & Bauer, J. M. (2012). Broadband adoption. Measuring sustainable broadband adoption: An innovative approach to understanding broadband adoption and use. International Journal of Communication, 6, 25.
Rice, R. E. (Ed.). (2008). Media ownership: Research and regulation. Cresskill, NJ: Hampton Press.
Sarkar, M., Butler, B., & Steinfield, C. (1998). Cybermediaries in electronic marketspace: Toward theory building. Journal of Business Research, 41(3), 215–221.
Steinmueller, W. E. (2002). Knowledge-based economies and information and communication technologies. International Social Science Journal, 54(171), 141–153.
Van Deursen, A., & Van Dijk, J. (2010). Civil servants' internet skills: Are they ready for e-government? In M. A. Wimmer, H. Chappelet, M. Janssen, & H. J. Scholl (Eds.), International conference on electronic government. EGOV 2010 (pp. 132–143). Berlin, Heidelberg: Springer.
Van Deursen, A., & Van Dijk, J. (2011). Internet skills and the digital divide. New Media & Society, 13(6), 893–911.

Chapter 12

The Changing Nature of Knowledge and Service Work in the Age of Intelligent Machines

Crispin Coombs, Donald Hislop, Stanimira Taneva, and Sarah Barnard

Introduction

One of the most significant recent technological developments concerns the application of intelligent, interactive, and highly networked machines to jobs that up to now have been considered safe from automation. These "intelligent machines" are characterized by autonomy, the ability to learn, and the ability to interact with other systems and with humans. They draw on new advances in technologies such as artificial intelligence (AI) and robotics, enabling them to undertake tasks that could previously only be completed by human workers. We define and describe intelligent machines in detail in the following section. Referring to what some have called the second machine age, analysts and commentators have forecast mass unemployment from the automation of a wide range of predictable, repetitive job roles (Brynjolfsson & McAfee, 2016). What sets this change apart from previous technological revolutions, such as the automation of factory work in the 19th century, is the potential of intelligent machines to effect dramatic changes in the demand for skill-intensive, knowledge-based workers (Loebbecke & Picot, 2015). However, there is considerable debate regarding the likely impacts of intelligent machines on work. For example, Frey and Osborne (2017) suggest that as much as 47% of jobs in the United States economy could be eliminated by the widespread implementation of machine learning and mobile robotics over the next one to two decades. The Bank of England published a report in 2015 suggesting that almost half of United Kingdom jobs (about 15 million) could be lost to automation and AI technologies. By contrast, Arntz et al. (2016) found that only 9% of jobs were potentially automatable in Organization for Economic Co-operation and Development (OECD) economies.

A valuable source of guidance for understanding these developments is current academic knowledge. Indeed, there are a considerable number of AI- and robotics-related research contributions that consider the potential impacts of these new technologies. However, these contributions lie in a wide range of scholarly disciplines that draw on contrasting research paradigms, theories, methods, and perspectives. This presents business leaders, policymakers, and researchers with a messy environment that lacks a coherent overview of the current state of knowledge, key research gaps, and how researchers may proceed to fill these gaps. Therefore, the purpose of this chapter is to report the findings from a systematic review of the currently published academic literature around the key impacts of intelligent machines on work. In order to explore the transformational effects of intelligent machines (such as AI and robotics), rather than capturing technological applications that are relatively mature, such as robots in manufacturing contexts (Dorf & Kusiak, 1994; Khouja & Offodile, 1994), the review is focused upon service and knowledge work. Several authors have noted that service and knowledge work has traditionally been safe from automation (for example, compared to manufacturing) but have identified that recent intelligent machine developments now threaten to erode many of these jobs (Brynjolfsson & McAfee, 2016; Davenport & Kirby, 2016). Unlike the manufacturing sector, the service sector produces intangible goods spanning a wide range of services in a variety of areas, including finance and commerce, government, transportation, health care and social assistance, tourism, arts, entertainment, and science. The growing size and importance of the service (and knowledge) sector, in comparison to agriculture and manufacturing, is a trend that has been occurring since the late 1970s in most developed economies. This idea links to and builds from Daniel Bell's vision of an information/knowledge society that was initially articulated in the early 1970s (Bell, 1973). Knowledge work is formally defined as work that is intellectual, creative, and non-routine and that involves the use and creation of knowledge (Hislop, Bosua, & Helms, 2018), with such workers labelled "symbolic analysts" (Reich, 1991). The knowledge sector (which partially overlaps with the service sector) is generally associated with work involving a great deal of research and development activity and the creation of innovative products. In a broader sense, the knowledge sector may refer to professional areas such as information and communication, consulting, pharmacology, and education (Kuusisto & Meyer, 2003). Thus, from an occupational perspective, this chapter considers all forms of nonmanual work, including white-collar office and administrative work, service work, and what can be labelled knowledge work.

The chapter is organized as follows. First, the notion of intelligent machines and the different types of technologies that may be considered under this term are discussed.
The approach and procedures adopted to undertake the literature review are then explained. The subsequent section presents the findings of the review. An overview of the nature of the literature sample is provided, followed by a discussion of the three main themes that emerged: human relations with intelligent machines; adoption and acceptance of intelligent machines; and ethical issues associated with machine-human collaboration. The chapter concludes with a review of the key gaps in the existing literature and suggestions for future research directions.

What Are Intelligent Machines (Artificial Intelligence and Robotics)?

Burkhard (2013) observed that it is difficult to define intelligent machines because there are no universal definitions of natural (animal, especially human) intelligence. Machines may be better at some tasks that can be described as intelligent behavior, such as translating text across a wide range of languages, but the quality of the translations is (so far) lower than that of human translations. Further, machines do not understand the meaning of the words they translate; they use statistical calculations to determine the most likely suitable alternative word (Friend, 2018). However, recent advances in technologies have meant that these machines are increasingly undertaking tasks that were previously performed by humans. Advances in two main types of technology have largely driven these developments: artificial intelligence (including machine learning and cognitive computing) and robotics (including service robots, robot-assisted procedures, and robotic process automation). Thus, our review focuses on these two technologies.

Artificial Intelligence

Several authors have acknowledged that it is difficult to define AI (DeCanio, 2016). For example, it is possible to make a distinction between strong AI (or Artificial General Intelligence) and weak AI (or Artificial Narrow Intelligence; Bostrom & Yudkowsky, 2011). Strong AI implies a system that has superhuman intelligence and at present remains a fictional aspiration. Weak AI describes AI in terms of being able to complete specific tasks that require single human capabilities such as visual perception or probabilistic reasoning. In these tasks, AI can considerably outperform human capabilities. However, AI remains unable to make ethical decisions or manage social situations. In other words, weak AI refers to the ability to complete the specific tasks that humans do rather than replicating the way humans actually think (Hengstler, Enkel, & Duelli, 2016). Despite these complexities, several authors have proposed definitions of AI. AI has been defined as the development of computers to engage in human-like thought processes such as learning, reasoning, and self-correction (Dilsizian & Siegel, 2014). Building on the cognitive aspect, DeCanio (2016, p. 280) describes AI as a "broad suite of technologies that can match or surpass human capabilities, particularly those involving cognition." Niu et al. (2016, p. 2) add that AI "aims to understand the essence of intelligence and design intelligent machines that can act as human behavior." Others have emphasized the superiority of human intelligence over AI. For example, the computer scientist Larry Tesler described human intelligence as "whatever machines haven't done yet" (Friend, 2018). All these definitions highlight the role of AI in modelling human behavior and thought, but do not go as far as to talk about using AI technologies to build other smart technologies. AI may be presented in various forms such as natural language processing, affective computing systems, virtual reality (avatars), or humanoid and non-humanoid robots (e.g., Luxton, 2014). Johnson (2014) introduces the term "artificial agent" (AA), which refers generally to computational devices performing tasks on behalf of humans autonomously (i.e., without immediate, direct control or intervention from humans). Some AAs are software programs (e.g., bots undertaking Internet searches). A more advanced example of such a system is Robotic Process Automation (RPA), a software solution (essentially a software license) configured to do administrative work previously undertaken by humans. RPA is suited to automating a process in which a human takes in many electronic data inputs, processes these data using rules, adds data, and then enters this new information into another system, such as an enterprise or customer relationship management system (Willcocks, Lacity, & Craig, 2015).

Robots A traditional view of robots that would be familiar to popular culture concerns service robots. Those are robots that provide assistance to a human to complete a physical task, such as scrubbing, cleaning, sorting, packaging instruments, and sending them for sterilization for dentists (Chen, 2013); helping an elderly person pour a liquid (Xu, Tu, He, Tan, & Fang, 2013); providing an intelligent interactive assistant for an office environment (Wang et al., 2013), or serving meals in a restaurant (Yu et al., 2012). The goal of these robots is to provide autonomous assistance to humans in undertaking these tasks but without the need for specific human guidance. By contrast, robot-assisted surgery concerns the use of a human controlled robot to perform surgical procedures that result in less invasive procedures than those undertaken by human surgeons alone. The robotic system (for example the Da Vinci robotic system) provides a three-dimensional view, hand-tremor filtering, fine dexterity, and motion scaling and are suitable for narrow, inaccessible operative areas (Zaghloul & Mahmoud, 2016). Moreover, some robots involve a human-machine interaction that resembles the interaction between humans. These are robots that are no longer confined to factories but are specifically designed to interact with people in urban contexts. They are referred to as “social robots” (Torras, 2015). Social robots may replace receptionists or shop assistants in shopping malls, interact with elderly people or clinical patients, and even act as

support teachers and nannies (e.g., Calo et al., 2011; Torras, 2015). The most recent developments in robotics are demonstrated by the appearance of humanoid robots. Thus, the notion of a "robot" is complex and heterogeneous, encompassing physical robots that autonomously perform single or multiple tasks (such as a robot waiter), physical robotics used to extend human capabilities in terms of precision and micro-control but not acting autonomously (such as robot-assisted surgery), and social robots providing social, emotional, and informational support.

Literature Review Methods

We followed a rapid review approach outlined by Khangura et al. (2012), comprising a systematic literature search, screening and selection of studies, thematic synthesis of included studies, and production of a report. The four databases used to identify relevant academic studies were Scopus, Business Source Complete, PsycINFO, and Web of Science. Two types of search terms were used in combination: those related to the types of technology/change we were interested in examining, and those related to the effects/impacts of these technologies/changes. The initial technology/change terms were: artificial intelligence, smart machines, cognitive computing, automation of knowledge work, and automation of service work. The search was focused on these terms because the review concentrated on the use of advanced/contemporary developments in IT and computing in relation to the computerization and automation of knowledge and service work. These search terms were used in combination with other search terms related to the type of impact/effect that we were interested in examining. These impacts fell into four broad areas: impacts on organizations, impacts on workers, impacts on society, and ethical implications. The impact search terms included innovation, business value, quality of working life, productivity, employment, social impact, autonomy, collaboration, human computer interaction, service work, knowledge work, adoption, and implementation. After exploratory searches and research, the search terms were extended to include robotic process automation, robot*/knowledge work, and robot*/service work. In all four databases, each technology term was combined individually with each impact term.

The results from these searches were filtered to extract only peer-reviewed articles or conference papers, published from January 2011 onwards, in English and with full text available. These searches identified 1,581 possible items for inclusion. The titles and abstracts of all 1,581 items were reviewed. Items were excluded if they were purely technical papers concerned with engineering and design issues related to the technologies examined, or if they were not focused on the application of the selected technologies in the context of service and knowledge work (i.e., studies focused purely on manufacturing were excluded). While reviewing the items identified via the primary searches, a number of secondary items were identified for inclusion in the study population. These were identified primarily via the abstracts and reference lists of the primary search items, where additional, widely cited sources were identified.
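As an illustration of how the term pairing described above scales across the four databases, the short sketch below generates the full set of technology-by-impact query strings. It assumes a generic Boolean query syntax rather than the exact syntax of Scopus, Business Source Complete, PsycINFO, or Web of Science.

```python
# Illustrative generation of the paired search strings described above.
# The Boolean syntax is a generic approximation, not the exact query
# language of any of the four databases used in the review.
from itertools import product

technology_terms = [
    '"artificial intelligence"', '"smart machines"', '"cognitive computing"',
    '"automation of knowledge work"', '"automation of service work"',
    '"robotic process automation"', 'robot* AND "knowledge work"',
    'robot* AND "service work"',
]
impact_terms = [
    "innovation", '"business value"', '"quality of working life"',
    "productivity", "employment", '"social impact"', "autonomy",
    "collaboration", '"human computer interaction"', '"service work"',
    '"knowledge work"', "adoption", "implementation",
]

queries = [f"({tech}) AND ({impact})"
           for tech, impact in product(technology_terms, impact_terms)]

# 8 technology/change terms x 13 impact terms = 104 combinations per database.
print(len(queries), "combined queries, e.g.:", queries[0])
```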

After these additional steps were completed, the total number of sources identified for review and in-depth coding was 219. The thematic synthesis was undertaken by the whole project team, with each team member allocated a roughly equal proportion of papers to read. The synthesis involved the creation of a standardized summary for each source that identified the year of publication, whether it was a journal article or conference paper, the context of the research, technology type, level of analysis (work practice, organizational, or societal), research method, topic areas, and key findings. For the purposes of this chapter, we used the level-of-analysis categorization to extract items that focused on intelligent machines at the work practice level, resulting in a subsample of 84 publications. (Only those cited in this chapter are included in the References section; the Appendix lists all 84 references.) During the in-depth coding, we identified several topic areas related to the impacts of intelligent machines on (service or knowledge) work. We discussed each of these topic areas and classified them into three broad categories: human relations with intelligent machines, adoption and acceptance of intelligent machines, and ethical issues associated with machine-human collaboration.

Before presenting the findings, it is useful to give an overview of the analyzed sample. Peer-reviewed papers made up 79% of the sample, conference papers constituted 19%, and the remaining 2% of sources were working papers. Just over half (54%) of the sources were based on empirical studies, with the remainder either narrative discussions of selected literature and conceptual papers (31%) or thought-leading articles (13%). The embryonic nature of knowledge on the issues examined here is further reinforced by the methods used in the empirical studies: the most common empirical method (35%) was the "proof of concept" experiment, with case studies and survey research accounting for 25% and 23%, respectively. Much of the reviewed research was undertaken in the Sciences, with Engineering and Technology (18%), Medicine, Dentistry, and Allied Health (15%), Computer Science (14%), and Behavioral Sciences (13%) contributing 60% of the sources. The Social Sciences contributed a more modest 32% of the research literature, suggesting that current studies have been techno-centric in their focus and that a wider social-centric view is presently lacking. The following sections discuss the three main themes that emerged.
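For concreteness, the following sketch imitates the extraction step just described using a hypothetical record structure for the standardized source summaries (the field names are ours, not the project's actual coding instrument): it filters the coded sources to the work-practice level of analysis and tallies the share of each publication type.

```python
# Hypothetical illustration of the coding-summary step: filter coded sources
# to the work-practice level of analysis and tally publication-type shares.
# The record fields are assumed for illustration only.
from collections import Counter

def work_practice_summary(sources):
    subset = [s for s in sources if s["level_of_analysis"] == "work practice"]
    counts = Counter(s["publication_type"] for s in subset)
    total = len(subset)
    shares = {ptype: round(100 * n / total, 1) for ptype, n in counts.items()}
    return total, shares

# Toy records standing in for the 219 standardized source summaries.
coded_sources = [
    {"publication_type": "journal article", "level_of_analysis": "work practice"},
    {"publication_type": "conference paper", "level_of_analysis": "work practice"},
    {"publication_type": "journal article", "level_of_analysis": "societal"},
]
print(work_practice_summary(coded_sources))
# -> (2, {'journal article': 50.0, 'conference paper': 50.0})
```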

Changing Human Relations with Intelligent Machines

Human-Robot Interaction

Several studies have documented examples of humans using robots to complement and, in some cases, extend their abilities to complete specific social interaction tasks. The majority of these studies have been undertaken in health or social care settings.

For example, Huijnen, Lexis, Jansens, and de Witte (2016) discuss the use of humanoid robots to interact with children with autism spectrum disorders (ASD). Following a systematic review of the literature and focus groups with 53 ASD professionals, they report that a range of different humanoid robots had been found to be an effective aid for supporting health care professionals' interviews with children with ASD, because the robot provides more predictable and clearly defined cues than human-to-human interaction. Huijnen et al. (2016) observe that the most common use of the robot was through remote control, in which an ASD professional operates the robot's behavior, rather than the robot acting fully autonomously. The ASD professional is therefore needed to read the social situation with the child and control the robot accordingly. Interestingly, Huijnen et al. (2016) add that this approach also creates an additional workload for the professional, and that additional technical personnel are often required to operate the robot.

Khosla et al. (2013) reported on three field trials of Matilda, a human-like affective communication (service and companion) robot, in care homes for the elderly in Australia. The robot combines human communication tools (e.g., speech recognition) with artificial intelligence programs (e.g., an emotionally intelligent, persuasive, diet-suggestion dialog system). They found that the robot had the potential to increase the capacity of care homes to provide care and also to improve the well-being of the elderly. For example, the elderly residents were keen for Matilda to participate in group activities and play games like Bingo and Hoy with them. Normally, a care worker would be required to call the numbers for these games; however, Khosla et al. (2013) report that the residents did not miss the caregiver who would normally have led the game. The researchers also refer to one of the residents performing a spontaneous clap and dance after winning a game as evidence of improved well-being. However, although a care worker is no longer needed to perform the bingo-calling task, caregivers are freed to complete additional care tasks, as well as deciding when to introduce and remove Matilda from the care environment and monitoring the interaction between the elderly residents and the robot.

A further example concerns the application of robotics to undertake particular surgical procedures. In this case, the robot assists the surgeon to complete manipulation and mobility tasks in a remote physical environment in correspondence with continuous control movements by the remote human (Sheridan, 2016). In a five-year study of 116 children, De Benedictis et al. (2017) found that robotic surgery (the application of the ROSA device, or Robotized Stereotactic Assistant) in pediatric neurosurgery improves safety and reduces the intrusiveness of procedures. The ROSA system is composed of a compact robotic arm and a touch screen, mounted on a mobile trolley, for surgical procedures involving the head of the patient. The surgeon can either supervise as the robot performs autonomously or directly control the surgical instrument during the procedure. The ROSA system combined human decision making with the accuracy of machine technology by improving the ergonomics, visualization, and haptic ability of the surgeon.
However, again, the example illustrates that the surgeon works alongside the robot, either supervising or controlling the robot, rather than being replaced by the machine.


Human-Robot Hybrid Teams

An interesting vein of research focuses on how workers collaborate with advanced robots in hybrid robot/worker teams. Schwartz, Krieger, and Zinnikus (2016) describe the conceptual organization of a hybrid team consisting of humans, robots, virtual characters, and softbots that combine artificial intelligence and robotics. They believe that a key challenge in establishing such hybrid teams is creating intuitive interfaces between humans, who typically use speech, gestures, and facial expressions to transfer information, and intelligent machines, which use data streams to communicate with the system and other artificial team members. They add that the development of (robotic) team competencies is also necessary to determine a suitable balance between the autonomous behaviors of individual machines and coordinated teamwork. In experiments, Gombolay et al. (2015) found that when people work with robots they may actually allocate more work to themselves than to their robot co-worker because of their preferences for completing particular tasks, such as assembling rather than fetching. It was also found that people attribute greater value to human team members than to robot team members. However, greater robot autonomy positively affected the participants' desire to work with the robot again. In an experimental study, Mubin et al. (2014) investigated the role of a robot assistant in office meetings. They constructed a hypothetical scenario of selecting a suitable job candidate, with human subjects acting as members of a selection panel tasked with reaching consensus on the most suitable candidate. The robot assistant was remotely controlled and was either dynamic and interactive (e.g., reminding the subjects that success would lie in sharing information) or passive (e.g., the robot would only interact when requested by the human subjects). Mubin et al. (2014) found that the human subjects preferred the more interactive robot as a partner in meetings compared to the passive robot, but also that the human subjects interacted with each other more than they did with the robot. They concluded that humans might be willing to engage and interact with, and even receive guidance from, a robot in the form of an active assistant, but not as a replacement for a human partner.

In sum, these studies provide several examples of AI and robots working in collaboration with humans to enhance the working practices of knowledge and service workers. These intelligent machines appear to be assisting and augmenting existing work practices, in some cases replacing a small, routine, and repetitive task: for example, the ASD professional using a robot as an advanced form of ventriloquist's dummy to interview child patients, the care worker no longer acting as bingo caller, and the surgeon being able to perform more precise surgical procedures. In these situations, the intelligent machine appears to be seen as a helpful additional aid to complete tasks in knowledge and service work. The intelligent machine is welcome in teams when the working circumstances allow people to take on a proportionate amount of work in line with their task preferences (Gombolay et al., 2015; Schwartz et al., 2016), and when it does not add extra responsibility, such as monitoring the robot's work, to the human team members.


Adoption and Acceptance of Intelligent Machines

Following on from the exploration of human relations with robots, the adoption and acceptance of intelligent machines in practice has been researched most extensively in the health care and transport sectors. It has been suggested that the logistics of using robotic surgery, the investment of time, and the storage of bulky equipment may influence the adoption of the technologies, especially as robotic surgery is deemed more expensive to run (Sananès et al., 2011). Sananès et al. (2011) argue that when operating theatres are dedicated to robotic surgery, some of these logistical problems will no longer be an issue. In other cases, the technology requires less physical management. For example, Robotic Process Automation (RPA) is a software solution (essentially a software license) configured to do the work previously undertaken by humans, for example, structured tasks associated with validating the sale of insurance premiums, generating utility bills, creating news stories, paying health care insurance claims, and keeping employee records up to date (Willcocks et al., 2015). Other robots have rather more modest aspirations: Nielsen et al. (2016) found that robots used to perform mundane tasks such as vacuuming were well received by managers in care home settings, mainly within the context of trying to modernize care of the elderly. The vacuum cleaners were viewed by managers as affordable and effective. However, clients held mixed views towards robot vacuum cleaning: some were not happy with the quality of cleaning or the reduction of contact with staff, whilst others enjoyed the "on-demand" nature of vacuuming.

Trust in AI and Robots

Over and above the practical issues of adoption, a clear factor in the acceptance of AI and robots is trust. Trust in the technology was reported as important for air traffic controllers' willingness to accept increased levels of automation in two hypothetical scenarios (Bekier, Molesworth, & Williamson, 2011), and trust was also identified as important for the human acceptance of AI-enabled autonomous cars (Hengstler et al., 2016). The findings of a study by Kolbjørnsrud et al. (2017) show that managers have mixed feelings about AI, and that top managers are more enthusiastic than mid-level or front-line managers. When asked if they were comfortable with AI monitoring and evaluating their work, participants' responses again became more negative lower in the management hierarchy. Kolbjørnsrud et al. (2017) hint that this may be due to apprehension about the threat of job losses as a result of AI implementation, although the study does not explicitly explore what underpins these differences. It is possible to distinguish between fostering trust and enhancing confidence, as suggested by Pieters' (2011) study on cyber security and AI. There, system users' trust was fostered through explanations about the security and processes of the system (thus requiring an opening of the "black box"), and

confidence was enhanced through explanations about the validity of the decisions themselves (here the "black box" can remain closed). Cultural differences in attitudes towards AI were also noted, with managers in emerging economies (e.g., India, China, and Brazil) being more open to the technology (Kolbjørnsrud et al., 2017). Some suggest that, where appropriate, unions should be involved in consultations regarding the implementation of AI and robots. Further, the variability of the manual system and work practices should be fully understood by the integrator before the automated system is implemented (Charalambous, Fletcher, & Webb, 2015). It is also advised that, in order to enhance trust in these new technologies, more support should be given to employees as their roles change from being workers to becoming supervisors of automated processes (Charalambous et al., 2015).

In sum, the findings of the literature in this area suggest that to facilitate the adoption and acceptance of intelligent machines it is necessary to create a suitable workplace environment, in terms of physical configuration and design. Top managers wishing to adopt intelligent machines may need to convince less senior managers that the implementation of AI or robotics will lead to positive change. In particular, less senior managers are likely to be concerned about worker fears regarding the loss of tasks and ultimately jobs, or about role changes. For example, service workers such as cleaners may see the introduction of robot vacuum cleaners as a threat to their long-term job security, or may be apprehensive about new role expectations of having responsibility for checking and monitoring robot vacuum cleaner performance. There may also be a need for managers to provide support to knowledge workers to adjust to working with, and following decision support provided by, intelligent machines, such as supporting air traffic controllers' development of trust in AI decision making for choosing aircraft flight patterns. Again, changes in work roles and responsibilities may be a crucial area to be agreed, particularly regarding critical task outcomes. If a new AI system for aircraft flight control recommends an incorrect decision that the air traffic controller implements, where does the responsibility for this decision reside? These types of ethical issues are considered in the following section.

Ethical Issues Associated with Machine-Human Collaboration

Intelligent machines are already present in many areas of our society (Friend, 2018) and will play an increasing role in our work and overall lives in the future. The more advanced intelligent machines become (e.g., more human-like androids), the more blurred the physical, psychological, and social boundaries between machines and humans will be. For example, robots will be "looking" after clinical patients, educating students, and making complex financial or security decisions (e.g., Luxton, 2014; Torras, 2015). While the experienced and anticipated benefits of these technologies for

individuals, organizations, and societies are apparent (e.g., Calo et al., 2011; Luxton, 2014), rapid technological developments in this area may also pose some serious risks. For example, using simulations for patients with delusional or psychotic psychopathologies in the absence of careful monitoring may put the health of these patients at great risk (Luxton, 2014). Torras (2015) warns about the potential negative impacts of robot nannies on children's psychological development. For instance, how could a robot achieve a balance between protecting a child from danger and restricting his or her freedom (hence affecting the child's development into a mature and autonomous person)? Such evolving interactions between machines and humans are psychologically complex and evoke important ethical questions. Thus, a robust ethical strategy that ensures the safe use of advanced technologies becomes an imperative (e.g., Luxton, 2014; Torras, 2015). The following paragraphs present two key emerging themes associated with intelligent machine-related ethical issues in a work context: safety and risks during human-machine relations, and responsibility and accountability for intelligent machines.

Safety and Risks during Human-Machine Relations

Luxton (2014) hypothesizes a number of ethical issues related to artificial intelligence care providers (AICPs) in mental health and in care professions (e.g., medicine, nursing, social work, education, and ministry) more generally. Most of these concern the safety of human-machine interaction. AICPs may exist in various forms and interact with users (e.g., patients) in different ways. For instance, AICPs may be avatars (virtual simulations), social robots (either humanoid or non-humanoid), or non-embodied systems (e.g., audio simulations). Many current "caring" machines are designed to "read" emotions and behavioral signals, and even to simulate emotions and empathetic understanding. Thus, the boundaries between humans and machines may become less obvious and in some extreme cases lead to "Turing Deceptions" (i.e., the inability of a human to determine whether he or she is interacting with a machine or not). This could be a significant ethical issue, especially in situations involving vulnerable people such as children or clinical patients (Bryson, 2016). For example, Weizenbaum (cited in Luxton, 2014) found that even when patients who interacted with an AI-simulated psychotherapist knew that it was just software, they still considered it a real therapist. A further illustration of such a situation is the case of Paro, a robotic baby seal used for therapeutic purposes with patients with mid- and advanced dementia (Calo et al., 2011). Paro is intended to be a replacement for social interaction with people or animals and is labelled a Class 2 medical device by the U.S. Food and Drug Administration. Practically, Paro is considered a type of non-medication antidepressant. Calo et al. (2011) argue that despite some hypothesized risks (e.g., of evoking an empathetic response in patients who are deceived by the robot's appearance), Paro can be highly beneficial for patients' health if used appropriately and competently. Hence, the related ethical issue here is not so much about whether, but how, to use intelligent machines.

"Healthy" machine-user interactions can also be secured through transparent information about a robot's characteristics, as well as through limiting an intelligent machine's capabilities to a specific context (Bostrom & Yudkowsky, 2011; Bryson, 2016; Kinne & Stojanov, 2014). With regard to transparency of information, Bostrom and Yudkowsky (2011) suggest that "it will become increasingly important to develop AI algorithms that are not just powerful and scalable, but also transparent to inspection" (p. 1). More recent publications (e.g., Bryson, 2016) report the development of sets of guidelines for designers and users of intelligent machines. These guidelines (also known as the "Principles of Robotics") emphasize the need for transparency, which Bryson explains as "…clear, generally comprehensible descriptions of their [robots] goals should be available to any owner, operator, or other concerned party" (Bryson, 2016, p. 205). Moreover, Kinne and Stojanov (2014) discuss the ethical issues associated with using Lethal Autonomous Weapon Systems (LAWS) and emphasize the importance of the specific context. For example, there might be situations in which an intelligent machine is superior in its ethical behavior to human ethical judgement. Unlike humans, machines in a given context would not be susceptible to emotions, which can risk compromising ethical decisions. Notably, in order to socially accept and properly utilize human-machine interaction, humans should be aware of how, and within what boundaries, to interact and collaborate with intelligent systems.

Responsibility and Accountability for Intelligent Machines

Luxton (2014) emphasizes the importance of the competency levels of AICP users for avoiding putting patients at risk. Competency refers to both the design and the ethical use of intelligent machines. The increased complexity of AI systems causes greater difficulty in the prediction and interpretation of machine behaviors and therefore presents higher risks for human safety (e.g., Friend, 2018). Also, with the evolution of intelligent machines, the boundaries between the roles of humans and machines may become less clear and, therefore, more difficult to manage (Bostrom & Yudkowsky, 2011; Johnson, 2014). In addition, when large numbers of people have been involved in the design and use of intelligent machines, it is not always obvious who the responsible individuals are. Examples in this area come from a variety of sectors, including scenarios about the use of robotic health care assistants, autonomous vehicles, and AI in banking and commerce (e.g., Luxton, 2014; Torras, 2015). Both scientists and practitioners have vigorously argued about who should take responsibility, and at what point, for the (potential) negative consequences of the applications of intelligent machines (e.g., Johnson, 2014). One point upon which most authors agree is that the ultimate responsibility should lie with the human stakeholders, that is, machine designers, manufacturers, implementers, and users (e.g., Luxton, 2014).

In sum, the literature suggests that the transition of intelligent machines into the domain of knowledge and service work, a domain that had been solely the purview of humans, may present a number of ethical challenges, such as avoiding the creation of Turing

Deceptions at the work practice level. The studies provide examples of situations where such problems arise, with human patients anthropomorphizing AI-simulated psychotherapists or therapy robots such as Paro. Researchers argue that it will be important to focus on the transparency of information in intelligent machines and on whether a human or a machine is making the decisions. This transparency also has important implications for responsibility and accountability debates regarding knowledge and service work. As knowledge and service work becomes more augmented by intelligent machines and the boundaries of tasks and roles blur, decisions about responsibility and accountability will become even more complex. The previous sections have described three key themes that emerged from the literature review regarding how intelligent machines may change knowledge and service work. In the following section we present an agenda for research in these areas.

Agenda for Future Research

The broad review of the literature related to recent developments in intelligent machines and their potential impact on service and knowledge work practices grounds the following agenda for future research. First, we discuss cross-cutting requirements for future multi-disciplinary, context-sensitive empirical research related to intelligent machines' impacts on knowledge and service work. Second, we consider the research priorities for each of the three themes that emerged from the literature review.

Cross-cutting Requirements

Multi-disciplinary research. First, the emerging notion of intelligent machines is multifaceted. It is associated with a variety of academic subjects and complex sociotechnical systems. For example, our review reveals that researchers have investigated the ethical issues associated with AI and robots from Computer Science (Bryson, 2016), Engineering (Johnson, 2014), Robotics and Industrial Informatics (Torras, 2015), and Philosophy (Michelfelder, 2011) perspectives. Hence, the topic is best studied through a multi-disciplinary approach that is focused upon multiple stakeholders (e.g., the human designers, manufacturers, and users of machines, policymakers, regulators, and the intelligent machines themselves) and accounts for a wide range of personal, social, technical, legal, and environmental factors.

Contextual focus. Although the importance of studying intelligent machines in specific contexts has been acknowledged, only a small number of recent studies have attempted to address organizational or work-specific topics (e.g., Dogan et al., 2016; Kinne & Stojanov, 2014; Luxton, 2014). Most of the published literature refers to general issues associated with intelligent machines and future-oriented scenarios. Future research should aim to capture work-specific themes along with key general issues and,

thus, ensure a more in-depth knowledge of both the concept and the practical manifestations of intelligent machines. For example, linking back to the theme of changing human relations with intelligent machines, the speculative and experimental literature on human-robot dynamics in hybrid teams (e.g., Schwartz et al., 2016; Gombolay et al., 2015) raises questions about how humans team and co-work with intelligent machines, particularly in relation to decision-making authority over scheduling. While there is an extensive body of literature on human-computer interaction, future research needs to examine the nature of this dynamic in rich detail, in ways that take account of ongoing technological developments that allow AI systems to communicate via increasingly human-like speech and other modalities. The more human-like these systems become, the greater the implications for human-technology dynamics and interactions.

Empirical research. While this review has focused on empirically grounded analysis, the majority of the current academic literature on these issues proposes theoretical models or laboratory proof-of-concept experiments with intelligent machines rather than offering empirical evidence from real-world settings. In the future, research should move on to more empirical exploration of context-specific issues. Given the complex nature of the topic, we recommend a mixed-method approach involving the use of both qualitative and quantitative research designs (e.g., experiments, real-time measurements, stakeholder surveys, focus groups, case studies, system-collected usage data). For example, the theme on the adoption and acceptance of these technologies highlighted the importance of human/user trust in technology for its implementation and use to be effective. Arguably, a longitudinal, mixed-methods, case study-based approach has the ability to capture data on how the attitudes and trust levels of different stakeholders (managers, IT staff, users, etc.) dynamically evolve during the implementation and use of AI systems.

Research Priorities for the Three Themes

Investigating changing human relations with intelligent machines. It is clear from the literature that the nature of the relationship between humans and intelligent machines is changing. Studies suggest that the social aspect of human-machine interaction is an important mediating (and moderating) factor for the successful realization of the benefits of automation. For example, the literature on the use of robots in the provision of care for the elderly and those in care homes (e.g., Metzler, Lewis, & Pope, 2016; Nielsen et al., 2016) raises questions about how the success of such technologies will be shaped by factors such as user attitudes, and by the extent to which robots and AI are perceived as able to provide the type of emotional support and care currently provided by human nurses and care staff. Thus, further research that examines mediating factors of human-machine relations, such as user perceptions of AI's ability to provide emotional care and support, would provide a useful context for interpreting how quickly we are likely to embrace these new technologies, how to foster more positive outcomes, and how to prevent or mitigate more negative ones.

We suggest that in-depth empirical studies drawing on ethnographic methods and real-life case studies (rather than experimental settings) would offer crucial insights into the relationships between robots and humans in the workplace, and may uncover interesting examples of how intelligent machines are assimilated and/or subverted in practice. Being sensitive to the idea of "subversion via practice" is important, as the way any technology is used and appropriated is often different from how it was designed, with user adaptation having significant implications for technology use (Beaudry & Pinsonneault, 2005). Thus, the implementation and use of any form of AI or advanced robotics needs to take account of this, which can only be done via in-depth qualitative studies that are sensitive to the micro-level subtleties of user behavior and intention. Aiming to include a range of case studies from different countries, sectors, and organization sizes would also enable research to begin unpacking some of the contextual factors that have only been hinted at so far.

Investigating the adoption and acceptance of intelligent machines. Much of the research in this review discusses intelligent machines in terms of complementing and extending human capabilities rather than removing humans from work processes. The concept of augmenting humans and human work in a range of ways, rather than wholesale replacement through robotized job automation, flows through the literature across a range of domains (Davenport & Kirby, 2016). However, future research needs to better account for the "multi-layered" nature of the work that humans carry out and for where automation fits. For example, the case of robot-assisted surgery (e.g., De Benedictis et al., 2017; Sananès et al., 2011) represents an important and interesting context in which AI and robotics are augmenting the work of surgeons. Future research needs to examine, in a fine-grained way, the diverse ways in which surgical work is changed, where some aspects, roles, or tasks may remain unchanged while others are radically transformed. To gain a more accurate understanding of these issues, a large-scale cross-country survey study of experiences with implementation, trust issues, and feelings of confidence might be a suitable research strategy. The inclusion of participant employment information with regard to contract type, level, and job role would also help to establish how users' positions in organizational hierarchies shape their experiences with innovative technologies in the workplace.

Investigating intelligent machine-related ethical issues. The review highlights that some key ethical issues related to intelligent machines, such as safety, accountability, and liability, need further attention (e.g., Bryson, 2016; Johnson, 2014; Luxton, 2014; Yampolskiy & Fox, 2013). For instance, further research is needed on whether AI can or should be afforded moral agency or patiency (Bryson, 2016); how the responsibility arrangements for intelligent machines will be negotiated and worded as the technology is developed, tested, put into operation, and used (Johnson, 2014); who should be held responsible in a complex sociotechnical system with multiple human stakeholders (Luxton, 2014); and how, for the sake of human safety, the development and testing of advanced AI can be confined to a highly controlled environment (e.g., via a formalized confinement protocol) and thus directed by the human designers of the machines in accordance with

the latest developments in machine ethics (Yampolskiy & Fox, 2013). Further, the current literature presents only a few examples of early attempts to create AI-related legal and policymaking frameworks (Bryson, 2016). Zeng (2015) highlights that current legislation refers mostly to low-tech systems, leaving advanced AI systems unregulated. Consequently, the legal and policymaking approaches to AI ethics are reactive (i.e., triggered sporadically by accidents that occur) rather than holistic (i.e., generally preventative; Ambrose, 2014). Further research that examines how legal and policy decisions are debated, agreed, and implemented is needed to understand how different societies are responding to the challenges and opportunities that intelligent machines present for knowledge and service work. Research that compares the emerging policy responses and regulatory systems proposed by different national governments may provide valuable insights regarding ethical concerns related to the adoption of intelligent machines.

Conclusion

The evidence so far, such as it is, suggests that intelligent machines (here, AI and robots) are augmenting what people are doing and enabling some degree of role expansion for employees. Key questions remain open and require further analysis based on evidence of how intelligent machines are being developed and implemented in practice, and of how the workers and other humans interacting with these machines experience these changes. However, it is important to keep in mind that workers, organizations, governments, and society have the power to shape the future use of these new technologies. The future is malleable, but it is up to us to be proactive in shaping it.

Acknowledgments

Funding Acknowledgement and Disclaimer: The Chartered Institute of Personnel and Development (CIPD) funded the initial data collection for this study. The views expressed are those of the authors and not necessarily those of the CIPD.

References

Ambrose, M. L. (2014). The law and the loop. In IEEE international symposium on ethics in science, technology and engineering, ETHICS 2014. Chicago. http://doi.org/10.1109/ETHICS.2014.6893374
Arntz, M., Gregory, T., & Zierahn, U. (2016). The risk of automation for jobs in OECD countries: A comparative analysis (No. 189). Retrieved from www.oecd.org/els/workingpapers
Beaudry, A., & Pinsonneault, A. (2005). Understanding user responses to information technology: A coping model of user adaptation. MIS Quarterly, 29(3), 493–524.
Bekier, M., Molesworth, B. R., & Williamson, A. (2011). Defining the drivers for accepting decision making automation in air traffic management. Ergonomics, 54(4), 347–356. http://doi.org/10.1080/00140139.2011.558635

Bell, D. (1973). The coming of post-industrial society. Harmondsworth: Penguin.
Bostrom, N., & Yudkowsky, E. (2011). The ethics of artificial intelligence. In K. Frankish & W. Ramsey (Eds.), Cambridge handbook of artificial intelligence (pp. 1–20). New York: Cambridge University Press. http://doi.org/10.1016/j.mpmed.2010.10.008
Brynjolfsson, E., & McAfee, A. (2016). The second machine age: Work, progress, and prosperity in a time of brilliant technologies. New York and London: WW Norton & Co.
Bryson, J. J. (2016). Patiency is not a virtue: Intelligent artefacts and the design of ethical systems. In Association for the Advancement of Artificial Intelligence (pp. 1–18). Phoenix. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.299.5725
Burkhard, H.-D. (2013). Let the machines do. How intelligent is artificial intelligence? In 36th international convention on information and communication technology, electronics and microelectronics (MIPRO) (pp. 947–952). Opatija.
Calo, C. J., Hunt-Bull, N., Lewis, L., & Metzler, T. (2011). Ethical implications of using the Paro robot with a focus on dementia patient care. In Association for the Advancement of Artificial Intelligence workshop (Vol. WS-11-12, pp. 20–24). San Francisco, CA. Retrieved from http://www.scopus.com/inward/record.url?eid=2-s2.0-80054913481&partnerID=40&md5=cf9ad1b74f81299aa3f48be4f8e9700f
Charalambous, G., Fletcher, S., & Webb, P. (2015). Identifying the key organisational human factors for introducing human-robot collaboration in industry: An exploratory study. International Journal of Advanced Manufacturing Technology, 81(9–12), 2143–2155. http://doi.org/10.1007/s00170-015-7335-4
Chen, L. (2013). The application of robots and eye tracking devices in a general dentist's clinic. In IEEE third international conference on consumer electronics (pp. 5–7). Berlin.
Davenport, T. H., & Kirby, J. (2016). Only humans need apply: Winners and losers in the age of smart machines. New York: Harper Business.
De Benedictis, A., Trezza, A., Carai, A., Genovese, E., Procaccini, E., Messina, R., . . . Marras, C. E. (2017). Robot-assisted procedures in pediatric neurosurgery. Neurosurgical Focus, 42(5), 1–12. http://doi.org/10.3171/2017.2.FOCUS16579
DeCanio, S. J. (2016). Robots and humans—complements or substitutes? Journal of Macroeconomics, 49, 280–291. http://doi.org/10.1016/j.jmacro.2016.08.003
Dilsizian, S. E., & Siegel, E. L. (2014). Artificial intelligence in medicine and cardiac imaging: Harnessing big data and advanced computing to provide personalized medical diagnosis and treatment. Current Cardiology Reports, 16(441), 1–8. http://doi.org/10.1007/s11886-013-0441-8
Dogan, E., Chatila, R., Chauvier, S., & Evans, K. (2016). Ethics in the design of automated vehicles: The AVEthics project. In 1st workshop on ethics in the design of intelligent agents (pp. 1–6). The Hague.
Dorf, R. C., & Kusiak, A. (1994). Handbook of design, manufacturing, and automation. London and New York: Wiley.
Frey, C. B., & Osborne, M. A. (2017). The future of employment: How susceptible are jobs to computerisation? Technological Forecasting and Social Change, 114, 254–280. http://doi.org/10.1016/J.TECHFORE.2016.08.019
Friend, T. (2018, May). How frightened should we be of A.I.? The New Yorker, 1–19. Retrieved from https://www.newyorker.com/magazine/2018/05/14/how-frightened-should-we-be-of-ai
Gombolay, M. C., Huang, C., & Shah, J. A. (2015). Coordination of human-robot teaming with human task preferences. In AAAI Fall symposium series on AI-HRI. Arlington.

Hengstler, M., Enkel, E., & Duelli, S. (2016). Applied artificial intelligence and trust: The case of autonomous vehicles and medical assistance devices. Technological Forecasting and Social Change, 105, 105–120. http://doi.org/10.1016/j.techfore.2015.12.014
Hislop, D., Bosua, R., & Helms, R. (2018). Knowledge management in organisations: A critical introduction (4th ed.). Oxford: Oxford University Press.
Huijnen, C. A. G. J., Lexis, M. A. S., Jansens, R., & de Witte, L. P. (2016). Mapping robots to therapy and educational objectives for children with autism spectrum disorder. Journal of Autism and Developmental Disorders, 46(6), 2100–2114. http://doi.org/10.1007/s10803-016-2740-6
Johnson, D. G. (2014). Technology with no human responsibility? Journal of Business Ethics, 127, 707–715. http://doi.org/10.1007/s10551-014-2180-1
Khangura, S., Konnyu, K., Cushman, R., Grimshaw, J., & Moher, D. (2012). Evidence summaries: The evolution of a rapid review approach. Systematic Reviews, 1(1), 10. http://doi.org/10.1186/2046-4053-1-10
Khosla, R., Chu, M. T., & Nguyen, K. (2013). Affective robot enabled capacity and quality improvement of nursing home aged care services in Australia. In IEEE 37th annual computer software and applications conference workshops (pp. 409–414). Kyoto. http://doi.org/10.1109/COMPSACW.2013.89
Khouja, M., & Offodile, O. F. (1994). The industrial robots selection problem: Literature review and directions for future research. IIE Transactions, 26(4), 50–61. http://doi.org/10.1080/07408179408966618
Kinne, E., & Stojanov, G. (2014). Grounding drones' ethical use reasoning. In Association for the Advancement of Artificial Intelligence (pp. 231–235). Québec City.
Kolbjørnsrud, V., Amico, R., & Thomas, R. J. (2017). Partnering with AI: How organizations can win over skeptical managers. Strategy & Leadership, 45(1), 37–43. http://doi.org/10.1108/SL-12-2016-0085
Kuusisto, J., & Meyer, M. (2003). Insights into services and innovation in the knowledge intensive economy. Technology Review, 134. Helsinki: Tekes. Retrieved from https://www.tekes.fi/globalassets/julkaisut/insights.pdf
Loebbecke, C., & Picot, A. (2015). Reflections on societal and business model transformation arising from digitization and big data analytics: A research agenda. Journal of Strategic Information Systems, 24, 149–157. http://doi.org/10.1016/j.jsis.2015.08.002
Luxton, D. D. (2014). Recommendations for the ethical use and design of artificial intelligent care providers. Artificial Intelligence in Medicine, 62(1), 1–10. http://doi.org/10.1016/j.artmed.2014.06.004
Metzler, T. A., Lewis, L. M., & Pope, L. C. (2016). Could robots become authentic companions in nursing care? Nursing Philosophy, 17(1), 36–48. http://doi.org/10.1111/nup.12101
Michelfelder, D. P. (2011). Dirty hands, speculative minds, and smart machines. Philosophy and Technology, 24(1), 55–68. http://doi.org/10.1007/s13347-010-0009-0
Mubin, O., D'Arcy, T., Murtaza, G., Simoff, S., Stanton, C., & Stevens, C. (2014). Active or passive? Investigating the impact of robot role in meetings. In IEEE international workshop on robot and human interactive communication (pp. 580–585). Edinburgh. http://doi.org/10.1109/ROMAN.2014.6926315
Nielsen, J. A., Andersen, K. N., & Sigh, A. (2016). Robots conquering local government services: A case study of eldercare in Denmark. Information Polity, 21(2), 139–151. http://doi.org/10.3233/IP-160381
Niu, J., Tang, W., Xu, F., Zhou, X., & Song, Y. (2016). Global research on artificial intelligence from 1990–2014: Spatially-explicit bibliometric analysis. International Journal of Geo-Information, 5(66), 1–19. http://doi.org/10.3390/ijgi5050066

Pieters, W. (2011). Explanation and trust: What to tell the user in security and AI? Ethics and Information Technology, 13(1), 53–64. http://doi.org/10.1007/s10676-010-9253-3
Reich, R. (1991). The work of nations: Preparing ourselves for 21st-century capitalism. London: Simon & Schuster.
Sananès, N., Garbin, O., Hummel, M., Youssef, C., Vizitiu, R., Lemaho, D., . . . Wattiez, A. (2011). Setting up robotic surgery in gynaecology: The experience of the Strasbourg teaching hospital. Journal of Robotic Surgery, 5(2), 133–136. http://doi.org/10.1007/s11701-010-0231-x
Schwartz, T., Krieger, H., & Zinnikus, I. (2016). Hybrid teams: Flexible collaboration between humans, robots and virtual agents. In Proceedings of the 14th German conference on multiagent system technologies (pp. 131–146). Klagenfurt.
Sheridan, T. B. (2016). Human-robot interaction: Status and challenges. Human Factors, 58(4), 525–532. http://doi.org/10.1177/0018720816644364
Torras, C. (2015). Social robots: A meeting point between science and fiction. MÈTODE Science Studies Journal, 5, 111–115. http://doi.org/10.7203/metode.82.3546
Wang, C.-M., Tseng, S.-H., Wu, P.-W., Xu, Y.-H., Liao, C.-K., Lin, Y.-C., . . . Fu, L.-C. (2013). Human-oriented recognition for intelligent interactive office robot. In 13th International conference on control, automation and systems (pp. 960–965). Gwangju. http://doi.org/10.1093/gbe/evv015
Willcocks, L., Lacity, M. C., & Craig, A. (2015). The IT function and robotic process automation. The Outsourcing Unit Working Research Paper Series. London. http://eprints.lse.ac.uk/64519/1/OUWRPS_15_05_published.pdf
Xu, S., Tu, D., He, Y., Tan, S., & Fang, M. (2013). ACT-R-typed human–robot collaboration mechanism for elderly and disabled assistance. Robotica, 32(November 2013), 1–11. http://doi.org/10.1017/S0263574713001094
Yampolskiy, R., & Fox, J. (2013). Safety engineering for artificial general intelligence. Topoi, 32(2), 217–226. http://doi.org/10.1007/s11245-012-9128-9
Yu, Q., Yuan, C., Fu, Z., & Zhao, Y. (2012). An autonomous restaurant service robot with high positioning accuracy. The Industrial Robot, 39(3), 271–281. http://doi.org/10.1108/01439911211217107
Zaghloul, A. S., & Mahmoud, A. M. (2016). Preliminary results of robotic colorectal surgery at the National Cancer Institute, Cairo University. Journal of the Egyptian National Cancer Institute, 28(3), 169–174. http://doi.org/10.1016/j.jnci.2016.05.003
Zeng, D. (2015). AI Ethics: Science fiction meets technological reality. IEEE Intelligent Systems, 3, 2–5.

Appendix: Publications Analyzed

Abdel Raheem, A., Song, H. J., Chang, K. D., Choi, Y. D., & Rha, K. H. (2017). Robotic nurse duties in the urology operative room: 11 years of experience. Asian Journal of Urology, 4(2), 116–123. http://doi.org/10.1016/j.ajur.2016.09.012
Albu, A., & Stanciu, L. (2015). Benefits of using artificial intelligence in medical predictions. In 2015 e-health and bioengineering conference (EHB) (pp. 1–4). Iasi. http://doi.org/10.1109/EHB.2015.7391610
Alizadehsani, R., Zangooei, M. H., Hosseini, M. J., Habibi, J., Khosravi, A., Roshanzamir, M., . . . Nahavandi, S. (2016). Coronary artery disease detection using computational intelligence methods. Knowledge-Based Systems, 109, 187–197. http://doi.org/10.1016/j.knosys.2016.07.004

Amershi, S., Fogarty, J., Kapoor, A., & Tan, D. (2011). Effective end-user interaction with machine learning. In Proceedings of the twenty-fifth AAAI conference on artificial intelligence (pp. 1529–1532). San Francisco. http://doi.org/10.1145/2046396.2046416
Amrit, C., Paauw, T., Aly, R., & Lavric, M. (2017). Identifying child abuse through text mining and machine learning. Expert Systems with Applications, 88, 402–418. http://doi.org/10.1016/j.eswa.2017.06.035
Aron, R., Dutta, S., Janakiraman, R., & Pathak, P. A. (2011). The impact of automation of systems on medical errors: Evidence from field research. Information Systems Research, 22(3), 429–446.
Autor, D. H. (2015). Why are there still so many jobs? The history and future of workplace automation. Journal of Economic Perspectives, 29(3), 3–30. http://doi.org/10.1257/jep.29.3.3
Balfe, N., Sharples, S., & Wilson, J. R. (2015). Impact of automation: Measurement of performance, workload and behaviour in a complex control environment. Applied Ergonomics, 47, 52–64. http://doi.org/10.1016/j.apergo.2014.08.002
Balkin, T. J., Horrey, W. J., Graeber, R. C., Czeisler, C. A., & Dinges, D. F. (2011). The challenges and opportunities of technological approaches to fatigue management. Accident Analysis and Prevention, 43(2), 565–572. http://doi.org/10.1016/j.aap.2009.12.006
Balram, N., Tošić, I., & Binnamangalam, H. (2016). Digital health in the age of the infinite network. In APSIPA Transactions on Signal and Information Processing (Vol. 5, pp. 1–13). Cambridge. http://doi.org/10.1017/ATSIP.2016.6
Baril, C., Gascon, V., & Brouillette, C. (2014). Impact of technological innovation on a nursing home performance and on the medication-use process safety. Journal of Medical Systems, 38(3), 1–12. http://doi.org/10.1007/s10916-014-0022-4
Bekier, M., Molesworth, B. R., & Williamson, A. (2011). Defining the drivers for accepting decision making automation in air traffic management. Ergonomics, 54(4), 347–356. http://doi.org/10.1080/00140139.2011.558635
Bennett, C. C., & Hauser, K. (2013). Artificial intelligence framework for simulating clinical decision-making: A Markov decision process approach. Artificial Intelligence in Medicine, 57(1), 9–19. http://doi.org/10.1016/j.artmed.2012.12.003
Bocci, T., Moretto, C., Tognazzi, S., Briscese, L., Naraci, M., Leocani, L., . . . Sartucci, F. (2013). How does a surgeon's brain buzz? An EEG coherence study on the interaction between humans and robot. Behavioral and Brain Functions: BBF, 9(1), 1–12. http://doi.org/10.1186/1744-9081-9-14
Bogue, R. (2011). Robots in the nuclear industry: A review of technologies and applications. Industrial Robot: An International Journal, 38(2), 113–118. http://doi.org/10.1108/01439911111106327
Broussard, M. (2015). Artificial intelligence for investigative reporting. Digital Journalism, 3(6), 814–831. http://doi.org/10.1080/21670811.2014.985497
Byun, S., & Buyn, S.-E. (2011). Exploring perceptions toward biometric technology in service encounters: A comparison of current users and potential adopters. Behaviour & Information Technology, 32(3), 217–230. http://doi.org/10.1080/0144929X.2011.553741
Calo, C. J., Hunt-Bull, N., Lewis, L., & Metzler, T. (2011). Ethical implications of using the Paro robot with a focus on dementia patient care. In Association for the Advancement of Artificial Intelligence workshop (Vol. WS-11-12, pp. 20–24). San Francisco. Retrieved from http://www.scopus.com/inward/record.url?eid=2-s2.0-80054913481&partnerID=40&md5=cf9ad1b74f81299aa3f48be4f8e9700f

Chang, A. C. (2012). Primary prevention of sudden cardiac death of the young athlete: The controversy about the screening electrocardiogram and its innovative artificial intelligence solution. Pediatric Cardiology, 33(3), 428–433. http://doi.org/10.1007/s00246-012-0244-5
Charalambous, G., Fletcher, S., & Webb, P. (2015). Identifying the key organisational human factors for introducing human-robot collaboration in industry: An exploratory study. International Journal of Advanced Manufacturing Technology, 81(9–12), 2143–2155. http://doi.org/10.1007/s00170-015-7335-4
Charchat-Fichman, H., Uehara, E., & Santos, C. F. (2014). New technologies in assessment and neuropsychological rehabilitation. Trends in Psychology, 22(3), 539–553. http://doi.org/10.9788/TP2014.3-01
Collins, J. W., Patel, H., Adding, C., Annerstedt, M., Dasgupta, P., Khan, S. M., . . . Wiklund, P. N. (2016). Enhanced recovery after robot-assisted radical cystectomy: EAU robotic urology section scientific working group consensus view. European Urology, 70, 649–660. http://doi.org/10.1016/j.eururo.2016.05.020
Danilchenko, A., Balachandran, R., Toennies, J. L., Baron, S., Munske, B., Fitzpatrick, J. M., . . . Labadie, R. F. (2011). Robotic mastoidectomy. Otology and Neurotology, 32(1), 11–16. http://doi.org/10.1097/MAO.0b013e3181fcee9e
De Benedictis, A., Trezza, A., Carai, A., Genovese, E., Procaccini, E., Messina, R., . . . Marras, C. E. (2017). Robot-assisted procedures in pediatric neurosurgery. Neurosurgical Focus, 42(5), E7. http://doi.org/10.3171/2017.2.FOCUS16579
Decker, M., Fischer, M., & Ott, I. (2017). Service robotics and human labor: A first technology assessment of substitution and cooperation. Robotics and Autonomous Systems, 87, 348–354. http://doi.org/10.1016/j.robot.2016.09.017
Dilsizian, S. E., & Siegel, E. L. (2014). Artificial intelligence in medicine and cardiac imaging: Harnessing big data and advanced computing to provide personalized medical diagnosis and treatment. Current Cardiology Reports, 16(441), 1–8. http://doi.org/10.1007/s11886-013-0441-8
Doryab, A., Min, J. K., Wiese, J., Zimmerman, J., & Hong, J. I. (2014). Detection of behavior change in people with depression. In Twenty-eighth AAAI Conference on Artificial Intelligence workshop (pp. 12–16). Québec.
Drew, J. (2017). Real talk about artificial intelligence and blockchain. Journal of Accountancy, 224(1), 22–26, 28. Retrieved from https://search.proquest.com/docview/1917636631
Drigas, A. S., & Ioannidou, R.-E. (2012). Artificial intelligence in special education: A decade review. International Journal of Engineering Education, 28(6), 1366–1372.
Edwards, P., & Ramirez, P. (2016). When should workers embrace or resist new technology? New Technology, Work and Employment, 31(2), 99–113. http://doi.org/10.1111/ntwe.12067
Fischer, M. (2012). Interdisciplinary technology assessment of service robots: The psychological/work science perspective. Poiesis & Praxis: International Journal of Ethics of Science and Technology Assessment, 9(3–4), 231–248. http://doi.org/10.1007/s10202-012-0113-6
Gilbert, B. J., Goodman, E., Chadda, A., Hatfield, D., Forman, D. E., & Panch, T. (2015). The role of mobile health in elderly populations. Current Geriatrics Reports, 4(4), 347–352. http://doi.org/10.1007/s13670-015-0145-6
Gombolay, M. C., Huang, C., & Shah, J. A. (2015). Coordination of human-robot teaming with human task preferences. In AAAI Fall symposium series on AI-HRI. Arlington.

Hengstler, M., Enkel, E., & Duelli, S. (2016). Applied artificial intelligence and trust: The case of autonomous vehicles and medical assistance devices. Technological Forecasting and Social Change, 105, 105–120. http://doi.org/10.1016/j.techfore.2015.12.014
Hirsch, P. B. (2017). The robot in the window seat. Journal of Business Strategy, 38(4), 47–51. http://doi.org/10.1108/JBS-04-2017-0050
Holloway, B. B., Deitz, G. D., & Hansen, J. D. (2013). The benefits of sales force automation (SFA): An empirical examination of SFA usage on relationship quality and performance. Journal of Relationship Marketing, 12(4), 223–242. http://doi.org/10.1080/15332667.2013.846735
Huijnen, C. A. G. J., Lexis, M. A. S., Jansens, R., & de Witte, L. P. (2016). Mapping robots to therapy and educational objectives for children with autism spectrum disorder. Journal of Autism and Developmental Disorders, 46(6), 2100–2114. http://doi.org/10.1007/s10803-016-2740-6
James, K. L., Barlow, D., Bithell, A., Hiom, S., Lord, S., Oakley, P., . . . Whittlesea, C. (2013). The impact of automation on pharmacy staff experience of workplace stressors. International Journal of Pharmacy Practice, 21(2), 105–116. http://doi.org/10.1111/j.2042-7174.2012.00231.x
Jeong, G. M., Park, C. W., You, S., & Ji, S. H. (2014). A study on the education assistant system using smartphones and service robots for children. International Journal of Advanced Robotic Systems, 11(1), 1–9. http://doi.org/10.5772/58389
Jeske, D., & Santuzzi, A. M. (2015). Monitoring what and how: Psychological implications of electronic performance monitoring. New Technology, Work and Employment, 30(1), 62–78. http://doi.org/10.1111/ntwe.12039
Junejo, F., Amin, I., Hassan, M., Ahmed, A., & Hameed, S. (2017). The application of artificial intelligence in grinding operation using sensor fusion. International Journal of GEOMATE, 12(30), 11–18.
Jung, J., Song, H., Kim, Y., Im, H., & Oh, S. (2017). Intrusion of software robots into journalism: The public's and journalists' perceptions of news written by algorithms and human journalists. Computers in Human Behavior, 71, 291–298. http://doi.org/10.1016/j.chb.2017.02.022
Kaivo-Oja, J., Roth, S., & Westerlund, L. (2017). Futures of robotics. Human work in digital transformation. International Journal of Technology Management, 73(4), 176–205. http://doi.org/10.1504/IJTM.2017.083074
Khosla, R., Chu, M. T., & Nguyen, K. (2013). Affective robot enabled capacity and quality improvement of nursing home aged care services in Australia. In IEEE 37th annual computer software and applications conference workshops (pp. 409–414). Kyoto. http://doi.org/10.1109/COMPSACW.2013.89
Kim, C., Kim, D., Yuan, J., Hill, R. B., Doshi, P., & Thai, C. N. (2015). Robotics to promote elementary education pre-service teachers' STEM engagement, learning, and teaching. Computers and Education, 91, 14–31. http://doi.org/10.1016/j.compedu.2015.08.005
Klintong, N., Vadhanasindhu, P., & Thawesaengskulthai, N. (2012). Artificial intelligence and successful factors for selecting product innovation development. In 3rd International Conference on Intelligent Systems Modelling and Simulation (pp. 397–402). Kota Kinabalu. http://doi.org/10.1109/ISMS.2012.86
Kokina, J., & Davenport, T. H. (2017). The emergence of artificial intelligence: How automation is changing auditing. Journal of Emerging Technologies in Accounting, 14(1), 115–122. http://doi.org/10.2308/jeta-51730
Kolbjørnsrud, V., Amico, R., & Thomas, R. J. (2017). Partnering with AI: How organizations can win over skeptical managers. Strategy & Leadership, 45(1), 37–43. http://doi.org/10.1108/SL-12-2016-0085

366   Crispin Coombs et al. Kraan, K. O., Dhondt, S., Houtman, I. L. D., Batenburg, R. S., Kompier, M. A. J., & Taris, T. W. (2014). Computers and types of control in relation to work stress and learning. Behaviour & Information Technology, 33(10), 1013–1026. http://doi.org/10.1080/0144929X.2014.916351 Kumar, S., Pragatheeswarane, M., Sharma, A. P., Bishnoi, K., Sharma, M. K., Panwar, V. K., & Sethi, S. (2017). Expanding the horizon of robotic surgery to large pelvic paraganglioma. Journal of Robotic Surgery, 11(2), 247–250. http://doi.org/10.1007/s11701-016-0648-y Lacity, M.  C., & Willcocks, L.  P. (2016). Robotic process automation at telefónica O2. MIS Quarterly Executive, 15(1), 21–35. Lee, H., Troschel, F. M., Tajmir, S., Fuchs, G., Mario, J., Fintelmann, F. J., & Do, S. (2017). Pixellevel deep segmentation: Artificial intelligence quantifies muscle on computed tomography for body morphometric analysis. Journal of Digital Imaging, 30(4), 487–498. http://doi. org/10.1007/s10278-017-9988-z Lund, H. H. (2011). Anybody, anywhere, anytime—Robotics with a social impact through a building block approach. In Proceedings of IEEE workshop on advanced robotics and its social impacts (pp. 2–7). Menlo Park. http://doi.org/10.1109/ARSO.2011.6301970 Luxton, D.  D. (2014). Artificial intelligence in psychological practice: Current and future applications and implications. Professional Psychology: Research and Practice, 45(5), 332. http://doi.org/10.1037/a0034559 Mathers, N., Goktogen, A., Rankin, J., & Anderson, M. (2012). Robotic mission to Mars: Hands-on, minds-on, web-based learning. Acta Astronautica, 80, 124–131. http://doi. org/10.1016/j.actaastro.2012.06.003 Mubin, O., D’Arcy, T., Murtaza, G., Simoff, S., Stanton, C., & Stevens, C. (2014). Active or passive? Investigating the impact of robot role in meetings. In IEEE International Workshop on Robot and Human Interactive Communication (pp. 580–585). Edinburgh. http://doi. org/10.1109/ROMAN.2014.6926315 Naik, G., & Bhide, S. S. (2014). Will the future of knowledge work automation transform personalized medicine? Applied and Translational Genomics, 3(3), 50–53. http://doi. org/10.1016/j.atg.2014.05.003 Nezhad, H.  R.  M. (2015). Cognitive assistance at work. In AAAI 2015 Fall symposium (pp. 37–40). Arlington. Nielsen, J. A., Andersen, K. N., & Sigh, A. (2016). Robots conquering local government services: A case study of eldercare in Denmark. Information Polity, 21(2), 139–151. http://doi. org/10.3233/IP-160381 Noor, A. (2011). Intelligent adaptive cyber-physical ecosystem for aerospace engineering education, training, and accelerated workforce development. Journal of Aerospace Engineering, 24(October), 403–408. http://doi.org/10.1061/(ASCE)AS.1943-5525.0000128. Ohlsson, S. (2016). Constraint-based modeling: From cognitive theory to computer tutoring— and back again. International Journal of Artificial Intelligence in Education, 26(1), 457–473. http://doi.org/10.1007/s40593-015-0075-7 Peña, P., Del Hoyo, R., Vea-murguía, J., Rodrigálvarez, V., Calvo, J. I., & Martín, J. M. (2016). Moriarty: Improving “time to market” in big data and artificial intelligence applications. International Journal of Design & Nature and Ecodynamics, 11(3), 230–238. http://doi. org/10.2495/DNE-V11-N3-230-238 Piccoli, M., Mullineris, B., Santi, D., & Gozzo, D. (2017). Advances in robotic transaxillary thyroidectomy in Europe. Current Surgery Reports, 5(8), 1–7. http://doi.org/10.1007/ s40137-017-0180-7 Pieters, W. (2011). 
Explanation and trust: What to tell the user in security and AI? Ethics and Information Technology, 13(1), 53–64. http://doi.org/10.1007/s10676-010-9253-3

Changing Nature of Work   367 Samani, H. (2016). The evaluation of affection in human-robot interaction. Kybernetes, 45(8), 1257–1272. http://doi.org/10.1108/K-09-2015-0232 Samarakou, M., Fylladitakis, E.  D., Prentakis, P., & Athineos, S. (2014). Implementation of artificial intelligence assessment in engineering laboratory education. In International conference e-learning (pp. 299–303). Lisbon: International Association for Development of the Information Society. Sananès, N., Garbin, O., Hummel, M., Youssef, C., Vizitiu, R., Lemaho, D., . . . Wattiez, A. (2011). Setting up robotic surgery in gynaecology: The experience of the Strasbourg teaching hospital. Journal of Robotic Surgery, 5(2), 133–136. http://doi.org/10.1007/s11701010-0231-x Schwartz, T., Krieger, H., & Zinnikus, I. (2016). Hybrid teams : Flexible collaboration between humans, robots and virtual agents. In Proceedings of the 14th German conference on multiagent system technologies (pp. 131–146). Klagenfurt. Semerjian, A., & Pavlovich, C. P. (2017). Extraperitoneal robot-assisted radical prostatectomy: Indications, technique and outcomes. Current Urology Reports, 18(42), 1–7. http://doi. org/10.1007/s11934-017-0689-4 Sheridan, T. B. (2016). Human-robot interaction: Status and challenges. Human Factors, 58(4), 525–532. http://doi.org/10.1177/0018720816644364 Skulimowski, A. M. J. (2014). Future prospects of human interaction. In International conference on adaptive and intelligent systems (pp. 131–141). Bournemouth. Stalidis, G., Karapistolis, D., & Vafeiadis, A. (2015). Marketing decision support using artificial intelligence and knowledge modeling: Application to tourist destination management. Procedia—Social and Behavioral Sciences, 175, 106–113. http://doi.org/10.1016/j. sbspro.2015.01.1180 Sundararajan, S. C., & Nitta, S. V. (2015). Designing engaging intelligent tutoring systems in an age of cognitive computing. IBM Journal of Research and Development, 59(6), 10:1–10:9. http://doi.org/10.1147/JRD.2015.2464085 Sutton, S. G., Holt, M., & Arnold, V. (2016). “The reports of my death are greatly exaggerated”— Artificial intelligence research in accounting. International Journal of Accounting Information Systems, 22, 60–73. http://doi.org/10.1016/j.accinf.2016.07.005 Szalma, J. L., & Taylor, G. S. (2011). Individual differences in response to automation: the five factor model of personality. Journal of Experimental Psychology. Applied, 17(2), 71–96. http://doi.org/10.1037/a0024170 Taylor, A. K., & Cotter, T. S. (2014). Human-machine intelligence interaction in aviation. In Proceedings of the American Society for Engineering Management (pp. 210–217). Virginia Beach. van de Merwe, K., Oprins, E., Eriksson, F., & van der Plaat, A. (2012). The influence of automation support on performance, workload, and situation awareness of air traffic controllers. The International Journal of Aviation Psychology, 22(2), 120–143. http://doi.org/10.1080/1050 8414.2012.663241 van Doorn, J., Mende, M., Noble, S. M., Hulland, J., Ostrom, A. L., Grewal, D., & Petersen, J. A. (2017). Domo arigato Mr. Roboto: Emergence of automated social presence in organizational frontlines and customers’ service experiences. Journal of Service Research, 20(1), 43–58. http://doi.org/10.1177/1094670516679272 Willcocks, L., Lacity, M. C., & Craig, A. (2015a). Robotic process automation at Xchanging (The Outsourcing Unit Working Research Paper Series). London. Willcocks, L., Lacity, M. C., & Craig, A. (2015b). The IT function and robotic process automation. 
(The Outsourcing Unit Working Research Paper Series). London.

368   Crispin Coombs et al. Xu, J., Le, K., Deitermann, A., & Montague, E. (2014). How different types of users develop trust in technology: A qualitative analysis of the antecedents of active and passive user trust in a shared technology. Applied Ergonomics, 45(6), 1495–1503. http://doi.org/10.1016/j. apergo.2014.04.012 Yang, S., Wei, R., Guo, J., & Xu, L. (2017). Semantic inference on clinical documents: Combining machine learning algorithms with an inference engine for effective clinical diagnosis and treatment. IEEE Access, 5, 3529–3546. http://doi.org/10.1109/ACCESS.2017.2672975 Ye, J. J. (2015). Artificial intelligence for pathologists is not near-it is here: Description of a prototype that can transform how we practice pathology tomorrow. Archives of Pathology and Laboratory Medicine, 139(7), 929–935. http://doi.org/10.5858/arpa.2014-0478-OA Zaghloul, A. S., & Mahmoud, A. M. (2016). Preliminary results of robotic colorectal surgery at the National Cancer Institute, Cairo University. Journal of the Egyptian National Cancer Institute, 28(3), 169–174. http://doi.org/10.1016/j.jnci.2016.05.003

chapter 13

Workplace “Digital Culture” and the Uptake of Digital Solutions
Personal and Organizational Factors
Simeon J. Yates and Eleanor Lockley

Introduction

The study of technology acceptance has generally focused on single technologies in specific contexts, be that home, work, or a business sector. This chapter addresses the broader question of technology acceptance across the UK workforce as a whole. In doing so the chapter tries to understand the non-technical barriers to UK industry taking up digital solutions. Over the last 60 years there have been repeated cases of digital technologies causing substantive disruption to business practices and markets, often with both negative and transformative impacts on those businesses and markets (e.g., email, the Internet, or office software, and more recently, iTunes and Uber). Predicting such disruptions is complex, and at times impossible. However, the digital sector itself argues that 25% of companies are likely to have their business, organizational structures, or work practices disrupted by digital solutions over the coming three years (Bradley et al., 2015; Brecher et al., 2016). This chapter is not focused on predicting such transformations, but rather seeks to understand the broad organizational factors that might affect the ability of businesses and institutions to respond to digital change. Tied to this is the fact that digital technologies pervade the home as much as work. In everyday life, the ease of use of such things as smartphones and apps gives the impression that the public (that is, the workforce) can easily take on new technologies. Ensuring the engagement of the workforce is essential when organizations are actively trying to prepare for digital

disruption, or to implement new digital solutions (Edwards & Ramirez, 2016; Reynolds, 2015). This leads to questioning the extent to which organizations can rely upon digital skills transferring from domestic or non-work settings to the workplace. The public and media perception of the UK as a technologically and digitally engaged society masks a far more complex and less optimistic reality. This is very much the case for personal digital media use, with considerable class and age differences in access and use (see Yates et al., 2015; Yates & Lockley, 2018; and Chapter 15 in this volume). As prior research on commercial organizations has found, a large proportion (43%) of businesses have yet to respond at a leadership or board level to the disruption that digital technologies may bring to their sectors (Loucks et al., 2016). Others have found that around 30% of UK SMEs are not online or are using the Internet in very limited ways (Lloyds Bank, 2017). One link between these home and work issues has been the focus of much government policy on closing the digital divide by ensuring digital access at home (see GOV.UK, 2014: Digital Inclusion Strategy). This is in part because it is assumed that digital skills at home will transfer to other aspects of citizens’ lives. The digital divide at work, both between organizations and among their workers, has not been a major concern until recently—for example, moves to bring coding to the curriculum and the identification of substantive IT skills gaps in the workforce. A future research and policy challenge is to understand access to and use of digital technologies in UK small and medium sized enterprises (SMEs). As an initial step in understanding these issues, this chapter reports on a national survey of UK employees of all grades and sectors. This survey explores employees’ personal experiences of digital technologies at home and work, their evaluations of the effectiveness of the technologies, and the digital culture in their organization. In some research (Robey & Azevedo, 1994), in much digital industry marketing, and in more general media coverage, digital technologies are claimed to be quick and efficient fixes to complex organizational issues (Lloyds Bank, 2017). This, along with potential digital transformations and issues of access, use, and skills, creates considerable challenges around ensuring the success of digital technologies in the workplace. Such challenges raise questions about which organizations are best able to manage such change, as well as the types of organizations and the economic sectors that are engaging with new digital solutions. Questions concerning the major barriers to uptake are also important—especially practical challenges such as finances and legacy systems, or issues of leadership, vision, and organizational culture. By taking a national workers’ perspective, this chapter fills a gap identified in research that looks into attitudes to digital technology acceptance. It explores the factors that influence digital roll-outs by focusing on the experiences and perceptions of the UK workforce as a whole, with the expectation that introducing new technology alone is not enough. This chapter presents a review of approaches to technology uptake, applies the ideas to a survey of 3040 UK workers, and is divided into eight sections. The background to the report, including prior academic work concerning technology acceptance and the framing of the research questions, is presented in section two.
Section three examines which organizations are rolling out new digital solutions, analyzed by sector and by organization size. Section four examines the UK workforce’s experience with and use of digital technologies at home. The experience of digital technologies and their implementation at work by

organization sector and size follows in section five. Section six further explores the communication channels used by organizations when rolling out new digital solutions. Section seven examines the UK workforce’s experience of organizational barriers to the implementation of digital solutions. Section eight develops a statistical model of the most important factors that influence perceptions of the success (or not) of new digital solutions, based upon the results from the previous sections. The chapter ends with an overall summary, the importance of culture and strategy, and final conclusions including the role of leadership.

Understanding and Measuring Technology Acceptance Factors

Technologies don’t just drop into place or spontaneously emerge; at both home and work they have to become accepted. Technology acceptance models (Davis, 1989; Venkatesh, 2000; Venkatesh et al., 2003) have tended to focus on two areas:

• Perceived usefulness—the extent to which a worker or home user believes that using a technology would enhance the task they are engaged in
• Perceived ease-of-use—the extent to which a worker or home user believes that using a technology would be free from substantive effort

These two issues have been measured in a variety of ways in a range of studies (Adams et al., 1992; Segars & Grover, 1993; Szajna, 1994). Many of these studies have a very individualistic focus—they look at the motivations and rational behaviors of individual users. The organizational or personal situation of users is a context in which they engage with the technology. More recent research by Venkatesh and Davis (2000) has identified four factors to explore:

• Performance expectancy, parallel to perceived usefulness, is the degree to which an individual believes that using the system will help him or her to attain gains in job performance.
• Effort expectancy, parallel to perceived ease-of-use, is the degree of ease associated with the use of the system.
• Social influence is the degree to which an individual perceives that important others believe he or she should use the new system.
• Facilitating conditions are the degree to which an individual believes that an organizational and technical infrastructure exists to support use of the system (Venkatesh & Davis, 2000).

This chapter considers the role of context as well as the personal factors implicit in these four factors. It therefore explores how personal attitudes, use, and confidence mix with organizational culture to influence attitudes to digital technologies at work.

This research has looked at the issue across a national sample of the UK workforce, whereas technology acceptance models have tended to be applied to specific case studies. These are mainly studies of the uptake of technologies by specific organizations, from SMEs to large corporate organizations (Bruque & Moyano, 2007; Fitzgerald et al., 2013). The majority of case studies are also focused on a single technology (e.g., social media: Abduwaila & Ali, 2013; Rauniar et al., 2014) or sector (e.g., healthcare: Cresswell & Sheikh, 2013; Xie et al., 2013; or ICT and telecommunications: Barnes, 2012). This chapter therefore presents a unique national picture of the issues, challenges, and best practice around digital technology uptake and acceptance by the UK workforce.

Making use of the technology acceptance approach described above, a set of personal and work factors that could be assessed was identified. Often ease of use, expected ease of use, performance, and effort are measured in relation to specific technologies. The UK workforce is today exposed to as many, if not more, technologies at home as at work. It was therefore important to assess different aspects of home and work use, from confidence to the types and range of digital technology use (an illustrative sketch of scoring such multi-item survey measures follows the lists below). The survey questions about the UK workforce’s personal use across home and work included

• Personal experience and confidence at home: confidence, acceptance of new technology in the home, and range of home use (using measures taken from the Ofcom, 2016 media literacy survey)
• Personal experience and confidence at work: confidence, and being a knowledge worker

To understand the social and facilitating issues highlighted by technology acceptance models, the questions were split across

• Organizational challenges (predominantly facilitating issues): company/organizational sector, company/organization size, and internal organizational issues (e.g., finances and legacy systems)
• Organizational culture (predominantly social issues): attitudes to digital in the organization, and digital leadership in the organization
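Constructs such as these are typically measured with multi-item Likert scales that are combined into composite scores and checked for internal consistency. The following is a minimal illustrative sketch of that scoring step in Python; the item names and responses are hypothetical stand-ins, not the survey’s actual instrument.

```python
# Illustrative sketch only: the chapter's constructs (e.g., confidence at work)
# were measured with survey items; the column names and values here are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Classic Cronbach's alpha for a set of Likert items (rows = respondents)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 Likert responses for three "confidence at work" items.
df = pd.DataFrame({
    "work_conf_1": [4, 5, 3, 2, 4, 5],
    "work_conf_2": [4, 4, 3, 2, 5, 5],
    "work_conf_3": [5, 4, 2, 3, 4, 4],
})

alpha = cronbach_alpha(df)                # scale reliability check
df["work_confidence"] = df.mean(axis=1)   # simple composite score per respondent
print(f"alpha = {alpha:.2f}")
print(df["work_confidence"])
```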

Survey and Analysis Methods

Defining Digital Solutions

What is meant by “the roll-out of digital solutions”? Often the focus is on high-end, big-ticket, or disruptive digital technologies such as cloud solutions, social media, and mobile applications. The UK workforce is more likely to encounter a wider range of business-critical systems, such as clocking-on tools or regulatory compliance tools. Considering this, the following was articulated to all of the respondents at the start of the survey:

In this survey we are interested in understanding your use of digital technologies at work and in your home. When we talk about digital technologies we specifically mean software, apps, devices, and any equipment that uses the internet to play a role in digitizing documents, processes or tasks.

The survey included Table 13.1 to provide examples and a reference point for respondents.

Table 13.1  Defining Digital Solutions (work activity: examples)

Systems to manage people in your workplace: Online timesheet or expenses systems / Social recruitment tools
Systems to manage the finances and official documents in your workplace: eInvoicing tools / Digital document archiving / Online billing and payment solutions
Sales and customer service systems: Online customer journey mapping / relationship management / communications tools
Marketing systems: Web/email/social media marketing tools / Digital customer data or intelligence tools
Management systems: Digital business intelligence tools / Online business process management systems
Information systems for workplace, shop floor or remote working: Remote diagnostics / maintenance tools / design tools / Systems to control or manage manufacturing processes
Communications tools: Mobile devices / Workplace communications / social media

Sample and Analyses

The survey comprised 3040 online interviews with UK employees aged 16 and over. Data were gathered between April 12 and 16, 2016. The survey was a nationally representative structured sample of UK employees based on an existing panel managed by Censuswide (www.censuswide.com). The survey was designed by the authors and administered by Censuswide. The data were subjected to a range of statistical analyses (an illustrative sketch of this style of analysis follows the list below):

• Categorical data were subjected to χ2 analyses, and statistically significant variations between cells were identified by column proportion z-tests.
• Ordinal and ratio data relations were subjected to bivariate correlation analyses, using both Pearson and Spearman methods as appropriate.
• Comparison of ordinal and ratio data by category was undertaken using General Linear Modelling (ANOVA, MANOVA).
• Home use data were grouped using the K-means cluster method.
• The overall regression model was developed using SPSS Automatic Linear Modelling.
• Significance levels were set at p < .05, and for multiple tests significance levels were set using the Bonferroni method.
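The categorical analyses above were run in SPSS. As a rough illustration of the same logic (a chi-square test followed by Bonferroni-adjusted column-proportion z-tests), the sketch below uses scipy and statsmodels on a hypothetical contingency table rather than the study’s data.

```python
# Illustrative sketch only: the authors ran these tests in SPSS; this reproduces the
# general logic (chi-square plus Bonferroni-adjusted column-proportion z-tests) in
# Python on hypothetical counts, not the study's actual data.
import numpy as np
from scipy.stats import chi2_contingency
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical contingency table: rows = roll-out seen (yes/no), columns = org size bands.
observed = np.array([
    [120, 260, 540],   # digital roll-outs reported
    [180, 140, 160],   # no roll-outs reported
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")

# Pairwise column-proportion z-tests on "roll-out reported", Bonferroni adjusted.
successes, totals = observed[0], observed.sum(axis=0)
pairs = [(0, 1), (0, 2), (1, 2)]
alpha = 0.05 / len(pairs)  # Bonferroni correction for multiple comparisons
for i, j in pairs:
    z, p_pair = proportions_ztest([successes[i], successes[j]], [totals[i], totals[j]])
    print(f"cols {i} vs {j}: z = {z:.2f}, p = {p_pair:.4f}, "
          f"significant at adjusted alpha: {p_pair < alpha}")
```

In the chapter the same pairwise logic is applied across all organization-size bands and sectors, with the Bonferroni adjustment scaled to the number of comparisons made.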

The Extent to which UK Organizations and Sectors Are Digitizing

The survey sought to understand the overall national position in terms of who is rolling out digital technologies, how many are being rolled out, and why. The UK workforce’s responses about existing roll-outs are compared by organizational size and sector to answer three questions:

• Whether there have been roll-outs
• How many roll-outs
• If UK workers have perceived an increase in the rate of roll-outs

Presence and Number of Digital Roll-Outs, by Organizational Size and Sector

As the survey covered a representative sample of the UK workforce, this includes people who do not use digital technologies at work, and those who have not seen any new digital tools at work for quite some time. Indeed, overall 29% of the surveyed workers were not aware of digital solutions being rolled out in their organizations. This number is in line with prior research that found over 30% of SMEs were not online and not using many, if any, digital tools (Lloyds Bank, 2017). Taking a binary measure of workers experiencing digital roll-outs (or not), employees of smaller organizations were statistically significantly more likely to indicate that they had not experienced digital roll-outs (see Figure 13.1; one-way ANOVA, Welch (1, 3039) = 1384.6, p < .001, with a medium effect size (η2 [eta squared] = .54)). The size of an organization might influence the number of digital solutions that the UK workforce encounters. Looking at the number of new digital solutions the UK workforce has reported, there is a statistically significant difference in relation to the size of organizations (one-way ANOVA, Welch (6, 2369) = 800.6, p < .001, medium to large effect size (η2 = .11)), with the average company size for no roll-outs being 50–99 employees. Those working in organizations of 100 or more people are more likely to have encountered three or more new digital technologies over the last two years than those in smaller organizations (see Figure 13.2 and Table 13.2).
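The group comparisons above were run in SPSS as Welch’s one-way ANOVAs with eta-squared effect sizes. The sketch below approximates that logic in Python on simulated stand-in data; note that scipy’s f_oneway is the standard (non-Welch) one-way ANOVA, so the Welch correction itself is not applied here, and the eta-squared is computed by hand.

```python
# Illustrative sketch only: the chapter reports Welch's one-way ANOVA run in SPSS.
# scipy's f_oneway does NOT apply the Welch correction, so this is an approximation
# of the logic (group comparison plus an eta-squared effect size) on hypothetical data.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
# Hypothetical "number of roll-outs seen" by three organization-size bands.
small = rng.poisson(1.0, 200)
medium = rng.poisson(2.0, 200)
large = rng.poisson(3.5, 200)

f_stat, p_value = f_oneway(small, medium, large)

# Eta squared = SS_between / SS_total.
groups = [small, medium, large]
all_values = np.concatenate(groups)
grand_mean = all_values.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_total = ((all_values - grand_mean) ** 2).sum()
eta_squared = ss_between / ss_total

print(f"F = {f_stat:.1f}, p = {p_value:.4f}, eta^2 = {eta_squared:.2f}")
```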

Figure 13.1  Digital roll-outs (or not) by company size.

Figure 13.2  Number of digital roll-outs by organization size (area represents proportion of cases).

Table 13.2  Organization Size and Number of Digital Roll-Outs

Number of new digital solutions    Average company size
None                               50–99
One                                100–249
Many                               250–500

Figure 13.3  Roll-outs or not by sector.

While more than 60% of the UK workforce in all sectors had knowledge of a digital solution roll-out in their organization, there are statistically significant differences between the most active and least active sectors in terms of undertaking any digital roll-outs (chi-square test, χ2 (13, 3039) = 93.75, p < .001, small to medium Cramer’s V = .18). Comparing proportions (z-tests at p < .05 with Bonferroni adjustment), workers in the Other and the Retail, Catering & Leisure sectors were the least likely to see new digital solutions. Not surprisingly, IT & Telecoms, followed by Finance and by Professional services, were statistically the most likely (see Figure 13.3).
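Cramer’s V, reported above as the effect size for the chi-square tests, is derived directly from the chi-square statistic. A minimal sketch follows, assuming a hypothetical sector-by-roll-out table rather than the survey data.

```python
# Illustrative sketch only: Cramer's V computed from a chi-square test on a
# hypothetical sector-by-roll-out contingency table, not the study's data.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: roll-out reported yes/no; columns: three hypothetical sectors.
observed = np.array([
    [300, 220, 150],
    [ 80, 120, 140],
])

chi2, p, dof, _ = chi2_contingency(observed)
n = observed.sum()
min_dim = min(observed.shape) - 1
cramers_v = np.sqrt(chi2 / (n * min_dim))   # V = sqrt(chi2 / (n * (min(r, c) - 1)))
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, Cramer's V = {cramers_v:.2f}")
```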

Figure 13.4  Digital solution roll-outs by sector (area represents proportion of cases).

The number of roll-outs the UK workforce has experienced at work in the last two years also significantly varies across sectors. The same sectors as above are statistically more likely to have new digital solutions than the other sectors (see Figure 13.4; one-way ANOVA, Welch (13, 2369) = 8.85, p < .001, small effect size (η2 = .05)).

Increase in Digital Solutions Being Used, by Organizational Size and Sector

The survey also asked if the UK workforce had seen an increase in the number of digital solutions deployed in their industry in the last two years. Employees of smaller organizations were statistically less likely to report this (one-way ANOVA, Welch (2, 3040) = 16.5, p < .001), but the effect size was very small (η2 = .01). There are some statistical differences by sector (χ2 (13, 2616) = 37.53, p < .001, small to medium Cramer’s V = .12). Based on comparing proportions (z-tests at p < .05 with Bonferroni adjustment), workers in the IT & Telecoms sector were statistically most likely to indicate they had seen an increase in new solutions, while workers in the Other and the Travel & Transport sectors were the least likely (see Figure 13.5).

Figure 13.5  Increase in roll-outs over the last two years by sector.

Reasons for Digital Roll-outs, by Organizational Size and Sector

Respondents who had experienced digital roll-outs were asked what they thought the organization’s goals or reasoning had been for the implementation. The three main reasons given by the UK workforce were to cut costs, automate processes, and improve productivity (see Figure 13.6). There were some statistical differences by company size (χ2 (54, 9296) = 190.50, p […]).

[…] 0.4). There was relatively weak correlation between factors (r < .30), except for factors 1 and 4 (r = .59). These four factors were therefore retained, and factor scores were calculated using the Anderson-Rubin method. Table 13.7 provides the items, factors, and loadings. The four factors appear to measure:

1. Negative digital culture
2. Positive digital culture
3. Personal confidence at home
4. Organizational challenges

Table 13.7  Factor Analysis Factors and Loadings (item: loadings on factors 1–4)

New digital solutions rolled out in my workplace do not add value: .71, .02, -.17, .02
My organization does not provide the right support when solutions are rolled out: .70, -.26, .01, .09
I have tried to suggest digital technologies that would benefit our organization but nothing has come of it: .69, .11, .07, .04
Culturally, my organization is not ready to embrace digital solutions: .66, -.07, -.02, .19
A lack of digital technologies hinders my ability to do my job effectively: .63, .06, .07, .13
Only some of my employees use the digital technologies available to us in the workplace: .58, -.09, .14, .04
I wish my organization would focus on creating a digital culture: .57, .15, .28, .05
We have digital technologies in place already but haven’t had the training to make best use of them: .50, -.17, -.04, .28
Our company leaders don’t see digital technologies as a priority: .41, .06, -.18, .46
Our company leaders don’t see the significance of adopting a more digital way of working: .44, .06, -.17, .46
Do you have confidence in the leadership team at your organization to navigate a more digital world?: -.12, .74, .00, -.01
The organization I work for has a clear digital vision: .10, .72, .08, -.13
As an organization, we take pride in the way we have adopted digital technology: .08, .65, .18, -.14
The digital solutions rolled out by my organization met my expectations: .04, .64, .18, -.06
Did the leadership team consult employees prior to the provision of new digital technologies?: .05, .63, -.07, .06
Financial incentive: .36, .54, -.11, .02
Were the potential benefits of these new digital technologies clearly communicated to you by your organization?: -.28, .39, -.01, .06
Tied to personal development goals: -.09, .33, .02, .03
I feel confident using digital technologies at home: -.16, .03, .81, .06
I like to have access to the latest technology: .06, .16, .74, .00
When not everyone adopts digital technology in the same way it makes it less effective: .23, -.04, .52, .07
We don’t have the necessary hardware to allow us to adopt a more digital way of working: -.05, -.03, .01, .87
We don’t have the necessary connectivity to support a more digital way of working: .01, -.03, -.05, .82
The new digital systems don’t easily connect with older systems we have in place: .12, .00, -.17, .73
Financial pressures are preventing investment in digital technologies: .15, -.03, .12, .71
We are tied to existing tech or systems that mean we can’t change to more digital ways of working: .13, -.01, -.03, .70
Our leaders are trying to push through new digital ways of working, but the wider business isn’t interested in changing the way things are already done: .38, .39, .07, -.05
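The factor analysis itself, including the Anderson-Rubin factor scores, was run in SPSS. The sketch below shows only the general shape of such an analysis with scikit-learn on random stand-in responses; it is not the authors’ procedure: transform() returns regression-style factor scores rather than Anderson-Rubin scores, and the rotation argument assumes a recent scikit-learn release (0.24 or later).

```python
# Illustrative sketch only: the chapter's factor analysis and Anderson-Rubin factor
# scores were produced in SPSS. This shows the general shape of such an analysis with
# scikit-learn on random stand-in data; transform() returns regression-style scores,
# not Anderson-Rubin scores, and the rotation argument assumes scikit-learn >= 0.24.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n_respondents, n_items = 500, 27
responses = rng.integers(1, 6, size=(n_respondents, n_items)).astype(float)  # 1-5 Likert stand-ins

fa = FactorAnalysis(n_components=4, rotation="varimax", random_state=0)
scores = fa.fit_transform(responses)          # one score per respondent per factor
loadings = fa.components_.T                   # items x factors loading matrix

print("loadings shape:", loadings.shape)      # (27, 4)
print("respondent factor scores shape:", scores.shape)
print("example loadings for item 0:", np.round(loadings[0], 2))
```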

A Model of Factors Leading to Perceived Success in Digital Technology Implementation

It is now necessary to explore how all these issues work in combination. There are four areas of the UK workforce’s experience and attitudes that might affect their perceptions of digital roll-outs:

• Personal confidence at home (this is measured by the factor score; it is also useful to include other personal aspects such as gender, age, job type, and level of home use)
• Personal confidence at work (this is measured with a survey question; it is also possible to include other work aspects such as being a knowledge worker and level of employment)
• Organizational factors (measured using the factor score; some other organizational issues such as the size and sector of the organization can also be included)
• Organizational cultural factors (this can be measured using the positive and negative culture factor scores)

It was also noted previously that there is a strong correlation between perceptions of usefulness and the UK workforce’s belief that digital roll-outs were successful. Therefore, this has been used as a measure of engagement with digital technologies. As this was an exploratory analysis of the data, we undertook an all-possible-regressions (best subsets regression) approach. Though there is debate over this approach, it has been recommended by scholars in a number of recent statistical methodology texts (such as Chatterjee & Hadi, 2015; Montgomery et al., 2012). All of the measures discussed in the previous sections were therefore entered into an automatic linear regression (Automatic Linear Modelling) in IBM SPSS. The analysis develops the model providing the highest R2 solution. This allows us to assess how much each factor affects the proportion of perceived roll-outs seen as being successful (an illustrative sketch of this kind of model follows below). The regression analysis produced a model that accounts for 27% of the variance in attitudes to digital roll-outs. This is a reasonably robust result in the context of social research—especially given the complexity of the social and work context being studied. Within the model the following factors were most important: confidence with ICT at work, positive digital culture at work, and negative digital culture at work. Table 13.8 and Figure 13.19 present the key features of the model.
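The model itself was built with SPSS Automatic Linear Modelling, which reports relative predictor importance. As a loose analogue, the sketch below fits a linear regression in scikit-learn and uses permutation importance to rank hypothetical predictors; the variable names and data are invented for illustration only.

```python
# Illustrative sketch only: the chapter's model was built with SPSS Automatic Linear
# Modelling. This approximates the idea (a linear model plus relative predictor
# importance) with scikit-learn and permutation importance on random stand-in data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(2)
n = 1000
# Hypothetical predictors: work confidence, positive culture, negative culture, home confidence.
X = rng.normal(size=(n, 4))
# Hypothetical outcome: proportion of roll-outs perceived as successful.
y = 0.6 * X[:, 0] + 0.4 * X[:, 1] - 0.2 * X[:, 2] + 0.05 * X[:, 3] + rng.normal(scale=1.0, size=n)

model = LinearRegression().fit(X, y)
print("R^2:", round(model.score(X, y), 2))

result = permutation_importance(model, X, y, n_repeats=20, random_state=0)
names = ["work_confidence", "positive_culture", "negative_culture", "home_confidence"]
for name, imp in sorted(zip(names, result.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```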

Table 13.8  Key Predictors of UK Workforce Perceptions of Successful Digital Roll-Outs

Type | Positive factors | Importance (%) | Negative factors | Importance (%)
Personal | Personal confidence with digital at work | 40 | Lack of workplace digital confidence and efficacy | 40
Workplace | Positive digital culture including clear leadership | 26 | Negative digital culture, including lack of leadership | 13
Workplace | Being in the arts, professional service, retail, and Catering/Leisure sectors | 6 | Being in the Health, Legal, and Travel sectors | 6
Workplace | Small to medium company size (0–500 employees) | 6 | Organizational issues—finances, legacy systems | 3
Personal | Personal home ICT confidence | 1 | Age (35–45) | 3

Figure 13.19  Regression model of perceptions of successful digital roll-outs.

It is notable that confidence at work is the most important predictor here, and home confidence the least. Though statistically significant, home confidence explains barely 1% of the variance, compared to 40% for work confidence. Positive cultural factors also outweigh the impact of negative ones (26% to 13%). Issues such as organizational sector, age, and company size are associated with the outcome but only by a small amount (3% to 6%).

Conclusions for Organizations from the Model

Positive attitudes to digital in the workplace are driven mainly by issues that organizations can address:

• Ensuring workers feel efficacious and confident in digital tool use
• Ensuring a positive organizational culture around the value of digital, with management showing clear leadership and involving the workforce

There are, of course, practical problems to overcome, but these are secondary to the workplace culture issues, with some sectors having greater challenges than others. Key challenges perceived by workers are finances, connectivity, and legacy systems. Personal factors such as gender, level of employment, home use, and being a knowledge worker, though they show a significant association with perceptions of roll-outs, are not, in this model, the major factors in predicting attitudes to digital roll-outs. Importantly, home confidence with digital technologies does not seem to spill over to positive attitudes at work. The implication for organizations is that they can shape the success of their digital roll-outs. Organizational cultural factors outdo many traditional practical constraints. As the technology acceptance models argue, linking the personal to the wider organizational context is key:

• Ensuring that workers feel confident in their use of technologies through good training, and confidence in the leadership team through good communication
• Providing the organization with a clear, well-communicated digital vision
• Taking organizational pride in the adoption of digital technology
• Taking time to match digital solutions to worker expectations
• Valuing the contributions of workers and consulting employees prior to the provision of new digital technologies
• Ensuring that digital solution use is built into staff development and rewards systems

These recommendations may seem like good “common sense,” but to take this approach to digital technology solutions is in fact to turn the telescope around and look at the problem from the opposite angle. Very often digital solutions are brought into organizations to solve challenges that the organization faces. They are implemented to make things better in and of themselves. However, instead they need to be considered just one part of an overall process of culture change to address the challenges being faced—the integration of digital technologies is only part of the solution. If organizations fail to address the cultural context into which they are placing often expensive and strategic digital solutions, they risk such interventions adding to the challenges they face, not relieving them.

“Digital Culture” and the Uptake of Digital Solutions   399

Conclusion Overall Summary This research represents one of the few national surveys of attitudes to digital technology. It is also one of the few national studies of attitudes and opinions of the UK workforce, unlike for instance the Ofcom (2016) research that focuses on personal and home use, or other commercial research that focuses on business, such as the Lloyds Bank digital index (2017). Academic studies have tended to focus on smaller case studies of businesses and single technologies (Abduwaila & Ali,  2013; Barnes,  2012; Bruque & Moyano, 2007; Cresswell & Sheikh, 2013; Fitzgerald et al., 2013; Rauniar, Rawski, Yang, & Johnson, 2014; Xie et al., 2013). Though the results presented are very much in line with prior studies, the findings are scaled to a national context. This study indicates that a large proportion of the UK workforce is not seeing the benefits of digital solutions. In line with other research (Lloyds Bank, 2017), 29% of the UK workforce has not seen new digital tools at work. A further 33% of those experiencing new digital tools did not think the tools have been very successful. A total of just over half of the UK workforce (51%) say they do not have access to, or are holding negative views of, digital technologies. The findings also indicate that there may be a disconnect between the use of technology at home and work. Individuals may feel they are digitally efficacious at home, but this may not transfer to work. This is important for both or­gan­i­ za­tions and government, as it cannot be assumed that people are able to transfer skills from their everyday social use of technology to the workplace. It also cannot be assumed that “digital natives” and “millennials” entering the workforce represent a skills base or a resource to further digitize organizations (GOV.UK, 2016). Thus the challenges of making a digital UK cannot be solved simply by implementing the latest new technology. Nor does it seem it will be solved by millennials taking a lead. But this research does provide some clear guidance on a route forward that is in fact within the ability of organizations to address.

Culture and Strategy “Culture eats strategy for breakfast” is a quote apocryphally attributed to the business commentator Peter Drucker, although the idea that culture is key to the success of or­gan­i­za­tional strategy is fairly well established in the academic literature (for example Harper & Utley, 2001; Jackson, 2011; Leidner & Kayworth, 2006; O’Reilly et al., 1991). In these arguments, culture determines and limits strategies (Schein, 2010). Much of this work highlights ways in which organizations can work with their culture. Culture is, in this context, the everyday practices, beliefs, and attitudes of workers in an organization which is built upon its history, workers’ experiences, and the sectors it works in, as well as those bigger issues of the national and community cultures outside the organization.

400   Simeon J. Yates and Eleanor Lockley Many of these things organizations cannot change by themselves. But this work has found that some of the most important aspects of organizational culture that impact digital are those that can be changed. The following factors appear to influence attitudes to digital solutions (though with widely varying strength): • Personal experience and confidence at home (knowledge of ICTs from home use) • Personal experience and confidence in use of work-based ICTs • Company organizational issues (practical issues) • Company/organizational sector • Company/organization size • Internal organizational issues (e.g., finances and legacy systems) • Company leadership and attitudes (company culture) • Attitudes to digital in the organization • Digital leadership in the organization The overall model implies that the most important issues are clearly under the control of the organization: • Personal experience and confidence in use of work based ICTs (some things that can be addressed through training support and good communication, through both face-to-face and some mediated channels) • Company leadership and attitudes (taking a lead on digital, making clear the benefits, listening to ideas and developments from colleagues, making digital an or­gan­i­za­tion wide priority, planning for digital disruption and having a clear dig­ ital vision that has been effectively shared) • Attitudes to digital in the organization (ensuing that the benefits of digital solutions are understood across the organization, understanding the likely points of resistance) • Company organizational issues (ensuring that practical and traditional barriers to digital systems—from poor equipment and connections to legacy systems and financial constraints—are understood and addressed) At the heart of the findings is the need for organizations to understand that making digital solutions a success is a process of cultural change in their organization. This change will need to be supported and managed, and certainly should not be driven by simply introducing digital tools and hoping they will force the change.

Final Conclusion This research indicates that the UK workforce sees organizational culture and leadership as barriers to taking up digital solutions—and not traditional factors such as legacy systems or costs. This study provides evidence to suggest that organizations cannot rely on the workforce bringing their personal expertise to the workplace. Social media savvy millennials may not be the solution to help organizations face

“Digital Culture” and the Uptake of Digital Solutions   401 dig­ital disruption and transformation. Importantly, UK workers are generally positive about taking on digital tools—where they have had experience of it—but they are looking for support, leadership, and engagement to make these changes successful. More broadly, if personal private experience with technology does not guarantee confidence at work, then it may be necessary to ensure that the UK workforce has the skills needed in the digitally transformed workplace. The results indicate that smaller workplaces are less likely to take on new digital solutions, and that practical challenges vary across sectors. But importantly the key issues are leadership, vision, and the transformation of organizational culture.

References Abduwaila, F. & Ali, M. (2013). The effect of organizational culture on CRM success. European, Mediterranean and Middle Eastern conference on information systems 2013 (EMCIS2013), October 17–18, Windsor, United Kingdom. Adams, D. A., Nelson, R. R., & Todd, P. A. (1992). Perceived usefulness, ease of use, and usage of information technology: A replication. MIS Quarterly, 16(2), 227–247. Barnes, S-A. (2012). The differential impact of ICT on employees: Narratives from a hi-tech organization. New Technology, Work and Employment, 27(2), 120–132. Bradley J., Loucks, J., Maccauley, J., Noronha, A., & Wade, M. (2015). Digital vortex: How dig­ ital disruption is redefining industries. global centre for digital business transformation. White Paper. Available at http://www.cisco.com/c/dam/en/us/solutions/collateral/ industry-solutions/digital-vortex-report.pdf Accessed 17 May 2017. Brecher, D., Laurenceau, C., & Sloman, C. (2016). (2016) Digital disruption. Accenture Strategy. White paper. Available at: https://www.accenture.com/t00010101T000000__w__/ gb-en/_acnmedia/PDF-4/Accenture-Strategy-Digital-Workforce-Future-of-Work. pdf#zoom=50 Accessed 17 May 2017. Bruque, S. & Moyano, J. (2007). Organizational determinants of information technology adoption and implementation in SMEs: The case of family and cooperative firms. Technovation, 27(5), 241–253. Chatterjee, S. & Hadi, A. S. (2015). Regression analysis by example, 5th ed. Hoboken, NJ: John Wiley & Sons. Cresswell, K. & Sheikh, A. (2013). Organizational issues in the implementation and adoption of health information technology innovations: An interpretative review. International Journal of Medical Informatics, 82(5), e73–e86. Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340. Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8), 982–1003. Edwards, P. & Ramirez, P. (2016). When should workers embrace or resist new technology?. New Technology, Work and Employment, 31(2), 99–113. GOV.UK. House of Commons Science and Technology Committee. (2016). Digital skills crisis. Government report https://www.publications.parliament.uk/pa/cm201617/cmselect/cmsctech/ 270/270.pdf Accessed 17 May 2017. GOV.UK. (2014). Government digital inclusion strategy. Government report. https://www.gov. uk/government/publications/government-digital-inclusion-strategy/government-digitalinclusion-strategy Accessed May 2017.

402   Simeon J. Yates and Eleanor Lockley Harper, G. R. & Utley, D. R. (2001). Organizational culture and successful information technology implementation. Engineering Management Journal, 13(2), 11–15. Jackson, S. (2011). Organizational culture and information systems adoption: A three-perspective approach. Information and Organization, 21(2), 57–83. Leidner, D. E. & Kayworth, T. (2006). A review of culture in information systems research: Toward a theory of information technology culture conflict. MIS Quarterly, 30(2), 357–399. Lloyds Bank (2017). Consumer Digital Index. Report http://www.lloydsbank.com/bankingwith-us/whats-happening/consumer-digital-index.asp Accessed 15 November 2017 Loucks, J., Maccauley, J., Noronha, A., & Wade, M. (2016). Workforce transformation in the dig­ ital vortex: Reimagining work for digital business agility. Available at https://connectedfutures.cisco.com/report/workforce-transformation-in-the-digital-vortex/Accessed 17 May 2017. Montgomery, D. C., Peck, E. A., & Vining, G, G. (2012). Introduction to linear regression analysis (Vol. 821). New York: John Wiley & Sons. Ofcom. (2016). Media literacy. Market Research Report. http://stakeholders.ofcom.org.uk/ market-data-research/media-literacy/ Accessed 17 May 2017. O’Reilly, C.  A., Chatman, J., & Caldwell, D.  F. (1991). People and organizational culture: A profile comparison approach to assessing person-organization fit. Academy of Management Journal, 34(3), 487–516. Reynolds, N-S. (2015). Making sense of new technology during organisational change. New Technology, Work and Employment, 30(2), 145–157. Robey, D. & Azevedo, A. (1994). Cultural analysis of the organizational consequences of information technology. Accounting, Management and Information Technologies, 4(1), 23–37. Rauniar, R., Rawski, G., Yang, J., & Johnson, B. (2014). Technology acceptance model (TAM) and social media usage: an empirical study on Facebook. Journal of Enterprise Information Management, 27(1), 6–30. Schein, E. H. (2010). Organizational culture and leadership (Vol. 2). New York: John Wiley & Sons. Segars, A. H. & Grover, V. (1993). Re-examining perceived ease of use and usefulness: A confirmatory factor analysis. MIS Quarterly, 17(4), 517–525. Szajna, B. (1994). Software evaluation and choice: Predictive validation of the technology acceptance instrument. MIS Quarterly, 18(3), 319–324. Venkatesh, V. (2000). Determinants of perceived ease of use: Integrating control, intrinsic motivation, and emotion into the technology acceptance model. Information Systems Research, 11(4), 342–365. Venkatesh, V. & Davis, F.  D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186–204. Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425–478. Xie, S., Helfert, M., Lugmayr, A., Heimgärtner, R., & Holzinger, A. (2013, July). Influence of organizational culture and communication on the successful implementation of information technology in hospitals. In International conference on cross-cultural design (pp. 165–174). Berlin, Heidelberg: Springer, Yates, S., Kirby, J., & Lockley E. (2015). Digital media use: Differences and inequalities in relation to class and age. Sociological Research Online, 20(4), 1–21. Yates, S., & Lockley, E. (2018). Social media and social class. American Behavioral Scientist, 62(9), 1291–1316.

SECTION 5

COMMUNITIES, IDENTITIES, AND CLASS

chapter 14

ESRC Review: Communities and Identities
Simeon J. Yates, Jordana Blejmar, Bridgette Wessels, and Claire Taylor

Introduction

The initial scoping questions for this domain were: “How do we define and authenticate ourselves in a digital age?” and “What new forms of communities and work emerge as a result of digital technologies—for example, new forms of coordination including large-scale and remote collaboration?” This chapter briefly explores the outcomes of the literature review and expert Delphi review process for the communities and identities domain. As with the other review chapters, the goal is not to work through a large number of examples from the literature. Instead, building on the methods described in chapter 2, we first set out the results of the digital humanities-based analyses of the literature and the content analysis of methods and theory. We highlight the major themes and topics within the literature—providing a few general examples. These are not intended to be the “most important” examples from the literature but, rather, simply indicative of the types of work. This is then followed by the presentation of the content analysis that sought to identify the key theories and methods in use within the literature. Next, we outline the results from the Delphi review of experts. This concludes with the key questions, topics, and challenges we identified, and we compare these to the results from the literature work. In the last section, we present the recommendations for areas of future study.


Initial Comments

The literature, Delphi, and workshop data all raise questions about how senses of community are perceived and experienced in a digital age. The initial ESRC scoping questions were thought to be appropriate, although it was argued that inclusion of the word “work” was too specific. A focus on work was deemed to draw attention to one narrow characteristic or context in which digital communities are found. Therefore, experts sought to broaden the view, depending on context and institutional landscape, as online communities tend to be structured and shaped by offline institutions as well as political, social, and geographic contexts. At the same time, research has often emphasized more autonomous, less institutionally bounded online communities or associations (see Katz et al., 2004, for a review). In terms of the idea of identity, many of the responses to the Delphi questions demonstrated a notable uncertainty about the idea of “authentication” in the ESRC scoping questions. Many of the responses interpreted authentication in terms of having an “authentic” sense of identity, rather than the technical process of individuals authenticating themselves so as to use or access digital media, services, systems, or institutions. Obviously, these issues overlap to some extent, but the majority of the responses and the surveyed literature pointed to wider questions of identity, self, and the links to community membership and community identity. We should also note that issues of identity and community appeared throughout all seven of the domains studied, with a considerable amount of overlap in the area of citizenship and politics. Here the focus was on political identity and political community or group membership. Such issues of community and identity have a long history in digital media research, going back to the 1980s and 1990s, evidenced by Rheingold’s (1993) examination of the “Whole Earth 'Lectronic Link” (WELL), or Turkle’s early work on technology and the self (1984), life online (1995), and identity (1999). Much of the detailed analysis of group and community dynamics can be found in the initial work during the 1980s and 1990s on both workplace and educational collaboration (e.g., Mason & Kaye, 1989). Further, concerns about how best to design and manage systems to support digital communities also appeared in the 1970s (e.g., Hiltz & Turoff, 1978), 1980s (e.g., Zuboff, 1988), and 1990s (e.g., Mynatt et al., 1997), and continue (see Kraut & Resnick, 2011). In general terms, our analysis of the literature focused on the period that Wellman and Haythornthwaite (2002) called the “second age of the internet,” and considered changing senses of community and identity in regard to networks and networked individualism. In overall terms, the main argument in the late 1990s and early 2000s was that community is increasingly being developed on the basis of communication rather than inhabitancy or physical proximity. Instead of communities arising from a shared location—from people inhabiting the same physical space—a networked sense of community developed. These new types of communities form around shared values and social organization, and are built on choices and strategies of social actors, family, individuals, or social groups. Wellman and Haythornthwaite (2002) argued that networks of interpersonal ties provide sociability, support, information, and a sense of belonging and social identity. One key transformation that arises in comparison to

earlier, place-based communities is the development of weak and strong networks in which users adapt digital resources to meet the needs of network sociability. More recently, Wellman with Rainie (2014) have deepened and extended this earlier argument in line with the diffusion of digital services into social life. Currently, digitally supported social networks are the “new social operating system.” The authors argue that large, loosely knit social circles of networked individuals expand opportunities for learning, problem solving, decision making, working, and personal interaction.

Literature Analysis

The literature analysis was designed to create two analytic outcomes. First, the goal was to identify key topics within the existing literature. This would allow the comparison with areas of future importance identified by the Delphi review. The first round of collected literature was analyzed to create concept pairs and trios; the combined first and second rounds of literature were then analyzed to identify key topic clusters. The results of these two approaches were then compared. The second goal was to explore the predominance of specific theories, methods, and approaches within the data. As noted in chapter 2, the literature data were subjected to two analyses.
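The concept-pair extraction was carried out with the project’s digital humanities text-analysis tools (see chapter 2). As a simple illustration of the underlying idea, the sketch below counts co-occurring concept pairs across a set of abstracts; the concept list and abstracts are hypothetical.

```python
# Illustrative sketch only: the chapter's concept pairs were produced with the
# project's own text-analysis tooling (see chapter 2). This shows one simple way
# concept co-occurrence pairs can be counted; the keyword list and texts are hypothetical.
from collections import Counter
from itertools import combinations

concepts = {"community", "identity", "group", "network", "gender", "knowledge"}

abstracts = [
    "Online community participation and group identity in digital networks.",
    "Gender and identity performance in online community membership.",
    "Knowledge sharing across community networks and group membership.",
]

pair_counts = Counter()
for text in abstracts:
    found = {word.strip(".,").lower() for word in text.split()} & concepts
    # Count each unordered concept pair that co-occurs within one abstract.
    pair_counts.update(combinations(sorted(found), 2))

for pair, count in pair_counts.most_common(5):
    print(pair, count)
```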

Topics

Table 14.1 shows the 13 most common concepts (covering 2% or more of the identified cases) identified from the first round of literature. Table 14.2 lists their subtopics.

Table 14.1  Analysis Concepts Ranked

Concept          Percent
Group            13.7
Computer         13.6
Community        10.8
Gender           6.8
Identity         6.5
Child            4.0
Knowledge        3.9
Network          3.8
Machine          3.4
Communication    3.2
Leadership       2.8
College          2.5
Game             2.1

Note: Concepts occurring in at least 2% of the cases.

Table 14.2  Concept Pairings—Main and Secondary Concepts

child (4.0): game .7, laptop .4, object 1.6, programming .6, robot .4, stage .4
college (2.5): friend .7, medium .7, student 1.1
communication (3.2): cue .4, dynamics .4, leadership .5, park .4, personality .4, psychology .8, uncertainty .4
community (10.8): designer 1.1, educator .3, empathy .6, leadership 1.7, lurker .8, membership .8, moderator .8, poster .5, sociability 1.6, student 1.1, usability 1.4
computer (13.6): fear .2, hacker 1.0, language 1.1, mastery .6, mind 1.6, object 1.3, owner .6, presence .5, programming 1.7, psychology .7, self 1.0, toy 1.0, transparency .3, world 1.9
game (2.1): mind .2, object .2, play .6, screen .2, simulation .2, something .3, space .5, spear 1.0
gender (6.8): genre .9, helper .6, herring .5, identity 1.4, judge .5, man .6, message 1.2, performance .5, word .5
group (13.7): identification 1.3, identity 3.3, individuality .8, in-group .5, lea .6, manipulation 1.5, membership .7, negotiation .4, prediction 1.1, prentice .3, psychology 1.4, side .8, spear 1.0
identity (6.5): in-group .4, influence .7, member 1.8, norm 1.0, path .8, pilot .3, prediction .7, psychology .7
knowledge (3.9): organization 2.3, platform .3, source 1.3
leadership (2.8): network .9, participant .9, role 1.1
machine (3.4): object .4, program .5, programming .3, system .3, thing .5, way .8, world .5
network (3.8): proportion .7, single .4, size .7, tie 2.0

Note: The first term in each row is the main concept, shown with its overall percentage of cases; the terms that follow are its related subconcepts, each with its own percentage.

In Table 14.2, each main concept is listed together with its related subconcepts. Not surprisingly, the two main concepts are "group" and "community," followed by "identity" and "gender." It is interesting to note that the analysis here pulled out more general aspects of computer use rather than a specific social media or digital platform, as can also be seen in the Citizenship and Politics domain (chapter 16). As with the other domains, we can see a shift in focus within the literature between 2000 and 2016 (see Figures 14.1 and 14.2).1 If we explore the visualizations of the data, we find that in the 2000–2004 literature concept pairs such as group membership, group identity, and group norms are some of the most common. This contrasts with the 2012–2016 literature, where concept pairs such as community participation, community leadership, community knowledge, and community network are more common. Based on exploring the papers behind these results, there appears to have been a shift from more social-psychological work on group membership and identity to more sociological work on community dynamics. We would argue that the long history of work in these areas, and their persistence as research topics, though with clear changes in focus, indicate that research into communities and identities remains fundamental to understanding how we live in the digital age.

Figure 14.1  Communities and Identities 2000–2004: Most frequent concept pairs.

Figure 14.2  Communities and Identities 2012–2016: Most frequent concept pairs.

As noted in chapter 2, the second approach to the analysis of the literature explored the extraction of topics using a different methodology, based on a factor analysis of salience and relevance measures. For this we utilized both bespoke tools and the Wordstat software. Unlike the concept mapping, which pulled out some of the underlying ontological links, the identification of topics produced groups that more overtly fitted the theory and methods in the literature. This was the case for all of the literature analyses. The 12 topics identified (using Wordstat) are presented in Table 14.3. Table 14.4 presents an analysis of the overlap between the topics and concepts analyses. Seven key topics stand out from the analysis (see Table 14.3): Online community, Identity, Friendship network, Mobile phone, Children, Gender, and Education.

Online community. The first thing to note is that the topic analysis has not brought out the distinction between research focused on topics clustered around group as opposed to community—with both featuring under the same topic. This points out the benefits of using different approaches to the automatic evaluation of literature and the extraction of key terms.

Table 14.3  Wordstat Analysis of Topics

Online community. Keywords: ONLIN; DATE; WALTHER; INTERPERSON; COMMUN; INTERACT; BEHAVIOR; CMC. Eigenvalue 11.92; frequency 12,789; cases 153; % cases 96.8.
Mobile phone. Keywords: PHONE; MOBIL; SERVIC; HANDSET; MARKET; PERCENT; AAKHU; KATZ; EDIT; APPARATGEIST; MOBIL; PERFORM; TEXT; PHONE; MESSAG; CELL; SM; MOBIL; SEND; PHILIPPIN; RINGTON; RING; MUSIC; PICTUR; PHOTOGRAPH; PHOTO; CAMERA; IMAG. Eigenvalue 3.44; frequency 1,859; cases 124; % cases 78.5.
Children. Keywords: ALIV; TOI; CHILDREN; CHILD; OBJECT; ROBOT; MACHIN; PHYSIC; PSYCHOLOG. Eigenvalue 3.25; frequency 3,889; cases 130; % cases 82.38.
Migration and diaspora. Keywords: TRANSNAT; MIGRAT; DIASPORA; MIGRANT; GLOBAL; ETHNIC; COSMOPOLITAN; ICT; CULTUR; DIGIT; RELIGI; RELIGION; SUPERNATUR; TEEN; ISLAM; MUSLIM; GITAL; PASSAG; MIGRANT; YOUTH. Eigenvalue 2.92; frequency 2,114; cases 105; % cases 66.5.
Identity (psychology/social). Keywords: POSTM; SPEAR; TURNER; HASLAM; GROUP; IDENT; PSYCHOLOGI; INTERGROUP. Eigenvalue 2.57; frequency 6,241; cases 149; % cases 94.3.
Gender. Keywords: MEN; WOMEN; MALE; FEMAL; GENDER. Eigenvalue 2.49; frequency 3,597; cases 118; % cases 74.7.
Education. Keywords: EDUC; SCHOOL; TEACHER; STUDENT; LEARN; RESOURC; FUTUR; PARENT; CHILDREN; COLLEG. Eigenvalue 2.17; frequency 6,520; cases 144; % cases 91.1.
Friendship network. Keywords: TI; NETWORK; WELLMAN; KIN; LOCAL; FRIEND. Eigenvalue 2.04; frequency 5,683; cases 145; % cases 91.8.
Facebook. Keywords: FACEBOOK; ESTEEM; CAPIT; COLLEG; MEASUR; VARIABL. Eigenvalue 1.82; frequency 2,665; cases 128; % cases 81.0.
Computing. Keywords: MACHIN; PROGRAM; COMPUT; INTELLIG; AI; SOMETH; HACKER; SYSTEM. Eigenvalue 1.58; frequency 5,511; cases 143; % cases 90.5.
Governance. Keywords: EUROPEAN; POLIT; EU; POLICI; EUROP; GOVERN; DEMOCRAT; CITIZEN; NATION; SPHERE. Eigenvalue 1.57; frequency 5,127; cases 139; % cases 88.0.
Identity (assessment). Keywords: IDENTI; CATION; DEDUCT; MANIPUL. Eigenvalue 1.55; frequency 1,534; cases 96; % cases 60.8.

Differences in method and measures are likely to provide different views on the same materials. For example, some approaches based on an ontology (a model of relations between concepts) drawn from theory or inductive coding may highlight certain aspects of the data—such as the idea of "group membership"—whereas an inductive but statistical analysis of word frequency might highlight the use of a variety of terms that form around elements of the idea of community.

Table 14.4  Comparison between Concepts and WordStat Topics (a cross-tabulation of the concepts listed in Table 14.1 against the WordStat topics listed in Table 14.3).

As noted earlier, the study of online community has a long history going back to the early 1980s. It is important to remember that the literature examined here represents that which was specifically identified as being about community or identity. Looking at all the data, we find that community or group membership forms a core part of the theoretical backdrop to other domains, especially citizenship and politics, health and well-being, and communication and relationships. While we cannot hope to summarize this breadth of work here, it is important to note how community or group membership is fundamental to these other domains. It is of course the case that, in the past two decades, for many people in socio-economically well-developed societies, digitally mediated membership of communities has become a vital part of their contemporary sense of self, identity, and well-being.

We can also note what might be described as apprehensions, and even some unease, about the nature of "digital identity" and "community" in the literature, the Delphi data, and the workshops. We would argue that there remains a thread of argument around the "virtual," "disembodied," or purely mediated identity and community concerning their credibility and authenticity as compared to face-to-face interactions and contexts. This touches on the distinction in the literature between discussions of identity and self in terms of their "performance" via digital media, and the search for and methods to digitally check the "authenticity" of identity online. We would go so far as to argue that some of these issues and questions may arise from disciplinary perspectives on questions of identity and community as much as they do from the analysis of data and cases. As we shall discuss, current theories are mainly drawn from social and behavioral psychology, networked approaches to sociology, and the mixed approaches found in computer-mediated communication studies. As already noted, the literature appears to have shifted from a more social-psychological focus on group processes and membership to a more sociological one focused on community dynamics, although there is no distinct dividing line, with both approaches prevalent across the period. Papers that address aspects of group or community behavior or structure cut across all the other domains—for example, group dynamics in relation to political action, or group support in relation to health conditions. Along with aspects of the communication and relationships domain, this appears to be one of the underpinning issues for the study of digital media.

Identity. The discussion of online or digital groups is often tied up with social-psychological theories of group membership and identity. We have named the topic here "identity (psychology/social)" to distinguish it from work on identity verification via technologies. It is worth noting that there is a far greater breadth of literature—especially in journal article form—that takes a psychological or, more likely, a social-psychological perspective on issues of digital identity. A prominent feature of this work is the SIDE model of online group behavior (see Postmes et al., 1998; Spears et al., 2002).
More recent papers openly acknowledge that approaches to community need to take on board a variety of disciplinary insights.

Friendship network. A number of papers link both psychological and sociological aspects of social networks via the idea of social capital (e.g., Steinfield et al., 2008), or offer overall models to help understand online communities (e.g., de Souza & Preece, 2004). At the same time, more recent research concerns the linkage between "online" community or interaction and other forms of community. In this context, ideas of social capital are understood in sociological terms, typically in line with the work of Putnam (2000). In this characterization, social capital is understood as both a personal and a community commodity that is linked to civic and political engagement and the building of community groups (e.g., Preece, 2002; Rainie & Wellman, 2012). These approaches are now applied to very specific contexts, such as political group membership or social and psychological support within communities (digital and/or face-to-face; e.g., Wright et al., 2013).

Mobile phone. The breadth of technologies covered by the literature ranges from MUDs (multi-user dimensions) and MOOs (MUD, object-oriented) in the 1990s, via social media (Twitter and Facebook), through to current multiplayer online games. Notably, mobile phone use appears prominently in this domain. The analysis picked out mobile phones and devices as a key category. In looking at the literature we do not find a single focus for papers on the mobile phone. There has been a broad literature on the use of the mobile phone for quite some time (e.g., Katz & Aakhus, 2002; Ling, 2004), and research on mobile phone use could be found in all domains. In the context of community, the examination of mobiles often pointed to their use as part of existing communities or community maintenance, from basic citizenship (e.g., Mossberger et al., 2012) through to friendship networks.

Children. Much of the work on mobile technologies and identity we identified focused on children and adolescents, with debates around the level and extent of digital communication in relation to personal development, friendship networks, and emotional well-being. This touches on a body of literature that focuses on "fears" about levels or types of use (e.g., whether high levels of use and highly immersive use might adversely affect young people's developmental processes) or the dangers of mobile use (such as cyberbullying). This work cuts across our domains—especially into the governance domain, as it touches on policy issues. Examples of such work include Livingstone (2003), Livingstone and Görzig (2014), and Subrahmanyam et al. (2001). Friendship networks form a consistent theme within the literature on young people's, adolescents', and teens' use of digital media (e.g., Mesch & Talmud, 2006; Valkenburg & Peter, 2009). One area that clearly incorporates both issues of identity and group membership is that of migrant and diasporic use of communications media (e.g., Panagakos & Horst, 2006).

Gender. As well as age-related issues, another constant touchstone for the study of issues of identity in digital media has been the examination of gender variation in the uses of the medium and the content generated. This work goes back to very early studies of access to ICT in the 1980s and to work in the 1990s on gender in relation to online interaction (Herring, 1996).
Exploring this literature makes clear the transition in the examination of identity in digital media from assumptions of "anonymity" and a lack of identity markers (social cues) through to very detailed analyses of the performance or function of social and cultural categories in digital interaction.

Current work addressing issues of gender performance or variation covers topics as varied as interpersonal support (e.g., Spottswood et al., 2013), experimenting with identity in various media (e.g., Valkenburg & Peter, 2008), variations over time in gender behaviors online (e.g., Kapidzic & Herring, 2011), gender and politics online (e.g., Vochocová et al., 2016), exposure to risks online (e.g., Sasson & Mesch, 2016), the feminist examination of mobile and spatial technologies (e.g., Leszczynski & Elwood, 2015), and many more. The methods and approaches used in this work vary from controlled experiments, statistical models, and socio-linguistic analysis through to ethnographic observation, with theoretical approaches varying from feminist linguistics to cognitive psychology. This all points to the fact that questions of gender variations, gender inequalities, and the gendered performance of identity in digital media remain key topics.

Education. As noted earlier, much of the work prior to 2000 on online community focused on workplace and educational settings. This focus remains in the literature studied here, but with a growing focus on the use of social media platforms rather than bespoke educational systems (e.g., Lampe et al., 2011). That said, further analysis points to a methodological reason for the prominence of educational topics and concepts in the literature. A very large proportion of the studies undertaken involve university students, adolescents, and young people. There are a variety of reasons for this, one of the key ones being that they are usually the early adopters of new digital media. The focus on young people extends into studies about the role and use of social media in protests, social movements, and other types of civic mobilization. Boulianne's (2015) meta-analysis of 36 studies of social media and citizen engagement shows that the strongest relationship between the use of social media and involvement in civic or political activities is among 18- to 24-year-olds. However, given the development of media hybridity (Chadwick & Dennis, 2017) in political communication, attention is beginning to be focused more on other age groups, especially in areas of community, civic, and political engagement (Wessels, 2018). Very often the focus is not directly on the age groups of those using social media but rather on the contexts and purposes of use. These include, for example, the way social media was used to organize clear-up activities in the 2014 floods in England (Miller, 2015) and for mapping local social activities.

Summary. Our review therefore suggests that this domain is a relatively well-researched area yielding reliable findings from a range of disciplines, wide-scale surveys, and experiments.

Theory, Method, and Approach

As we described in chapter 2, the content analysis builds on Borah's (2017) approach to analyzing a set of communications and media literature in regard to digital media use. Table 14.5 details the results with regard to the epistemological approach taken in the literature. Most of the analyzed papers (62%) were mainly "inductive," either describing findings or building theory, while 38% were deductive and undertook theory testing.

The papers were split between the 57% that undertook primary or secondary data work and the 43% that undertook discursive reviews of, or were reflective on, existing research (Table 14.6). The main disciplines from which theory was used or for which theory was developed were sociology (38.1%), psychology (30.9%), and communication and media (19.6%). It is important to note that only actual use of theory for the purposes of design, synthesis, or analysis was coded. General references to prior work and theory, such as broad reference to the "network society" (e.g., Castells) or the general discussion of ideas of "community," were not coded. This distinction is important as it highlights the use of theory to design and analyze data or synthesize materials, as distinct from more general discussion. There was considerable variety in the specific theories applied from the three main disciplines. Though there was no clear substantive preference, the most common theories were:

• Sociological theories (38.1%), including social network analysis (4%) and technology acceptance models (3%)
• Psychological theories (30.9%), including social identity theory (7%) and self-categorization theory (3%)
• Communications and media theories (19.6%), all identified as "computer-mediated communication" approaches

The main research methods were relatively evenly split across surveys (14%), interviews (14%), literature reviews (14%), and experiments (12%; Table 14.7). The majority of the empirical work focused on specific groups (e.g., students or Twitter users), with a limited number (14.3%) of general population studies (Table 14.8). Less than 3% of studies overtly stated that they were using a "big data" approach.

Table 14.5  Epistemological Approach

Deductive (testing of existing theory)    38.0%
Inductive (conclusions driven by data)    62.0%

Table 14.6  Empirical Approach

Discursive/descriptive (no new data or theory)     43.3%
Primary empirical (data collected and analyzed)    49.2%
Secondary empirical (analysis of existing data)    7.5%

Table 14.7  Research Method

No clear methods                             14.8%
Survey                                       14.2%
Interview(s)                                 13.5%
Literature review (general or narrative)     13.5%
Experiment                                   11.5%
Content analysis                             9.5%
Ethnography                                  6.1%
Theory building                              5.4%
Social network analysis                      3.4%
Textual (linguistic-discourse analysis)      3.4%
Other                                        2.7%
Focus groups                                 2.0%

Table 14.8  Study Population

Case study(ies)         14.4%
General population      14.3%
Specific group          71.4%

Delphi Review

The literature review analysis explored the themes to be found within recent research publications. The following section details the results of the Delphi process for the communities and identities domain. There were three parts to the Delphi review: an initial survey, a confirmatory questionnaire to address the findings from the survey, and a confirmatory workshop. The goal of the Delphi process was to identify and prioritize areas for future research. These might include areas already covered by the literature, but also new concerns, or the need for a tighter focus on a specific issue. The process sought to identify suggested future scoping or research questions, key topics to address within these questions, and key challenges that might be encountered when researching these questions.

Future Research and Scoping Questions

The Delphi review identified a set of scoping questions for the domain. These were coded into the three categories detailed in Table 14.9. They are listed in order of their ranked importance in the initial and confirmatory surveys, which matched in this domain, unlike some others.

Table 14.9  Delphi Review Scoping Questions

Community membership and processes:
• What is the glue that binds members to these communities?
• What different effects do digital technologies have on communities?
• Do digital technologies enhance or limit people's sense of belonging in local, national and transnational communities?
• What are the net benefits of participation in online communities, considering both the positives (e.g., social support, information exchange) and the negatives (e.g., trolling, astroturfing) associated with online groups?
• What questions do we need to ask in relation to the reconfiguration of communities in a digital age that enable us to understand the politics and socio-technical dimensions at play?
• How has the definition of "community" evolved since the inception of the digital age? (Relatedly: How do "digital natives"—people born since the mid-1980s who have never known a world without the Internet—define "community"?)

Defining identity online:
• What are the differences in how we define ourselves in a digital age by gender, class, age, etc.?
• What does "identity" refer to in an online context, and must it always be assumed there is a connection between identity and authenticity? What is an authentic identity these days anyway?
• What are the implications of the digital on questions of identity? How does the digital enable or disenable us to ask better questions of identity?
• How does personal identity evolve (or not) in the context of these communities?

Understanding remote relationships:
• How are digital technologies being used to support interaction over distance?

The consultation workshop noted that online vs. offline is too much of a duality, as many communities have blended media use, which extensive research also shows. The workshop also pointed out that the proposed questions from the Delphi survey focused more on community than identity, and that understanding the relationship between identity and community online is a key current and future research issue. They added that a more contemporary question might be that of understanding the specifics of different digital communities, building on the more general work already done. The workshop also highlighted the challenge of managing identity—pseudo-anonymity, authenticity, actual anonymity, and genuineness—and argued that much of the existing research and the Delphi materials appear to have an overly positive take on digital participation—there is a need for work on negative aspects and the impacts of forced digital community participation. The topics within these areas identified in the Delphi review were coded into seven categories. Table 14.10 lists the percentage of responses in each category, and Table 14.11 provides the ranked importance that respondents gave to the categories, derived from the confirmatory survey.

Table 14.10  Key Topics Ranked by Percent of Delphi Survey Responses

Exclusion/inclusion                          17%
Participation, action and social change      17%
Diaspora                                     13%
Gender/race/ethnicity                        13%
Power                                        8%
Citizenship                                  4%
Digital labor                                4%
Ethics                                       4%
Legal                                        4%
Methods                                      4%
Norms                                        4%
Tolerance                                    4%
Urban                                        4%

Table 14.11  Key Topics Ranked by Importance from Delphi Survey
(percentage of respondents rating each topic very important / important / neutral / unimportant / very unimportant)

Digital community exclusion/inclusion: 87.5 / 12.5 / 0.0 / 0.0 / 0.0
Digital community participation, action and social change: 87.5 / 12.5 / 0.0 / 0.0 / 0.0
Power in online communities: 75.0 / 12.5 / 12.5 / 0.0 / 0.0
Understanding global diaspora as digital communities: 37.5 / 50.0 / 12.5 / 0.0 / 0.0
Understanding function of aspects of identity online (gender/race/ethnicity/sexuality): 37.5 / 37.5 / 25.0 / 0.0 / 0.0

There is clearly a strong overlap between the two lists in this case, indicating that the categories repeatedly identified by respondents were also those they considered most important. The consultation workshop noted that these topics were important but already well studied for the majority of digital media. However, they also noted that new digital platforms lead to new challenges. The workshop also noted the considerable crossover with the Communication and Relationships domain. Workshop participants also identified potential gaps in the Delphi topics list, namely: the need to include "social class" as an element of digital identity, the need for a greater focus on identity rather than demographics, and, for work with diasporas, the need to avoid assuming that the goal is simply national or community integration.


Research Challenges

In regard to the challenges of undertaking research in this area, six categories were identified in the data from the Delphi survey. Table 14.12 lists these categories and their rank based on the number of coded items, while Table 14.13 shows their ranking based on the confirmation survey. There is some divergence in the rankings, with methods coming at the top of the confirmation survey results but with a holistic understanding of online and offline behavior being the most frequently mentioned research challenge. In this case, all of the challenges are also shared with other domains.

Table 14.12  Challenges Ranked by Percent of Cases

Holistic understanding of online and offline behavior                 33%
Ethics of dealing with digital data                                   24%
Methods to address complexity of digital media use                    24%
Big data—developing and utilizing large databases and corpora         10%
Comparative historical (diachronic) analysis of digital media use     5%
Representation of outputs                                             5%

Table 14.13  Challenges Ranked by Importance from Delphi Survey
(percentage of respondents rating each challenge very important / important / neutral / unimportant / very unimportant)

Methods to address complexity of digital media use: 75.0 / 25.0 / 0.0 / 0.0 / 0.0
Ethics of dealing with digital data: 62.5 / 37.5 / 0.0 / 0.0 / 0.0
Holistic understanding of online and offline behavior: 50.0 / 50.0 / 0.0 / 0.0 / 0.0
Big data—developing and utilizing large databases and corpora: 12.5 / 75.0 / 12.5 / 0.0 / 0.0
Comparative historical (diachronic) analysis of digital media use: 0.0 / 100.0 / 0.0 / 0.0 / 0.0

The consultation workshop dug a little deeper into these lists and identified specific further challenges for research in this domain, namely:

• history and culture are important to the development of online community
• how identity gets lost, outside citizens' control, when it becomes part of big data
• the ethics of platforms' use of big data, and
• understanding privacy in online communities

In conclusion, as with the other domains, we believe that the complexity and variety of potential work warrants consideration of all the questions, topics, and challenges identified. While noting this, we would argue that the analysis here has identified the following key areas for future research:

• Community membership and processes
• Defining identity online
• Understanding remote relationships

Within these areas the top five topics to consider are:

• Digital community exclusion/inclusion
• Digital community participation, action and social change
• Power in online communities
• Understanding global diaspora as digital communities
• Understanding function of aspects of identity online (gender, race, ethnicity, sexuality)

In sum, key domain-specific challenges are around the holistic understanding of online and offline behavior within citizens' lifeworlds.

Conclusion

What appears as a new concern in the Delphi materials, and is less present in the literature, is a range of concerns associated with inequality in both access to and participation in online communities. The proposed research areas would typically probe whether digital processes can include or exclude certain individuals, groups, or communities, and whether differences within and amongst these—whether physical, social, political, or cultural—have a negative bearing on the dynamics of inclusion and exclusion. The review highlights that questions about identification, intersectionality, and the tackling of systems of discrimination or disadvantage are of primary importance in understanding contemporary processes of "digital opt out" and "digital opt in." This contrasts with much early work (1985–2000) on computer-mediated communication, which emphasized the potential of a progressively developed digital age and was, in some of its early manifestations at least, overly optimistic or utopian about the democratizing potentials of digital technologies.

There therefore remains a question about what current platforms might offer in terms of addressing, or reinforcing, persistent inequality.

Overall, we argue that the following issues need to be addressed. First, there should be more comprehensive research into identity and community to generate a deeper understanding of computer-mediated communication (CMC) in terms of how, why, and where individual identities are formed. Second, what is important to communities (digital, face-to-face, or, more likely, blended) is the way digital services feature in community life. Third, we need to improve our understanding of how online systems can be better designed to support communities. Fourth, research on understanding and analyzing the dynamics of various types of online communities should move beyond the knowledge gained in the late 1990s and early 2000s. Fifth, research is needed on issues around digital skills and how different communities or groups are affected by linguistic and cultural specificities, and on the ways in which they engage with and utilize digital technologies. Sixth, an investigation of digital diasporas, including their cultural, social, and political configurations and transformations of and through digital connectivity, is required. For example, more research is needed about connected migrants—how migrants and refugees are using digital technologies to connect with others, to find their place in the world, and to develop skills for employment and integration. Seventh, we need to understand how participation in digital communities influences collective action, either from among members of that community, or by members engaging collectively beyond those communities. Finally, we need more critical analysis of online participation, i.e., what it means for individuals, social groups, and society, and whether it is empowering, exploitative, or both.

Existing research has employed fairly traditional methods such as surveys and interviews. It is orientated towards psychological and sociological approaches, with some information studies research. The work does not appear to have extensively employed digital tools and big data methods, though those approaches are becoming more prominent. Most notably, the work appears to have been less "platform driven" or "platform specific," but does tend to emphasize children and younger people. Moreover, the future research scoping areas identified in the Delphi process are substantially similar: community membership and processes, defining identity online, and understanding remote relationships. While this has remained relatively constant, the notable shift is in the topics and challenges identified. As with other domains, there is a shift away from technology and platform foci to broader social science questions, though there remain some overlapping areas. As noted in the confirmatory workshop discussion, there is a greater concern with the negative aspects of online identity and community. As with the Communication and Relationships domain, there is a concern to look at multi-platform or "holistic" aspects of digital media use. Suggested future topic areas include: digital community exclusion/inclusion, digital community participation, action and social change, power in online communities, understanding global diaspora as digital communities, and understanding the function of aspects of identity online (gender/race/ethnicity/sexuality).
The key domain-specific challenge is a holistic understanding of combined online and offline behavior.


Note

1. As part of the review, the Digital Humanities Institute at the University of Sheffield applied concept modelling techniques to a curated corpus of 1,900 journal articles from the period 1968 to 2017. Concept modelling is a computational linguistic process that involves identifying the emergence of concepts, or key ideas, via lexical relationships. For the purposes of the review, lexical relationships were limited to high-frequency co-occurrences of terms as pairs and trios. The process is entirely data driven and resulted in 2 million rows of data. The website https://www.dhi.ac.uk/waysofbeingdigital/ provides access to the top 50 most frequently occurring pairs and trios through a series of data visualizations. Click on View Data Visualisations at the top. Then check/submit which of the seven ESRC domains you are interested in (including all). Then choose the visualization. These show configurations across selected time frames. Choose bubble chart, tree map, zoomable pack layout, or network diagram, by individual subject or by all seven subjects combined, by document or concept frequency. You can similarly search the analyzed documents (all, by subject, author, concept, concept trio, and year) by clicking on Browse Articles at the top. Also, see https://waysofbeingdigital.com/literature-analysis-interactive-results/ for interactive visualizations with mouse-overs of the main clusters of concepts within each domain, and the relative frequency of concepts associated with each cluster.

References

Borah, P. (2017). Emerging communication technology research: Theoretical and methodological variables in the last 16 years and future directions. New Media & Society, 19(4), 616–636.
Boulianne, S. (2015). Social media use and participation: A meta-analysis of current research. Information, Communication and Society, 18(5), 524–538.
Chadwick, A. & Dennis, J. (2017). Social media, professional media and mobilization in contemporary Britain: Explaining the strengths and weaknesses of the citizen's movement 38 Degrees. Political Studies, 65(1), 931–950.
De Souza, C. S. & Preece, J. (2004). A framework for analyzing and understanding online communities. Interacting with Computers, 16(3), 579–610.
Herring, S. (1996). Gender and democracy in computer-mediated communication. In R. Kling (Ed.), Computerization and controversy: Value conflicts and social choices (pp. 476–489). San Diego, CA: Academic Press.
Hiltz, S. R. & Turoff, M. (1978). Network nation: Human communication via computer. Boston, MA: Addison-Wesley.
Kapidzic, S. & Herring, S. C. (2011). Gender, communication, and self-presentation in teen chatrooms revisited: Have patterns changed? Journal of Computer-Mediated Communication, 17(1), 39–59.
Katz, J. E. & Aakhus, M. (Eds.). (2002). Perpetual contact: Mobile communication, private talk, public performance. New York, NY: Cambridge University Press.
Katz, J. E., Rice, R. E., Acord, S., Dasgupta, K., & David, K. (2004). Personal mediated communication and the concept of community in theory and practice. In P. Kalbfleisch (Ed.), Communication and community, communication yearbook 28 (pp. 315–370). Mahwah, NJ: Erlbaum.
Kraut, R. E. & Resnick, P. (Eds.) (2011). Building successful online communities: Evidence-based social design. Cambridge, MA: MIT Press.

Lampe, C., Wohn, D. Y., Vitak, J., Ellison, N. B., & Wash, R. (2011). Student use of Facebook for organizing collaborative classroom activities. International Journal of Computer-Supported Collaborative Learning, 6(3), 329–347.
Leszczynski, A. & Elwood, S. (2015). Feminist geographies of new spatial media. The Canadian Geographer/Le Géographe Canadien, 59(1), 12–28.
Ling, R. (2004). Just connect: The social world of the mobile phone. Psychology Review, 11, 10–13.
Livingstone, S. (2003). Children's use of the Internet: Reflections on the emerging research agenda. New Media & Society, 5(2), 147–166.
Livingstone, S. & Görzig, A. (2014). When adolescents receive sexual messages on the internet: Explaining experiences of risk and harm. Computers in Human Behavior, 33, 8–15.
Mason, R. & Kaye, A. (Eds.). (1989). Mindweave: Communication, computers, and distance education. Oxford, UK: Pergamon.
Mesch, G. S. & Talmud, I. (2006). Online friendship formation, communication channels, and social closeness. International Journal of Internet Science, 1(1), 29–44.
Miller, C. (2015). Social action on social media. London: Nesta.
Mossberger, K., Tolbert, C. J., & Hamilton, A. (2012). Measuring digital citizenship: Mobile access and broadband. International Journal of Communication, 6(37), 2492–2528.
Mynatt, E. D., Adler, A., Ito, M., & O'Day, V. L. (1997, March). Design for network communities. In Proceedings of the ACM SIGCHI conference on human factors in computing systems (pp. 210–217). ACM.
Panagakos, A. N. & Horst, H. A. (2006). Return to Cyberia: Technology and the social worlds of transnational migrants. Global Networks, 6(2), 109–124.
Postmes, T., Spears, R., & Lea, M. (1998). Breaching or building social boundaries? SIDE-effects of computer-mediated communication. Communication Research, 25(6), 689–715.
Preece, J. (2002). Supporting community and building social capital. Communications of the ACM, 45(4), 37–39.
Putnam, R. D. (2000). Bowling alone: The collapse and revival of American community. New York: Simon and Schuster.
Rainie, L. & Wellman, B. (2012). Networked: The new social operating system. Cambridge, MA: MIT Press.
Rheingold, H. (1993). The virtual community: Finding connection in a computerized world. Boston, MA: Addison-Wesley Longman Publishing Co., Inc.
Sasson, H. & Mesch, G. (2016). Gender differences in the factors explaining risky behaviour online. Journal of Youth and Adolescence, 45(5), 973–985.
Spears, R., Lea, M., Corneliussen, R. A., Postmes, T., & Haar, W. T. (2002). Computer-mediated communication as a channel for social resistance: The strategic side of SIDE. Small Group Research, 33(5), 555–574.
Spottswood, E. L., Walther, J. B., Holmstrom, A. J., & Ellison, N. B. (2013). Person-centered emotional support and gender attributions in computer-mediated communication. Human Communication Research, 39(3), 295–316.
Steinfield, C., Ellison, N. B., & Lampe, C. (2008). Social capital, self-esteem, and use of online social network sites: A longitudinal analysis. Journal of Applied Developmental Psychology, 29(6), 434–445.
Subrahmanyam, K., Greenfield, P., Kraut, R., & Gross, E. (2001). The impact of computer use on children's and adolescents' development. Journal of Applied Developmental Psychology, 22(1), 7–30.

Turkle, S. (1984). The second self: Computers and the human spirit. New York, NY: Harper-Collins.
Turkle, S. (1995). Life on the screen: Identity in the age of the Internet. New York, NY: Simon & Schuster.
Turkle, S. (1999). Cyberspace and identity. Contemporary Sociology, 28(6), 643–648.
Valkenburg, P. M. & Peter, J. (2008). Adolescents' identity experiments on the Internet: Consequences for social competence and self-concept unity. Communication Research, 35(2), 208–231.
Valkenburg, P. M. & Peter, J. (2009). The effects of instant messaging on the quality of adolescents' existing friendships: A longitudinal study. Journal of Communication, 59(1), 79–97.
Wellman, B. & Haythornthwaite, C. (Eds.) (2002). The Internet in everyday life. Hoboken, NJ: Blackwell Publishing.
Wellman, B. & Rainie, L. (2014). Networked: The new social operating system. Cambridge, MA: The MIT Press.
Wessels, B. (2018). Communicative-civicness: Social media and political culture. London: Routledge.
Wright, K. B., Rosenberg, J., Egbert, N., Ploeger, N. A., Bernard, D. R., & King, S. (2013). Communication competence, social support, and depression among college students: A model of Facebook and face-to-face support network influence. Journal of Health Communication, 18(1), 41–57.
Zuboff, S. (1988). In the age of the smart machine: The future of work and power. New York, NY: Basic Books.

chapter 15

Digital Engagement and Class: Economic, Social, and Cultural Capital in a Digital Age

Simeon J. Yates and Eleanor Lockley

Introduction

This chapter reflects on an issue prevalent throughout all the domains of the ESRC review—that of digital inequalities, often referred to as digital exclusion or digital divides. In this chapter we look to broaden this question to ask: how do patterns of digital media access, skills, uses, and practices relate to overall systems of social inequality and distinction? That is to say, what is the role for digital media in the formation of social class? Discussions of inequality in the use of digital media and technology have predominantly focused on issues measured by access and skills (van Dijk, 2005; van Dijk & Hacker, 2003). These are, of course, key issues for policymakers (see Yates et al., 2014) and are key to many governmental digital strategies in the United Kingdom, the United States, and Europe (Mawson, 2001; see chapters 19, 22, and 23). In looking to improve access and skills, such policy seeks to reduce and address both the practical limitations faced by citizens and the impacts on them. These have been called the first (access) and second (skills) levels of the digital divide. But as with other areas of social exclusion, the "in principle" provision of access or skills (education) does not guarantee either access or use. Given the near ubiquity in most developed nations of digital systems in both personal and work lives, we need to shift the discussion of digital divides, inequalities, exclusion, and inclusion onto a broader basis. In this chapter we draw on and integrate some of our prior work (see Yates et al., 2015b; Yates & Lockley, 2018) to argue that digital inequalities are better understood as products of broader social exclusion (no real surprise in that), but in making that shift we need to consider how these technologies mediate access to other areas of social and community life.

Overall, we will take an approach to social inequalities based on the ideas of Bourdieu and his three forms of capital—economic, social, and cultural. As we argued in Yates and Lockley (2018), such a move re-orientates the question of digital inequalities away from access and skills towards understanding the inequalities in digital literacy and resources. As with other recent publications, especially the excellent review by Ignatow and Robinson (2017), this chapter is an attempt to utilize Bourdieu's (1977, 1984, 1991, 1997) work in order to understand issues of digital inequality, though we focus much more on aspects of social, economic, and cultural capital. In making this argument we believe there are strong parallels to the argument made in Hoggart's (1957) influential work on literacy, notwithstanding the dated form of that work. What we draw from Hoggart is the argument that literacies (in our case digital literacies) come in a variety of forms, and that these forms carry skills for everyday personal, cultural, and work life, but are also valued differently in each of these fields. In other words, just as high levels of literacy, understood as both extensive reading and writing ability and extensive engagement with certain literature, carry greater potential cultural capital, so too do digital literacies. This also draws on Helsper's (2012) argument that digital inequalities have to be understood as being in correspondence with other "fields" of social, cultural, and economic inequality. However, we would like to go one step further and argue that digital inequalities and differences in use are becoming integral to the functioning of social, cultural, and economic capital.

We are therefore making a more complex argument about digital inequalities than one based solely on "access" or on levels of "use." Our argument is that the study of digital inequalities needs to understand the social, economic, and cultural consequences of levels of access, types of use, and the values attributed to that use in the context of individuals' lived experience—their "life worlds"/Lebenswelt (Schütz, 1967) or its integration into their "habitus" (Bourdieu, 1984). Habitus here is viewed as the subjective system of expectations and predispositions acquired through past experience. Within this, we need to consider how we have built and are building doxa (Bourdieu & Nice, 1977)—systems of generally accepted value—in regard to the uses of digital systems.

We first review definitions of digital inequality. Second, we make an argument for why academic research on digital inequalities needs to move away from access and skills. Third, we present an outline of our view of Bourdieu's model of social, economic, and cultural capital. We argue that it is not necessary to add additional ideas of "information capital" or "digital capital" to the model. Fourth, we consider how inequities in the three forms of capital arising from digital media use (or lack thereof) correspond to key aspects of social inequality, distinction, and class. Fifth, we then briefly explore the three forms of capital and their relationship to digital media use, drawing on our own work and that of other scholars.
We conclude by arguing, similar to Ignatow and Robinson (2017), that Bourdieu provides a strong foundation for understanding contemporary systems of inequality, distinction, and social class in relation to digital media use.


Defining Digital Inequality

Distinctions among Digital Divides

The terms used to describe digital inequalities have changed over the last four decades, beginning with questions of access to information and communication technologies (ICTs) or broader questions of "information gaps." These initial concerns were primarily focused on economically developed nations, but then moved to questions of international gaps and inequalities in access to ICTs—both between nations and regions, and within developing nations. Over the past two decades there has been an extensive movement around "ICT for development" (ICT4D; see Heeks, 2017; Walsham, 2017). In this chapter we predominantly focus on developed nations, with our own data being UK based, and with most studies discussed being based on US and EU examples.

In the literature we can identify a range of terms that directly or indirectly reference inequalities around access, skills, uses, or attitudes to digital media and technologies. For example:

• Digital divide—inequity in access to contemporary digital media
• ICT divide—inequity in access to information and communication technologies
• Information divide—primarily inequity in access to information sources
• Digital inequality—inequities in the uses, outcomes, and value of digital media and technology engagements

Other terms that imply or respond to such inequalities are the following:

• Digital literacy—with the implication that there may be variations and inequities in levels of digital literacy
• Digital inclusion—processes or policies to address digital inequalities, especially around access and use (see chapter 5)
• Digital engagement—broader questions of how to motivate and support users and citizens to engage with digital technologies

The existing literature has also tended to mix together a range of issues around forms of inequality. We note seven main issues in these debates:

1. Binary measures of access to digital technology or not—such as PC or device ownership or Internet connection
2. Different levels of access—such as variations in broadband speed, or shared, rather than individual, access to devices in the home
3. Differences in digital skills/literacies—such as ability to use basic features vs. complex system use or deep skills in specific areas (media use, gaming, coding)
4. Differences in levels of use—such as measures of frequency of use or complexity of use

5. Differences in types of use—either variety of use (extensive broad use vs. narrow use) or specific key types of use (e.g., educational use) or activities (e.g., searching for information, entertainment)
6. Differences in benefits from use—personal, financial, social, cultural, health, etc.
7. Differences in hazards from use—levels of potential risks and harms from using the technology

Though all of these issues appear in the academic and policy literature, the primary focus of the majority of work to date has been on the first four. We would argue that a full understanding requires examination of all seven of these aspects, as they all have consequences for citizens themselves, governments, the economy, and society as a whole. These consequences are also well documented in the literature; we highlight the following seven:

1. Material implications for people's lives—for example, there is considerable evidence that those online save money and access better education and work.
2. Growth of "digital by default" and "digital only" service delivery—starting with such things as banking and travel booking, many businesses and government systems are moving to "digital by default" or "digital first" delivery, making digital access and use core to systems for everyday life (see Yates et al., 2015a).
3. Centrality of digital to education—nearly all levels of education in developed nations are highly dependent on digital technologies and media—try being a university student without a computer.
4. Centrality of digital to work—again, many areas of work now require the use of some form of technology in the workplace, ranging across all levels of complexity (see, for example, chapters 12 and 13).
5. Growth in digital culture and leisure—access to all forms of culture and leisure is becoming highly mediated by digital technologies, from arranging to meet friends, to downloading and streaming media, to booking tickets for activities.
6. Workplace automation and the digital economy—these are key to productivity, with contemporary societies becoming dependent on technology for economic growth (see chapter 11).
7. And by no means last, fairness—why should some social groups be excluded from key parts of contemporary society due to a lack of digital access or skills?

In an attempt to classify forms of digital inequality and their consequences, a number of authors and policymakers (see Ragnedda, 2017; van Dijk, 2005) talk about three types or levels of digital divide:

• A first level digital divide based on inequalities in access
• A second level digital divide based on uses
• A third level digital divide based on outcomes

We will return to this formulation of digital inequalities in the conclusion.


Why Do We Need to Shift Academic Research away from Questions of Access and Skills?

We argue that a full understanding of the types of inequality and their potential impacts needs to extend beyond a focus on access and skills—important though they are. Our own research work developed out of projects focused on supporting digital inclusion through regional government policies (Goraya, Light, & Yates, 2012; Yates, Kirby, & Lockley, 2014). This work, like much other work in the field, started from the assumption that digital inclusion brought a variety of social, economic, and educational benefits. We would still maintain this argument. In fact, we would argue that this case has become stronger over the years, in part because ever more aspects of social life are dependent on digital media use, to the point that basic access and skills are required for undertaking everyday social, cultural, and economic activity. For example, in the United Kingdom, government policy has shifted to one of service delivery through "digital by default" or "digital first." This includes all forms of service, from tax returns, to hospital appointments, to social welfare (although such actions may simply "shift" costs and challenges elsewhere in the social welfare system to charities and other service providers; Yates, Kirby, & Lockley, 2015a). It continues to be the case that the number and extent of international, national, and regional policy interventions stand as testimony to the importance placed by political and economic actors on ensuring large-scale access to digital media and technology. But access to services is a very well-defined domain with clear benefits and hazards for citizens, and it focuses (rightly) on some of the most marginalized in society. In this context, basic access and skills remain a key concern. For policy research, identifying those without access and skills therefore remains a priority.

Older age (post-retirement) remains one of the most significant predictors of access and use. Unfortunately, this continues to lead some policymakers to treat digital inequalities as issues that will fade over time. Currently, the focus has shifted to skills. This shift has been evident in the close policy work the authors have undertaken in the United Kingdom with the Department for Digital, Culture, Media and Sport and with regional local government. As a result, policy research to address access to basic services and needs rightly remains focused on these issues of basic access and basic skills. But should the academic social research and broader policy discussion remain focused on these issues? We would argue not, and that such work should expand to a broader conception of digital inequalities. This is important because such a narrow focus has practical limitations. For example, a focus just on age has several flaws. First, many of the older citizens who currently find themselves excluded are former digital technology users who had access at earlier points in their lives. Second, the issue is not just about access but about inequalities in types and levels of use. Third, variations in use with age will reflect aspects of lifestyle, life stage, and inequalities that vary with age—not just experience with digital technologies.

More generally, though, for many developed nations, the growing ubiquity of digital media in everyday life, in everyday objects (washing machines, fridges, cars, etc.), automation, and even "artificial intelligence" means that questions of access and skills become blurred and redefined. We must start to ask: When are we, or our devices, not "online"? Recent reports by the Good Things Foundation (https://www.goodthingsfoundation.org/research-publications/real-digital-divide), based on work by Yates et al. (2015b), identified over 13 million UK citizens who were limited or non-users of the Internet, with the majority being from lower income households (see also chapter 15). Socioeconomic positions therefore influence access to what Selwyn (2003) calls the "opportunity structure" of digital technologies. This reaches beyond just access to broader digital literacy, highlighting that there is a range of experiences for those categorized as "digitally included" (Clayton & MacDonald, 2013). As a result, we will argue here that, in the context of economically well-developed nations, a fuller social scientific understanding of how digital media interact with, reinforce, and reflect broader patterns of inequality is needed. This is in no way to deny the reality that, for many nations and communities, access and skills remain a priority.

Digital Inequality and Social Distinctions

We take as a given that contemporary developed nations have well-noted and somewhat entrenched levels of social inequality. We also take as a given that these structures of inequality are not solely driven by economics but that social and cultural factors are also key. We therefore take Bourdieu (1984, 1991, 1997) as a starting point, as well as more recent work based on his approach (Bradley, 2014; Rollock, 2014; Savage, 2015; Savage et al., 2013, 2014). Within this view, inequities reflected in social class are driven by three forms of exchangeable capital (Bourdieu, 1984, 1997):

• Economic capital: as generally understood in material terms of money, assets, and property.
• Social capital: "the sum of the resources, actual or virtual, that accrue to an individual or a group by virtue of possessing a durable network of more or less institutionalized relationships of mutual acquaintance and recognition" (Bourdieu & Wacquant, 1992, p. 119).
• Cultural capital: predominantly an aspect of education and socialization that allows individuals to demonstrate aspects of cultural consumption, knowledge, and practice that differentiate them from other social groups; importantly, different forms of cultural capital engender greater possibilities of exchange for other forms of capital.

Bourdieu identified three main types of cultural capital (Bourdieu, 1997; Bourdieu & Passeron, 1990):

• Embodied cultural capital, in the form of knowledge acquired over time through socialization and expressed through one's habitus
• Objectified cultural capital, in the form of both owned and experienced cultural consumption that can be translated into other forms of capital and which may require appropriate embodied cultural capital to support its consumption
• Institutionalized cultural capital, or the formal social and institutional recognition of a person's cultural capital.

Recent work on access to and uses of the Internet has made similar arguments. Grant (2007) clearly argues that economic capital alone is not a sufficient explanation of why people do or do not meaningfully engage with technology. Clayton and Macdonald (2013), drawing on Graham (2002) and Selwyn (2003), summarize this position as follows: "The various forms of economic, social and cultural capital (Bourdieu, 1997) individuals bring to technology in terms of their own socio-economic positions and internalized dispositions or habitus, is key in influencing the way in which technology might (or might not) be used as well as perceptions of benefits gained" (p. 948).

The Idea of Digital Capital

We have noted that the concept of "digital capital" has been proposed by a number of authors as a route to understanding the broader impacts of a lack of digital access or skills (for example, Ignatow & Robinson, 2017; Ragnedda, 2017, 2018; Ragnedda & Ruiu, 2017). In our own work we were very sympathetic to the idea of digital capital, as it helps to formulate questions about the integrated role of digital media and technologies in multiple aspects of citizens' lives. We have, on reflection, come to see the idea of a distinctive "digital capital" as problematic. As we have already discussed, for Bourdieu, forms of capital are stocks of material, cultural, and social resources as well as abilities and aptitudes that are either scarce or differentially distributed among social groups and classes; and "[l]ike the more traditional form of capital, they can be transformed and productively reinvested" (Ignatow & Robinson, 2017, p. 952). Taking this definition, theorists have sought to define the combination of digital access, skills, and practices as a form of "digital capital." The argument is most fully made in Ragnedda (2018). Our concerns with the concept arise from its utilization within a broadly Bourdieu-based framework—rather than an objection to the idea per se. Ragnedda (2018) makes a clear argument as to how digital media access, skills, and consequences (the first, second, and third digital divides) can be understood in terms of social, cultural, and economic capital. However, by the close of the article there are six forms of capital in play: economic, social, cultural, personal, political, and digital. The arguments made by Ragnedda for the importance of digital inequalities within each of these "capitals" are well made, and we are in very strong agreement with the importance of understanding the integrated

impacts of digital inequalities. But we would argue that focusing on "digital capital" reifies the "digital" within the overall model of capital as developed by Bourdieu. This reification includes an element of separation between "online" (digital) and offline (non-digital) worlds that we would argue no longer holds—so that understanding digital inequalities has to be grounded in the complexities of everyday social circumstances and actions. For example, taking a similar approach, we could make an argument for "domestic appliance capital." This is a serious point, as the social and economic impact of domestic appliances is at least equal to, or greater than, that of digital media and technologies (see chapter 23). But we would argue that both "digital capital" and "appliance capital" would be category mistakes, as both likely refer to a technological aspect that cuts across the core forms of capital. Importantly, the three core capitals in Bourdieu's model—economic, social, and cultural—map onto broadly distinct classes of things. Economic capital predominantly refers to material ownership or access to material goods. Social capital refers to and is best understood as the value of and power within a person's social network. Cultural capital is, as noted above, a product of accumulated information, knowledge, and relevant practice. As Ragnedda (2018) notes, Bourdieu also reflected on the role of "information" or "information capital" as an element of cultural capital (Bourdieu & Wacquant, 1992). But what is being defined as digital capital crosses all three. There are material (ownership and access to digital technologies), cultural (digital literacies, skills, and practices), and social (uses of digital media within social networks) aspects. Similarly, political capital may be better understood as a specific type of social (network) capital—or a combination of all three forms of capital—utilized within the political "field" (see discussion of field in the next section: Digital Inequalities in the Fields of Everyday Life). Personal capital appears to bear many similarities to Bourdieu's concept of "habitus," our embodied and habitual "giving off" of markers of economic, social, and cultural capital. It is also often used as a statement about citizens' power and position within key social networks—their localized social capital. We would argue strongly that, from the point of view of a Bourdieu-based approach to inequality and distinction, digital has a key role in how all three forms of capital function in contemporary society—but is not itself best theorized as a distinct form of capital. Outside of such a Bourdieu-based framework, then, it is certainly possible to theorize the combination of digital literacy and digital resources as "stocks of material, cultural, and social resources, abilities, and aptitudes" that are either necessary and/or sufficient to impact citizens' social lives and status. This is effectively the use being made by Ragnedda (who takes a predominantly Weberian approach; Ragnedda, 2017) to digital inequalities, and the usage found in other works (see Ignatow & Robinson, 2017). As we noted previously, we hold sympathies for this approach and greatly value the insights scholars working with this concept have provided. However, our broader concerns remain that such usage reifies the digital and may rely on an unsustainable argument for the differentiation of online and offline "worlds/behaviors."


Digital Inequalities in the Fields of Everyday Life

Helsper (2012) argues that digital inequalities map onto four "fields" of social exclusion: Economic, Social, Cultural, and Personal. To make this argument, Helsper draws on a variety of literature with regard to social exclusion (see Abrams et al., 2005; Anthias, 2001; Burchardt, Le Grand, & Piachaud, 2002; Chakravarty & D'Ambrosio, 2006; Chapman, Phimister, Shucksmith, Upward, & Vera-Toscano, 1998). Helsper takes as a starting point Burchardt et al.'s definition that a citizen is socially excluded "if he or she does not participate in key activities of the society in which he or she lives" (2002, p. 30). All of this literature and research argues that social exclusion is multi-dimensional. As Helsper notes, the key argument from Burchardt et al. is that "Measures of social exclusion attempt to identify not only those who lack economic resources but also those whose non-participation arises in different ways: through discrimination, chronic ill health, geographical location, or cultural identification, for example" (Burchardt, Le Grand, & Piachaud, 2002, p. 6).

Helsper's goal is to theorize how to measure the interactions between social and digital exclusion across these four fields, exploring how they reinforce and amplify each other. Though we are in full agreement with the overall argument around the need for a broad view of digital inequalities, Helsper's use of the term "field" overlaps with but is not the same as that used by Bourdieu (Bourdieu, 1984, 1985, 1991, 1997; Bourdieu & Wacquant, 1992) in regard to systems of distinction and inequality. For Bourdieu, a field (champ in the original French) is a system of social positions that has an internal structure that is primarily defined by relationships of power between those positions. This might be most easily seen in a professional body or institution—such as a medical discipline, a political movement, or a sporting association. But it can also be found in other social groups, institutions, or areas of social life such as arts and cultural sectors. For Bourdieu, a field (champ) is a social arena of negotiation, conflict, and exchange over the appropriation of economic, social, and cultural capital. Fields (champ) in this model cut across social class and each other, with political and economic activity being key in contemporary capitalist society. Importantly, for Bourdieu a field is therefore constituted by the social network that underpins it and differences in relationships within it. It is also a location where the three forms of capital may be exchanged, and their values recognized in different ways. The boundaries of fields (champ) are "marked" by where their effects "end." Importantly, how participants within a field are able to make an effective use of the capitals they hold is a function of the adjustment of their habitus to and within the field. There is thus clear overlap with Helsper's use of fields. A lack of status within, or access to, specific fields as defined by Bourdieu is one of the ways in which social distinction and inequalities function. Many aspects of inequality—economic, social, cultural—now arise from and can be mitigated by the utilization of digital literacies and digital

resources in different fields (champ). Within the field of education, for example, higher digital literacies lead to advantages in terms of outcomes—leading to higher levels of institutionalized cultural capital. This is similarly true of many fields of work (manufacturing industries, technology industries, public services, etc.). In each case the roles of digital literacies and practices are key to success within these fields. Following Helsper, we would argue that an understanding of the role of digital media and technologies in the formation, maintenance, and function of contemporary distinction and social class needs to explore its articulation as literacies (skills), resources (access), and practices (uses) within the fields (champ) that citizens occupy and as part of their habitus. This helps to take some of the emphasis off the technology and explore the processes that underlie the formation of inequalities within which digital is playing a role. Of course, by asking questions about digital inequality we are focusing on this aspect of such processes alongside others (wealth, gender, ethnicity, sexuality, health, housing, etc.)—importantly, questioning how digital has changed (or not) these processes.

Class, Capital, and Digital Media Use

We will now briefly review some examples of the integration of digital into or alongside existing inequalities in economic, cultural, and social capital. Given the nature of the data, we can only point to a correspondence between digital and measures of such capital. In future work we wish to explore these processes in greater depth to highlight both causalities and the role of digital in the construction and maintenance of contemporary class structures.

Economic Capital

In our prior work we have considered a variety of measures of socio-economic status (economic capital) relative to levels and types of Internet use. We have mainly used the two standard UK measures of socio-economic status: the NRS Social Grade and the NS-SEC (see Table 15.1). The first set of measures was initially designed for the National Readership Survey (NRS; http://www.nrs.co.uk/nrs-print/lifestyle-and-classification-data/social-grade/); the second set is the National Statistics Socio-economic Classification (NS-SEC; https://www.ons.gov.uk/methodology/classificationsandstandards/otherclassifications/thenationalstatisticssocioeconomicclassificationnssecrebasedonsoc2010), designed to reflect employment relations and occupational conditions. We have compared these against rates of use and types of user, based on our measure of breadth/variety of use. Numerous studies have pointed to economic variations in access to ICT, digital media, and technologies, measured in various ways. If we focus on Twitter as a contemporary

Table 15.1  NRS Social Grades and NS-SEC Classifications

NRS social grades:
A: Higher managerial, administrative or professional
B: Intermediate managerial, administrative or professional
C1: Supervisory or clerical and junior managerial, administrative or professional
C2: Skilled manual workers
D: Semi and unskilled manual workers
E: Casual or lowest grade workers, pensioners, and others who depend on the welfare state for their income

NS-SEC classifications:
1: Higher managerial and professional occupations
2: Lower managerial and professional occupations
3: Intermediate occupations (clerical, sales, service)
4: Small employers and own account workers
5: Lower supervisory and technical occupations
6: Semi-routine occupations
7: Routine occupations
8: Never worked and long-term unemployed

technology that is often portrayed as having ubiquitous reach, there is a body of evidence that levels of use are strongly associated with socio-economic class position. Sloan (2017) used data from the British Social Attitudes Survey 2015 to explore Twitter use by NS-SEC classification. The research found a higher proportion of Twitter users in the higher NS-SEC classes 1 and 2. Sloan also notes comparable results from prior analyses (Sloan, 2015) where NS-SEC categories are algorithmically derived from Twitter profile data. Blank (2016) and Blank and Lutz (2017), using Oxford Internet Surveys data, note the lack of social representativeness of users on social media platforms. They demonstrate that all social media platforms are skewed towards content produced by younger, wealthier, and better-educated citizens. The results reinforce the point that socioeconomic context is a major variable in determining which platforms citizens engage with, and to what extent. In our own work we looked at levels of social media use (see Figure 15.1); these clearly track economic status and forms of employment (NS-SEC). When we looked at overall types of use and types of users, we found similar variations. Using data from the 2017 Ofcom Media Literacy Survey, we have argued that it is possible to identify eight broad forms of the digital aspects of "habitus": from the extensive user to the non-user (Yates et al., 2015b; Yates & Lockley, 2018). We use these in the same sense that one might have a proxy measure for different forms of cultural consumption—for example, people or groups with greater levels of "high" or "popular" arts consumption. These groupings of digital users are clearly differentiated across socio-economic


Figure 15.1  Mean of frequency of social media use by social class (NS-SEC). (Y-axis: mean response to "How often do you access sites such as these?"; x-axis: HRP NS-SEC analytic categories, 5 groups.)

groups; this differentiation can be found in the Ofcom data going back to 2005. As Figure 15.2 shows, those in NRS social classes C2 and DE are most likely to be offline, or in one of the limited user types. In the case of social class group DE, this is close to 70% of citizens. When we looked at limited users by age, though non-users and limited users tended to be over 55 years of age, "limited social media users" were predominantly under 55. This indicates that limited social media users are predominant among the younger poor. These results mirror results from studies by Blank (2016), Blank and Lutz (2017), and Robinson (2009, 2011, 2014).
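The user typology itself was derived from the Ofcom survey data. As an illustration only, the following minimal sketch shows how a typology of this kind might be produced by clustering respondents on their breadth and frequency of use. The file name, the column names, and the choice of k-means with seven clusters are assumptions made for illustration, not the procedure actually used in Yates et al. (2015b).

```python
# Minimal sketch: clustering survey respondents into broad Internet user types.
# File and column names (uses_internet, freq_*, uses_*, nssec) are hypothetical placeholders.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

survey = pd.read_csv("media_literacy_survey.csv")      # hypothetical survey extract
users = survey[survey["uses_internet"] == 1].copy()    # non-users are held out as their own group
usage_vars = [c for c in users.columns if c.startswith(("freq_", "uses_"))]

X = StandardScaler().fit_transform(users[usage_vars].fillna(0))

# Seven clusters of users, plus the held-out non-users, give eight broad types.
kmeans = KMeans(n_clusters=7, n_init=10, random_state=0)
users["user_type"] = kmeans.fit_predict(X)

# Cross-tabulate the derived types against social class to inspect the gradient.
print(pd.crosstab(users["user_type"], users["nssec"], normalize="columns").round(2))
```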

Cultural Capital

The idea of digital literacies clearly intersects with Bourdieu's idea of cultural capital. Literacy, as the combined set of reading and writing skills and practices, intersects with all three aspects—institutional (education and skills), embodied (practices), and objectified (ownership of artefacts such as books). This is equally true of digital literacies. For example, Straubhaar et al. (2012) note that social class affects citizens' exposure to and


Figure 15.2  Type of Internet user by social class (NRS). (Stacked percentage of user types per NRS social class, AB to DE; user types: non-user, limited, limited info seeking, social media limited/only, social media general, non-media general, non-political extensive, and extensive.)

willingness to invest in skills and knowledge and shapes their disposition toward and familiarity with technology. Clayton and MacDonald (2013) argue: Accumulation of legitimized forms of cultural capital, including knowledge, skills and customs which are invested in, inherited and embodied differentially by social groups, is crucial in determining the ability to appropriate technology for socially valued purposes ( . . . ) Without legitimate knowledge, connections or reasons to meaningfully engage, individuals may struggle to make what is seen to be appropriate use of technology within a society in which they do not dictate what is “useful.” (p. 949)

We will note later that this embedding of expectations and perceptions about digital media use into established understandings—doxa, to use Bourdieu's term—may be a key issue for future work. As noted earlier, a number of studies have utilized ideas of information and digital capital to explore specific cases. Robinson (2009, 2011, 2014) undertook ethnographic studies of approaches to educational and personal uses of ICT by US high school students. Robinson notes that their practices are "deeply rooted in the accumulation and internalization of information capital" (p. 533). Robinson's work highlights how, even within a digital-media-rich nation and among groups with access, key differences in information capital can still accumulate. These inequalities lead to different educational and life opportunities that have the potential to underpin long-term variations in

outcomes. As part of our own work with policymakers, we have explored the key characteristics of Internet non-users and limited users, using data from the 2017 UK Ofcom Media Literacy Survey. A binary logistic regression was run to evaluate the key features of non-users and limited users. Table 15.2 outlines the impact of each of these for non-users and Table 15.3 for limited users. In both cases the six main predictors are level of education, age, presence of children in the home, literacy, income, and social class.

Table 15.2  Key Features of Non-users (variable: effect)

Age left education (16 or under; 17–18; 19–20; 21 or over): Those who left education at or under 16 years are 4.148 times more likely to be non-users than those who left education after 21.
Age (10-year blocks from 26 to 86+): Those over 86 years old are 33.585 times more likely to be non-users than those under 26.
Number of children under 18 in household: Each child in the house makes you 1.243 times less likely to be a non-user.
Not "very" confident with literacy (Yes/No): Those who are not "very" confident about their literacy are 3.487 times more likely to be non-users.
Income: Those earning less than £10,399 pa are 2.328 times more likely to be non-users than those earning over £52,000 per annum.
NRS Social Class (AB; C1; C2; DE): Those in NRS social grades D&E are 3.545 times more likely to be non-users than those in social grades A&B.

Table 15.3  Key Features of Limited Users (variable: effect)

Age left education (16 or under; 17–18; 19–20; 21 or over): Those who left education at or under 16 years are 2.323 times more likely to be limited users than those who left education after 21.
Age (10-year blocks from 26 to 86+): Those over 86 years old are 25.771 times more likely to be limited users than those under 26.
Number of children under 18 in household: Each child in the house makes you 1.088 times less likely to be a limited user.
Not "very" confident with literacy (Yes/No): Those who are not "very" confident about their literacy are 2.744 times more likely to be limited users.
Income: Those earning less than £10,399 pa are 3.152 times more likely to be limited users than those earning over £52,000 per annum.
NRS Social Class (AB; C1; C2; DE): Those in NRS social grades D&E are 1.576 times more likely to be limited users than those in social grades A&B.
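The "x times more likely" figures in Tables 15.2 and 15.3 are odds ratios, obtained by exponentiating the coefficients of the binary logistic regressions described above. The sketch below illustrates that kind of model; the file and variable names are hypothetical placeholders rather than the Ofcom codebook, and this is not the authors' exact specification.

```python
# Minimal sketch: odds ratios from a binary logistic regression of non-use.
# Variable names are hypothetical placeholders for the survey fields.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("media_literacy_survey.csv")    # hypothetical survey extract

model = smf.logit(
    "non_user ~ C(age_left_education) + C(age_band) + n_children"
    " + C(low_literacy_confidence) + C(income_band) + C(nrs_grade)",
    data=survey,
).fit()

# exp(coef) gives the odds ratio for each predictor relative to its reference category,
# which is how the 'x times more likely' figures in Tables 15.2 and 15.3 are read.
odds_ratios = np.exp(model.params)
print(odds_ratios.round(3))
```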

Leaving education at or under 16 years old, and a lack of confidence in literacy skills, are both associated with significantly higher likelihoods of being non-users or limited users of digital media. Importantly, both education and confidence with literacy are also key elements of Bourdieu's conception of institutionalized cultural capital. Such a result indicates to us that inequalities in digital literacy are tightly bound up, like formal literacy, in a nexus of inequalities in both economic and cultural capital. Taking a direct lead from Bourdieu, we have also assessed the links between digital media and technology use and broader patterns of cultural consumption (Yates & Lockley, 2018). Using the same multiple correspondence analysis method as Bourdieu, we have graphed the relationship between Internet use, social media use, other forms of cultural consumption, and economic capital (see Figure 15.3). In Figure 15.3, dimension 1 (x-axis) explains the majority of the data and matches class, deprivation, cultural distinctions, and Internet use and access. Dimension 2 appears to differentiate between popular and what might be described as "high culture." An examination of the graph highlights four clusters. Three mark cultural distinctions covering "high culture," popular arts, and popular culture. The fourth identifies those who are socio-economically, culturally, and digitally excluded. High levels of social media use are situated closest to popular arts. It is interesting to note that levels of social media use follow a similar vector across the graph to that for popular to "high" culture attendance (see Table 15.4). This would indicate an association between high levels of Internet and social media use and higher levels of objectified, and potentially embodied, cultural capital.
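Multiple correspondence analysis of the kind used for Figure 15.3 is, in essence, a correspondence analysis of the indicator (disjunctive) matrix built from categorical survey responses. The sketch below shows one minimal way of computing respondent coordinates and the inertia explained by each dimension; it assumes a data frame of categorical variables, uses hypothetical column names, and omits the corrections (for example, Benzécri adjustments) often applied in published MCA work, so it is an illustration rather than the authors' procedure.

```python
import numpy as np
import pandas as pd

def mca_coordinates(df: pd.DataFrame, n_dims: int = 2):
    """Minimal multiple correspondence analysis via the indicator matrix."""
    # One 0/1 column per response category (the disjunctive coding).
    Z = pd.get_dummies(df.astype("category")).to_numpy(dtype=float)
    P = Z / Z.sum()                       # correspondence matrix
    r, c = P.sum(axis=1), P.sum(axis=0)   # row and column masses
    # Standardised residuals from the independence model, then SVD.
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
    U, s, _ = np.linalg.svd(S, full_matrices=False)
    rows = (U / np.sqrt(r)[:, None]) * s  # principal coordinates of respondents
    inertia = s**2 / (s**2).sum()         # share of inertia per dimension
    return rows[:, :n_dims], inertia[:n_dims]

# Hypothetical usage: categorical columns for class, deprivation decile,
# internet access, social media use, and arts attendance.
# coords, inertia = mca_coordinates(survey[["nssec", "imd_decile", "net_access", "sns_use"]])
```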

Figure 15.3  MCA analysis—overall results. (Dimension 1, 82.6% of inertia: economic and cultural capital, high at left to low at right; dimension 2, 5.1%: popular culture at top, "high" culture at bottom. Marked clusters: popular culture, popular arts, "high" culture, highest levels of social media use, and the socio-economically, culturally, and digitally excluded.)

Table 15.4  MCA Clustering of Arts Attendance

Popular culture: Carnival; Circus; Pantomime; Film; Musical
Popular arts: Street performance; Cultural festivals; Ethnic dance performance; Art exhibition; Art and crafts event; Arts exhibition in public space; Live music
"High arts": Digital arts exhibition; Jazz; Literary event; Classical music; Ballet; Contemporary dance; Opera

Source: Yates and Lockley (2018).

Social Capital

Social capital can be defined in terms of the value derived from a citizen's network of social ties (Son & Lin, 2008). Social capital is much harder to measure and assess than the other forms of capital, as it requires specialized survey and analytical work (Abbasi et al., 2014; Kikuchi & Coleman, 2012). Social network analysis and the examination of digital communities (see chapter 14) are routes to assessing elements of social capital. The concept of social capital has two broad provenances. The first is most clearly presented in the work of Putnam (2000) and can be traced back to the sociologist Durkheim. In this characterization, social capital is understood as both a personal and community commodity. It is predominantly linked to civic and political engagement, the formation of community (Brehm & Rahn, 1997; Son & Lin, 2008; Fischer, 1982a, 1982b; Rainie & Wellman, 2012), and it is often viewed in terms of political community (see chapter 16). The second provenance is Bourdieu's model of distinction and social class (1985). For Bourdieu, social capital is a product of the opportunities that can be gained from the structure of interpersonal networks—that is, the extent to which social capital can be translated into or exchanged for other forms of capital (economic or cultural). These two definitions clearly overlap and may not be mutually exclusive (Burt, 2005; Gil de Zúñiga, Jung, & Valenzuela, 2012). The majority of work focusing on digital media has predominantly taken the "community" view of social capital, occasionally mixing this with ideas from Bourdieu (Chen, 2013; Daniel et al., 2003; Ellison et al., 2007, 2011; North et al., 2008; Patulny et al., 2007; Steinfield et al., 2012). As Neves (2013) points out, much of this research is focused on the "positive" benefits of social capital derived from digital media or Internet use, with a focus on the types of links and ties such technologies support. For example, Phua et al. (2017) comparatively analyze social capital within four social networking sites: Facebook, Twitter, Instagram, and Snapchat. They apply uses and gratifications theory and draw on Putnam's (2000) distinction between bridging (weak ties) and bonding (strong kinship ties). Their results indicate that Twitter users had the highest

bridging social capital, followed by Instagram, Facebook, and Snapchat, whereas Snapchat users had the highest bonding capital, followed by Facebook, Instagram, and Twitter. Valenzuela et al. (2009) focused on US students' civic and political engagement via Facebook. They found a positive and significant association between intensity of Facebook use, group membership, and "community" social capital as measured in terms of personal contentment, greater trust, and participation in civic and political activities. There are considerably fewer studies that formally follow Bourdieu in assessing social capital in terms of distinction and inequalities. One study (Clayton & MacDonald, 2013) drew on Bourdieu's model of social capital to highlight the importance of the accumulation of relevant (digital) social and cultural capital in understanding how citizens make everyday use of technology. Looking at a set of neighborhoods within the UK city of Sunderland, they found evidence of the development of bridging ties and, again, reinforcement of existing bonding ties. They note: The character of the use of social networking in our qualitative sample also demonstrates a different set of practices to those of political participation and community development. Participants do not necessarily use technology to contact new people ( . . . ) or to engage in formal democratic processes, rather its use enables the maintenance of social relationships already established on a new and engaging platform. (p. 954)

None of these studies can fully show whether social media use drives the creation of social capital, or whether it rather provides an additional digital layer to existing social capital. Yet they all highlight the possibility that different social media might function to support and potentially intensify network ties and therefore existing social capital. Within our own work, taking a lead from the research that links social capital in various ways to social media use, we have found that both the range and types of social media citizens use consistently correspond with their socio-economic class position. First, the number of different social media platforms used drops from over 3 to just 1 between the top and the bottom of the UK NS-SEC social class scale (Figure 15.4). Second, the data indicates that types of social media used also vary by class. As may be expected, social media platforms aimed at professionals (LinkedIn, Vimeo) are predominantly used by NS-SEC professional groups. In fact, as found by Blank (2016) and Blank and Lutz (2017), Twitter is more likely to be used by professionals than by other groups. Facebook is used across the class groups, but still with an overrepresentation of higher professional class groups (Figure 15.5).
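Because bridging and bonding are properties of network structure, crude proxies for them can be read off an ego network with standard social network analysis tools. The toy sketch below is not a measure used in any of the studies cited: it treats high local clustering as a rough signal of bonding (closed, kin-like ties) and high betweenness centrality as a rough signal of bridging (brokerage across otherwise unconnected contacts), loosely following Putnam's distinction and Burt's (2005) account of brokerage, and the edge list is invented for illustration.

```python
# Toy sketch: crude bonding/bridging proxies from an ego's contact network.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("ego", "sibling"), ("ego", "parent"), ("sibling", "parent"),   # dense family cluster
    ("ego", "colleague"), ("ego", "old_friend"),                    # ties into other circles
    ("colleague", "manager"), ("old_friend", "activist"),
])

# Local clustering around ego: how interconnected ego's contacts are (bonding proxy).
bonding_proxy = nx.clustering(G, "ego")

# Betweenness centrality: how often ego sits on paths between others (bridging proxy).
bridging_proxy = nx.betweenness_centrality(G)["ego"]

print(f"bonding proxy (clustering): {bonding_proxy:.2f}")
print(f"bridging proxy (betweenness): {bridging_proxy:.2f}")
```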

Conclusion

Drawing on the empirical evidence from our own work and that of others—including those where we disagree on aspects of the theoretical interpretation—we would argue

Figure 15.4  Mean number of social media platforms used by class. (Y-axis: mean number of social media platforms used; x-axis: HRP NS-SEC analytic categories, 5 groups.)

that there is a compelling case for understanding digital inequalities and forms of digital exclusion as deeply embedded within broader social structures of inequality and distinction. The initial work on access—whether ICT, Internet, or digital media—rightly pointed out a concrete and crucial issue. But in the same way that access to literacy in the 1800s required forms of physical access to books and print but rapidly developed into a range of distinctions around levels and types of literacy, some valued more than others (see Hoggart, 1957), a similar argument can be made for digital literacies. In the discussions here we have tried to take the argument of Helsper (2012) further by pointing out that it is through the differential effects of greater and lesser levels of digital literacies and digital resources within specific fields (champ) of life (education, work, professional life) that the material impacts and outcomes of digital inequalities function. A very strong case for this approach in the context of the field (champ) of education can be found in Robinson's ethnographic work on digital media use in US high-school contexts (Robinson, 2009, 2011, 2014). We also believe that taking a broader view of the impact of digital inequalities in terms of social, economic, and cultural capital and its functioning within fields requires us to look again at the "levels" of the digital divide:


Figure 15.5  Social media platforms used by social class (NS-SEC). (Percentage of each NS-SEC group, from higher managerial, administrative and professional occupations through to never worked and long-term unemployed, accessing each platform in the last 12 months: LinkedIn, Vimeo, Flickr, WordPress, Reddit, Twitter, Pinterest, Spotify, Blogger, Tumblr, Google+, Instagram, YouTube, Myspace, and Facebook.)

• A first-level digital divide based on inequalities in access
• A second-level digital divide based on uses
• A third-level digital divide based on outcomes

We can see the considerable policy value of these distinctions. They highlight potential areas of intervention and also mark out a move from material challenges, through engagement challenges, to consequences. But in the examples we discussed we can identify in all cases how access and use impact outcomes—such as limited access and use influencing employment opportunity or social networks. All three levels have always operated; what has changed has been the academic and policy research focus over time. These questions have shifted from empirical questions about who does and does not have access, via questions of inequalities in use, to concerns about impacts. All of the work done to address these questions, taken as a whole, provides a rich picture of inequalities in digital literacies and resources. They do not, though, address our starting question of how digital media are now integrated into the processes that form and maintain contemporary social class groupings and distinctions. We would argue that to do this fully requires the utilization of Bourdieu's work, especially the examination of the role of digital literacies and resources in both constituting and being derived from social, cultural, and economic capital, as well as their functioning within specific fields

(champ). In this sense we are making a similar argument to that of Ignatow and Robinson (2017), but with a much stronger focus on class distinctions. Where we differ from Ignatow and Robinson and other scholars such as Ragnedda (2017, 2018) is that we do not believe that it is necessary to conceptualize "information capital" or "digital capital" in doing this. We would argue that to do so reifies digital media and technology when in fact we need to focus on their role and function within the three forms of capital and the nature of contemporary fields. As a final comment we would like to return to a concept we highlighted in the introduction—that of doxa. Bourdieu argued that doxa—as taken-for-granted assumptions about the social world and social processes—set limits on social mobility by solidifying accepted beliefs and expectations. Certain artefacts, practices, and beliefs are "for me/us" or are "for others" and they "make sense/don't make sense" to "me/us." Doxa are therefore assumptions, taken on by individuals, about which forms of consumption or social practices are appropriate to their social and class position. Bourdieu argued that doxa derive from socialization and therefore the social environment in which citizens develop; and these environments now have a strong digital component. The example Bourdieu often gave of doxa functioning to create inequities was that of the "accepted" explanation of school performance being about innate "cleverness," rather than being about the much higher levels of social, economic, and cultural capital experienced by one set of students over another. In our policy work we often find that reasons for not accessing the Internet or using certain digital services are prefaced with "that is not for me" or "that is not for people like us." As we have noted in our recent work (Yates et al., 2015b; Yates & Lockley, 2018), limited social-media-only users of the Internet represent a significant proportion of younger users from lower socio-economic groups, whereas extensive users, both in terms of breadth and regularity of use, are more common in higher socio-economic groups. Alongside this we have limited education showing a strong correspondence with limited use. Our concern is therefore that, much as with formal literacy (see Hoggart, 1957), doxa around digital media use may be forming in relation to class status. As a result, some forms of use may become defined as "not for us"—for example, complex use requiring laptops, or general data analysis skills, or specific types of social media use—and as a result are creating forms of digital inequality that become culturally rather than materially embedded.

References

Abbasi, A., Wigand, R. T., & Hossain, L. (2014). Measuring social capital through network analysis and its influence on individual performance. Library & Information Science Research, 36(1), 66–73.
Abrams, D., Hogg, M. A., & Marques, J. M. (2005). A social psychological framework for understanding social inclusion and exclusion. In D. Abrams, M. A. Hogg, & J. M. Marques (Eds.), The social psychology of exclusion and inclusion (pp. 1–24). New York, NY: Psychology Press (Taylor & Francis Books).
Anthias, F. (2001). The concept of "social division" and theorizing social stratification: Looking at ethnicity and class. Sociology, 35(4), 835–854.

Blank, G. (2016). The digital divide among Twitter users and its implications for social research. Social Science Computer Review, 35(6), 679–697.
Blank, G., & Lutz, C. (2017). Representativeness of social media in Great Britain: Investigating Facebook, LinkedIn, Twitter, Pinterest, Google+, and Instagram. American Behavioral Scientist, 61(7), 741–756.
Bourdieu, P. (1977). Outline of a theory of practice. Cambridge: Cambridge University Press.
Bourdieu, P. (1984). Distinction: A social critique of the judgement of taste. Boston, MA: Harvard University Press.
Bourdieu, P. (1985). The social space and the genesis of groups. Information (International Social Science Council), 24(2), 195–220.
Bourdieu, P. (1991). Language and symbolic power. Boston, MA: Harvard University Press.
Bourdieu, P. (1997). The forms of capital. In A. H. Halsey, H. Lauder, P. Brown, & A. Stuart Wells (Eds.), Education: Culture, economy, and society (pp. 55–92). Oxford: Oxford University Press.
Bourdieu, P., & Nice, R. (1977). Outline of a theory of practice. Cambridge, UK: Cambridge University Press.
Bourdieu, P., & Passeron, J. C. (1990). Reproduction in education, society and culture (Vol. 4). Thousand Oaks, CA: Sage.
Bourdieu, P., & Wacquant, L. J. (1992). An invitation to reflexive sociology. Chicago: University of Chicago Press.
Bradley, H. (2014). Class descriptors or class relations? Thoughts towards a critique of Savage et al. Sociology, 48(3), 429–436.
Brehm, J., & Rahn, W. (1997). Individual-level evidence for the causes and consequences of social capital. American Journal of Political Science, 41(3), 999–1023.
Burchardt, T., Le Grand, J., & Piachaud, D. (2002). Introduction. In J. Hills, J. Le Grand, & D. Piachaud (Eds.), Understanding social exclusion (pp. 1–12). Oxford, UK: Oxford University Press.
Burt, R. S. (2005). Brokerage and closure: An introduction to social capital. Oxford, UK: Oxford University Press.
Chakravarty, S. R., & D'Ambrosio, C. (2006). The measurement of social exclusion. Review of Income and Wealth, 52(3), 377–398.
Chapman, P., Phimister, E., Shucksmith, M., Upward, R., & Vera-Toscano, E. (1998). Poverty and exclusion in rural Britain: The dynamics of low income and employment. York, England: York Publishing Services.
Chen, W. (2013). The implications of social capital for the digital divides in America. The Information Society, 29(1), 13–25.
Clayton, J., & Macdonald, S. J. (2013). The limits of technology: Social class, occupation and digital inclusion in the city of Sunderland, England. Information, Communication & Society, 16(6), 945–966.
Daniel, B., Schwier, R., & McCalla, G. (2003). Social capital in virtual learning communities and distributed communities of practice. Canadian Journal of Learning and Technology/La revue canadienne de l'apprentissage et de la technologie, 29(3). https://www.learntechlib.org/p/43189/
Ellison, N. B., Steinfield, C., & Lampe, C. (2007). The benefits of Facebook "friends": Social capital and college students' use of online social network sites. Journal of Computer-Mediated Communication, 12(4), 1143–1168.
Ellison, N. B., Steinfield, C., & Lampe, C. (2011). Connection strategies: Social capital implications of Facebook-enabled communication practices. New Media & Society, 13(6), 873–892.

Fischer, C. S. (1982a). To dwell among friends: Personal networks in town and city. Chicago: University of Chicago Press.
Fischer, C. S. (1982b). What do we mean by "friend"? An inductive study. Social Networks, 3(4), 287–306.
Gil de Zúñiga, H., Jung, N., & Valenzuela, S. (2012). Social media use for news and individuals' social capital, civic engagement and political participation. Journal of Computer-Mediated Communication, 17(3), 319–336.
Goraya, H., Light, A., & Yates, S. (2012). Contact networks and the digital inclusion of isolated community members. Digital Divide 2011 Yearbook, 32. https://www.nammi.ru/sites/default/files/Digital_Divide_2011.pdf#page=32
Graham, S. (2002). Bridging urban digital divides? Urban polarisation and information and communications technologies (ICTs). Urban Studies, 39(1), 33–56.
Grant, L. (2007). Learning to be part of the knowledge economy: Digital divides and media literacy. Bristol: Futurelab. http://www.academia.edu/download/11024110/digital_divides_media_literacy.pdf
Heeks, R. (2017). Information and communication technology for development (ICT4D). New York, NY: Routledge.
Helsper, E. J. (2012). A corresponding fields model for the links between social and digital exclusion. Communication Theory, 22(4), 403–426.
Hoggart, R. (1957). The uses of literacy: Aspects of working-class life, with special reference to publications and entertainments. London: Chatto and Windus.
Ignatow, G., & Robinson, L. (2017). Pierre Bourdieu: Theorizing the digital. Information, Communication & Society, 20(7), 950–966.
Kikuchi, M., & Coleman, C. L. (2012). Explicating and measuring social relationships in social capital research. Communication Theory, 22(2), 187–203.
Mawson, J. (2001). The end of social exclusion? On information technology policy as a key to social inclusion in large European cities. Regional Studies Journal, 35(9), 861–877.
Neves, B. B. (2013). Social capital and Internet use: The irrelevant, the bad, and the good. Sociology Compass, 7(8), 599–611.
North, S., Snyder, I., & Bulfin, S. (2008). Digital tastes: Social class and young people's technology use. Information, Communication & Society, 11(7), 895–911.
Patulny, R. V., & Lind Haase Svendsen, G. (2007). Exploring the social capital grid: Bonding, bridging, qualitative, quantitative. International Journal of Sociology and Social Policy, 27(1/2), 32–51.
Phua, J., Jin, S. V., & Kim, J. J. (2017). Uses and gratifications of social networking sites for bridging and bonding social capital: A comparison of Facebook, Twitter, Instagram, and Snapchat. Computers in Human Behavior, 72, 115–122.
Putnam, R. D. (2000). Bowling alone: America's declining social capital. In Culture and politics (pp. 223–234). New York: Palgrave Macmillan US.
Rainie, L., & Wellman, B. (2012). Networked: The new social operating system. Cambridge, MA: The MIT Press.
Ragnedda, M. (2017). The third digital divide: A Weberian approach to digital inequalities. New York: Routledge.
Ragnedda, M. (2018). Conceptualizing digital capital. Telematics and Informatics, 35(8), 2366–2375.
Ragnedda, M., & Ruiu, M. L. (2017). Social capital and the three levels of digital divide. In M. Ragnedda & G. W. Muschert (Eds.), Theorizing digital divides (pp. 27–40). New York: Routledge.

Robinson, L. (2009). A taste for the necessary: A Bourdieuian approach to digital inequality. Information, Communication & Society, 12(4), 488–507.
Robinson, L. (2011). Information-channel preferences and information-opportunity structures. Information, Communication & Society, 14(4), 472–494.
Robinson, L. (2014). Endowed, Entrepreneurial, and Empowered-Strivers: Doing a lot with a lot, doing a lot with a little. Information, Communication & Society, 17(5), 521–536.
Rollock, N. (2014). Race, class and "the harmony of dispositions." Sociology, 48(3), 445–451.
Savage, M. (2015). Social class in the 21st century. London: Penguin UK.
Savage, M., Devine, F., Cunningham, N., Taylor, M., Li, Y., Hjellbrekke, J., Le Roux, B., Friedman, S., & Miles, A. (2013). A new model of social class? Findings from the BBC's Great British Class Survey Experiment. Sociology, 47(2), 219–250.
Selwyn, N. (2003). ICT for all? Access and use of public ICT sites in the UK. Information, Communication & Society, 6(3), 350–375.
Schütz, A. (1967). The phenomenology of the social world. Evanston, IL: Northwestern University Press.
Sloan, L. (2017). Who Tweets in the United Kingdom? Profiling the Twitter population using the British Social Attitudes Survey 2015. Social Media + Society, 3(1), 1–11.
Son, J., & Lin, N. (2008). Social capital and civic action: A network-based approach. Social Science Research, 37(1), 330–349.
Steinfield, C., Ellison, N., Lampe, C., & Vitak, J. (2012). Online social network sites and the concept of social capital. In F. L. F. Lee, L. Leung, J. Qiu, & D. S. C. Chu (Eds.), Frontiers in new media research (pp. 115–131). New York: Routledge.
Straubhaar, J., Spence, J., Tufekci, Z., & Lentz, R. (2012). The persistence of inequity in the technopolis: Austin, Texas. Austin, TX: University of Texas Press.
Valenzuela, S., Park, N., & Kee, K. F. (2009). Is there social capital in a social network site? Facebook use and college students' life satisfaction, trust, and participation. Journal of Computer-Mediated Communication, 14(4), 875–901.
Van Dijk, J. A. (2005). The deepening divide: Inequality in the information society. Thousand Oaks, CA: Sage Publications.
Van Dijk, J., & Hacker, K. (2003). The digital divide as a complex and dynamic phenomenon. The Information Society, 19(4), 315–326.
Walsham, G. (2017). ICT4D research: Reflections on history and future agenda. Information Technology for Development, 23(1), 18–41.
Yates, S., Kirby, J., & Lockley, E. (2014). Supporting digital engagement: Final report to Sheffield City Council. http://shura.shu.ac.uk/9547/1/Supporting-Digital-Engagement-Report1.pdf
Yates, S. J., Kirby, J., & Lockley, E. (2015a). "Digital-by-default": Reinforcing exclusion through technology. In L. Foster, A. Brunton, C. Deming, & T. Haux (Eds.), Defence of welfare 2 (pp. 158–161). Cambridge, UK: Polity Press.
Yates, S., Kirby, J., & Lockley, E. (2015b). Digital media use: Differences and inequalities in relation to class and age. Sociological Research Online, 20(4), 12.
Yates, S., & Lockley, E. (2018). Social media and social class. American Behavioral Scientist, 62(9), 1291–1316.

Section 6

CITIZENSHIP, POLITICS, AND PARTICIPATION

chapter 16

ESRC Review: Citizenship and Politics
Simeon J. Yates, Bridgette Wessels, Paul Hepburn, Alexander Frame, and Vishanth Weerakkody

Introduction

This chapter briefly explores the outcomes of the literature review and expert Delphi review process for the citizenship and politics domain. As with the other review chapters, the goal is not to work through a large number of examples from the literature. Instead, building on the methods described in chapter 2, we will first set out the results of the digital humanities-based analyses of the literature and the content analysis of methods and theory. We will highlight the major topics and concepts within the literature, providing a few general examples. These are not intended to be the "most important" examples from the literature but rather simply indicative of the types of work. This is then followed by the presentation of the content analysis that sought to identify the key theories and methods in use within the literature. Next, we outline the results from the Delphi review of experts. This concludes with the key questions, topics, and challenges we identified, which we compare to the results from the literature work. In the last section, we will present the recommendations for areas of future study. As a reminder, the initial scoping questions for this area of work were:

• How digital technology impacts on our autonomy, agency, and privacy—illustrated by the paradox of emancipation and control; and
• Whether and how our understanding of citizenship is evolving in the digital age—for example, whether technology helps or hinders us in participating at individual and community levels.


Initial Comments

On one level this part of the project could not have taken place at a more interesting and challenging time, with both the Brexit referendum and the election of a social-media-active Donald Trump as US president. Behind both events lie very complex issues of political polarization, which raise deep questions about the role of media, especially digital media, in all levels of political activity. Unfortunately, this means that there has been a small explosion of research on this topic since the current analysis was completed. As we will discuss later, this issue and concern comes through in the Delphi work, and we reflect on next steps in regard to polarized political communication and digital media in the conclusion. Possibly reflecting this context, this domain had the greatest number of Delphi returns and the largest body of identified starting literature, in terms of both the number of responses and their extent and detail. As a result, a considerable amount of work was undertaken in the analysis and in the final consultation workshop on reducing the breadth of material gathered. This chapter therefore has a slightly different structure to the other six ESRC review chapters, as the consultation workshop materials are integrated rather than separately reported. The team reflected on the reasons for this much stronger response. As we noted earlier, the Delphi process took place just after the Brexit vote and during the US presidential election. It is possible that the issues around citizenship, politics, and digital media struck a chord with respondents at this time. We also noted that the project steering group had a number of members whose current or prior work has touched on this area. Thus, though we tried to ensure as balanced a response as we could, this may have biased the snowball sample or potentially motivated respondents in this area. Of course, both factors could have played together. We would also like to highlight something that comes through in the comparison of the concept maps from the 2000–2004 period with those of the 2012–2016 period (see Figures 16.1 and 16.2).1 In examining the visualizations of the data from these two periods we find that concept pairs such as public sphere and a focus on government and candidate Internet use are apparent in the earlier literature. These are replaced with foci around participation and engagement in the later literature. We feel this marks a transition from an initial focus on the potential for the Internet and digital media to facilitate public debate and enhance the public sphere, to one focused on the role of networks (social media) in supporting and enhancing political engagement. As noted later, we may now be in a third stage where the focus is on the role of networks in creating "echo chambers" or "filter bubbles," therefore negating the potentials to enhance the public sphere or disrupt political institutions. Scholars, particularly in political studies and media studies, have noted the rise of the use of social media in civic and political spheres. In broad terms, attention has focused on the ways in which social media facilitates engagement and participation in politics (Dahlgren, 2013) and the characteristics of that communication and relationship between citizens and politics (Papacharissi, 2015). There is a general consensus that a number of factors need to be addressed to realize the potential of digital communication to enhance

Figure 16.1  Citizenship 2000–2004: Most frequent concept pairs. Note: Bubble chart showing frequency of the top 50 concept pairs, based on concept modeling (described in Chapter 2) within the domain for 2000–2004. The diameter of each circle reflects the frequency of the concept pairs.

participation. A continuing issue of inequality remains, with social and political inequality adding to any existing digital divides. Further, questions of how to develop open and deliberative participation using online communication remain difficult to address. There is a better understanding of the threats that digital communication poses in terms of filter bubbles (Sunstein, 2001) and the personalization of news and other information that might limit open debate. However, this is an area that requires more research.

Literature Analysis

The literature analysis was designed to create two analytic outcomes. First, the goal was to identify key topics within the existing literature. This would allow the comparison


Figure 16.2  Citizenship 2012–2016: Most frequent concept pairs. Note: Bubble chart showing frequency of the top 50 concept pairs, based on concept modeling (described in Chapter 2) within the domain for 2012–2016. The diameter of each circle reflects the frequency of the concept pairs.

with areas of future importance identified by the Delphi review. Second, we conducted a content analysis of the literature to explore the predominance of specific, theories, methods and approaches. As noted in the chapter 2, the literature data were subjected to two analyses. The first round of collected literature was analyzed to create concept pairs and trios, and then the combined first and second rounds of literature were analyzed to identify key topic clusters. The results of these two approaches were then compared. Table 16.1 lists the 10 most common concepts identified from the first round of literature. These represent the concepts covering 2% or more of the identified cases. Table 16.2 lists the concept pairs within these groups. In Table 16.2 the first part of the concept pair is marked in bold with various second elements presented in the list below this first part. Unsurprisingly the concepts of citizenship, action and networks were ranked top. This reinforces the point that much of the underlying conceptual base of the literature on digital media and politics is focused on the three-way interface between: citizens; political action and engagement; and the role or impacts of networks (digital or otherwise).

Table 16.1  Analysis Concepts Ranked (percent)

Citizen: 7.56
Action: 7.32
Network: 6.21
Campaign: 5.35
Citizenship: 4.35
Channel: 4.08
Access: 3.46
Engagement: 3.35
Government: 2.92
Participation: 2.81
Information: 2.59
Link: 2.43
Delivery: 2.40

Table 16.2  Concept Pairings—Main and Secondary Concepts (percent)

Citizen 13.79: democracy 2.96; engagement 2.91; government 4.43; participatory 1.58; perception 1.92
Action 13.35: activism 1.87; campaign 1.82; frame 3.20; membership 1.18; protest 4.29; talk .99
Network 11.33: power 6.65; recognition 2.36; strength 1.18; transformation 1.13
Campaign 9.75: candidate 2.02; election 2.91; movement 1.03; party 2.86; practice .94
Citizenship 7.93: engagement 2.02; people 2.17; phenomenon .89; study 1.43; youth 1.43
Channel 7.44: citizen 2.17; consumer .99; phone 1.43; service 2.86
Access 6.31: citizenship .44; latino 1.67; percentage 1.33; survey 1.77; white 1.08
Engagement 6.11: norm 1.13; participation 2.46; use 2.51
Government 5.32: latino 1.13; responsiveness 2.27; stage 1.92
Participation 5.12: participatory 2.86; protest 2.27
Information 4.73: literacy 1.43; overload .49; protest 2.81
Link 4.43: pattern 1.13; site 2.41; twitter .89
Delivery 4.38: perception 1.48; phone 1.38; value 1.53

Note: the first term in each row is the main concept (with its overall percentage); the terms after the colon are the related secondary concepts, each with its percentage.

Our second approach to the analysis of the literature explored the extraction of topics using a different methodology, based on a factor analysis of salience and relevance measures. We utilized both custom-developed tools and the WordStat software. Unlike the concept mapping, which pulled out some of the underlying ontological links, the identification of topics produced groups that more overtly fitted theory and methods in the literature. (This was the case for all of the literature analyses.) Table 16.3 presents the 15 topics identified by WordStat, and Table 16.4 maps these to the concepts analysis.
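WordStat's topic extraction is proprietary, so the sketch below is only a rough open-source analogue of the same idea: build a weighted term-document matrix, factor it into a small number of components, and read off each component's highest-loading keywords (compare the Keywords column of Table 16.3, which shows stemmed terms; no stemming is applied here). The toy corpus, the number of topics, and the use of TF-IDF with non-negative matrix factorization are assumptions for illustration, not a reconstruction of the procedure used in the review.

```python
# A minimal analogue of keyword-based topic extraction (not WordStat itself).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

# Hypothetical stand-ins for article abstracts in the citizenship domain.
docs = [
    "public sphere deliberation and democratic debate online",
    "regression models predicting political participation from survey variables",
    "protest movements and collective action organized on social media",
    "election campaigns, candidates, and voter mobilization on Twitter",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)            # term-document weights
model = NMF(n_components=2, random_state=0)   # small number of topics for the toy corpus
doc_topic = model.fit_transform(X)            # document-topic loadings
terms = vectorizer.get_feature_names_out()

for k, component in enumerate(model.components_):
    top = component.argsort()[::-1][:5]       # highest-loading terms per topic
    print(f"Topic {k}:", "; ".join(terms[i].upper() for i in top))
```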

Table 16.3  WordStat Analysis of Topics

Public sphere (eigenvalue 10.5; freq 29,329; cases 486; 98.0% of cases): SPHERE; DELIB; HABERMA; DEMOCRACI; DELIBER; DEMOCRAT; PUBLIC; DEBAT; DISCOURS; FORUM; POLIT
Measurement (eigenvalue 3.19; freq 18,205; cases 474; 95.6% of cases): VARIABL; REGRESS; STATIST; TEST; TABL; MODEL; MEASUR; PREDICT; ESTIM; SIGNI; SAMPL; CORREL
Social network analysis (eigenvalue 2.77; freq 2,144; cases 313; 63.1% of cases): INFECT; NODE; CONTAGION; NEIGHBOR; THRESHOLD; TI
Protest and activism (eigenvalue 2.69; freq 12,940; cases 473; 95.4% of cases): MOVEMENT; PROTEST; ACTION; COLLECT; ORGAN; ACTIVIST; OCCUPI
Governance (eigenvalue 2.52; freq 20,565; cases 490; 98.8% of cases): GOVERN; SERVIC; POLICI; PUBLIC; SECTOR; ADMINISTR; MANAG
Elections (eigenvalue 2.37; freq 11,159; cases 407; 82.1% of cases): ELECT; PARTI; VOTER; CAMPAIGN; CANDID; ELECTOR; VOTE
Cyber hate crime (eigenvalue 2.14; freq 2,632; cases 317; 63.9% of cases): CRIME; VICTIM; HATE; GUARDIANSHIP; CYBER; POLIC; SECUR
Partisan politics (eigenvalue 2.01; freq 5,060; cases 429; 86.5% of cases): EXPOSUR; PARTISAN; POLAR; ATTITUD; ATTITUDIN; PERCEIV; OPINION
Web and social media (eigenvalue 1.92; freq 14,607; cases 470; 94.8% of cases): SITE; WEB; PAGE; USER; BLOG; SEARCH; LINK; GOOGL; FACEBOOK
Gender and ethnicity (eigenvalue 1.83; freq 4,741; cases 400; 80.7% of cases): GENDER; WOMEN; EDUC; FEMAL; ETHNIC
Civic engagement (eigenvalue 1.81; freq 8,650; cases 455; 91.7% of cases): CIVIC; ENGAG; CITIZENSHIP; YOUTH; LEARN
Mobile (eigenvalue 1.72; freq 3,746; cases 395; 79.6% of cases): PHONE; MOBIL; SM; CHANNEL
Political online fora (eigenvalue 1.65; freq 2,255; cases 325; 65.5% of cases): FORUM; THREAD; TALK
Homophily (eigenvalue 1.63; freq 2,044; cases 315; 63.5% of cases): HOMOPHILI; NOIS; AGENT; NEIGHBOR; INFLUENC
Twitter (eigenvalue 1.57; freq 2,267; cases 181; 36.5% of cases): TWITTER; TWEET; HASHTAG

Table 16.4  Comparison between Concepts and WordStat Topics

Concepts (from Table 16.1): Citizen; Action; Network; Campaign; Citizenship; Channel; Access; Engagement; Government; Participation; Information; Link; Delivery
WordStat topics (from Table 16.3): Twitter; Social network analysis; Homophily; Cyber hate crime; Political online fora; Mobile; Gender and ethnicity; Elections; Partisan politics; Civic engagement; Web and social media; Protest and activism; Measurement; Public sphere; Governance
(The cell-by-cell mapping between concepts and topics could not be recovered from the source layout.)


Topics
The eight key areas emerging from the analysis (see Table 16.3) are Public sphere, Measurement, Social network analysis, Protest and activism, Governance, Elections, Cyber hate crime, and Partisan politics. The category of measurement reflects the greater proportion of work in this domain that employs statistical analysis (see the following section on methods). Although the issue and topic of governance is important in this domain, there is a separate full section of the book (see Section 7) dedicated to this topic. In order of importance, the first topic reflects the broader question of civil society and the public sphere, whereas the others focus on specific actions or contexts such as election campaigns—though of course ideally these issues should strongly intersect. From the analysis in Table 16.3 it is clear that the idea of the public sphere is a key topic in the academic debate around the impact of digital media on politics. This is clearly articulated in works by highly influential authors (e.g., Castells, 2008) but also in many individual studies. As we noted previously, one of the general findings from the content analysis was the utilization of very wide-ranging ideas or publications as “scene setters”—such as the idea from Castells (1996) of the “network society”—but without the detail of this work being substantively engaged with. This appears to be the case with the idea of the public sphere (Habermas, 1991), where the Habermasian concept is repeatedly pointed to without the full theoretical model being employed. As we noted earlier, the longitudinal view over the last two decades points to a shift from the focus on the public sphere to one on more individualistic issues of participation and engagement. There also appears to be a shift away from the earlier literature that tended to focus on the potentials for digital media to support the public sphere and deliberative democracy. Papers exploring the idea of the public sphere were not uncritical but utilized the concept to examine the potentials for deliberative democracy through digital media either theoretically (e.g., Dahlberg, 2001; Dahlgren, 2005) or through the analysis of interactions (e.g., Papacharissi, 2004). The focus has since shifted towards the analysis of actual network interactions (often via social media) and the extent to which political engagement, influence, and action are developed. In the time between the literature analysis undertaken by the ESRC project and the current publication, the focus has shifted somewhat to the failings of the public sphere and the rise of “echo chambers” and “mini publics” (see Frame & Brachotte, 2015). This concern clearly comes through in the Delphi analysis of experts reported later in the chapter. As with several of the other domains, Twitter and social network analysis are two prominent and linked topics. The intersection of Twitter, politics, and citizenship is fraught with challenges, especially as both the technology itself and its uses have continued to change over the last decade (Bimber et al., 2015). As a result, it may not always be helpful to draw comparisons with traditional political behavior. Within the actual study of the use of Twitter in politics and political action there is a focus on collective action. For example, Kende, van Zomeren, Ujhelyi, and Lantos (2016) proposed and tested how the social affirmation use of social media motivates individuals for collective action to achieve social change.

In this frame of work, a good number of analyses focus on the nature of social capital, or psychological group membership measures, as routes to understanding political social action undertaken through or supported by Twitter. Much of this focuses on young citizens, but often has a strong element of networked individualism (e.g., Rainie & Wellman, 2012), with the examination of the potential for platforms such as Facebook, Twitter, and YouTube to influence political stances and civic engagement (see Loader et al., 2014). A reasonably comprehensive overview can be found in Weller et al. (2014). There are many papers that take on board the theory or orientation of social network analysis (SNA) in their approach, such as the ideas of weak and strong ties or network power, or reference SNA-based studies, though they do not necessarily use specific SNA methods. Examples include political participation (e.g., Bennett, 2012; Chan, 2016), campaigning (Bruns & Highfield, 2013), and influence (e.g., Gruzd et al., 2014). The specific use of SNA methods was in fact very limited (see Table 16.6) and was to be found in papers with a strong methodological focus. These may not take on politics and citizenship directly but elucidate how influence may spread in both digital and non-digital communications networks and how these interact (e.g., Haythornthwaite, 2002). More pragmatically, the literature focuses on actual practices and online behaviors. Activism and protest appear in more recent literature, with authors focusing on the role of social and networked media in the engagement and organization of politics. For example, Agarwal et al. (2014) compare the use of digital media by two very different political groups: the Republican Tea Party movement and the Occupy Wall Street movement. Other studies attempt to analyze the links between the types of social media used, contexts of interaction with similar or other groups, and likely political participation (e.g., Kim & Chen, 2016). Some studies try to assess the extent to which online activity leads to other forms of political action (digital or not), from voting to collective action (e.g., Schumann & Klein, 2015; Theocharis & Lowe, 2016). In regard to elections, this work goes back to the mid-1990s (e.g., Yates & Perrone, 1998) and early 2000s (e.g., Coleman, 2001a, 2001b), with a strong focus on the United Kingdom and the United States (e.g., Foot & Schneider, 2002). The breadth and variety of this work has grown extensively over the past decade to include a wider variety of nations and forms of electoral process (e.g., Gadekar et al., 2011; Vromen, 2015). The issue of “second screen” communication in the electoral context has also been receiving increasing attention, for example during important televised campaign debates in various countries. Furthermore, questions of online hate and partisan interaction are currently a key issue. Studies range from analyses of homophily in political group membership (e.g., Colleoni et al., 2014) to arguments that social and digital media use have gone hand in hand with more personalized politics (e.g., Bennett, 2012). This then bleeds over into issues of digital governance and online crime, be it terrorism or hate crime.
We would argue therefore that there appears to be a general, though not universal, shift in the literature over the last two decades, from ideas of the potential role of digital media in the broader public sphere to much more specific, analytics-based assessments of network dynamics in regard to political action, engagement, and participation. This situation is seen in a range of areas, including political communication and news, connective action, and media hybridity.

In the area of political communication, research has identified that people access their news using both social media and mainstream (whether public or commercial) media (Rainie et al., 2011; Oxford Reuters Institute for the Study of Journalism, 2016). The wider media environment for political communication has resulted in the development of hybrid media systems (Chadwick & Dennis, 2017). What Chadwick and Dennis argue is that both traditional media communication and social media communication are configured in different ways to reach different social groups. There is an organizational dimension to this that draws on the networked logic of social media; here, both connective action and collective action are mobilized (Bennett & Segerberg, 2013).

Theory, Method, and Approach
As chapter 2 describes, the content analysis builds on Borah’s (2017) approach to analyzing a set of communications and media literature in regard to digital media use. Table 16.5 provides the results with regard to the empirical approach taken in the literature. The majority of the papers (45%) undertook primary data collection, with 33% being theoretical syntheses of current or prior work. The main disciplines from which theory was used or for which theory was developed were politics and public administration (48.6%), sociology (28.0%), and communication and media (14.3%). It is important to note that only actual uses of theory for the purposes of design, synthesis, or analysis were coded. General references to prior work and theory, such as broad reference to the “network society” (Castells) or the “public sphere” (Habermas), were not coded. This distinction is important, as it highlights the use of theory to design and analyze data or synthesize materials, as distinct from more general discussion. There was considerable variety in the specific theories applied from these disciplines, with no clear preference. Ideas of the public sphere (6%) and political participation (5%) were the most common in the political science literature. The main research methods were literature reviews (33%), surveys (29%), content analysis (8%), and interviews (7%; see Table 16.6). The majority of the empirical work focused on specific groups (e.g., Facebook users), with a limited number of general population studies (see Table 16.7). Where new data were analyzed, the majority (53%) of the analyses were qualitative, though the methods varied, with most of the remainder being statistical (see Table 16.8). Only one study overtly stated that it was using a “big data” approach. Compared to the other domains, there is a stronger emphasis on both empirical data collection and quantitative analysis in the literature analyzed here. As with all the domains, both big data and SNA are so far only used to a limited extent.

Delphi Review
The literature review analysis explored the themes found within recent research publications. The following sections detail the results of the Delphi process for the Citizenship and Politics domain.

Table 16.5  Empirical Approach (percent)

Primary empirical (data collected and analyzed): 45.1
Theoretical (synthesis of current or prior work): 33.3
Discursive/descriptive (no new data or theory): 13.7
Secondary empirical (analysis of existing data): 7.8

Table 16.6  Research Method (percent)

Literature review (general or narrative): 32.7
Survey: 28.6
Content analysis: 7.8
Interview(s): 6.9
Theory building: 6.9
Other: 4.2
Experiment: 3.2
Ethnography: 3.2
Focus groups: 2.8
Social network analysis: 1.8
Textual (linguistic-discourse analysis): .9
Meta-analysis or systematic review: .9

Table 16.7  Study Population (percent)

Specific group: 48.8
General population: 33.7
Case study(ies): 17.4

Table 16.8  Analytic Approach (percent)

Qualitative (textual-non-discourse): 53.5
Statistical (numerical): 32.2
None: 8.3
Not applicable: 5.2
Discourse (textual-linguistic-discourse): .9

There were three parts to the Delphi review: an initial survey, a confirmatory questionnaire to address the findings from the survey, and a confirmatory workshop. The goal of the Delphi process was to identify and prioritize areas for future research. These might include areas already covered by literature but also new concerns, or the need for a tighter focus on a specific issue. The process sought to identify suggested future scoping or research questions, key topics to address within these questions, and key challenges that might be encountered when researching these questions.

Future Research and Scoping Questions
Given the amount of input to the Delphi process for this domain, the suggestions for scoping and research questions were coded into eight categories and 36 specific questions, which were then grouped into 21 questions, as detailed in Table 16.9. The process used two measures to assess the importance of these questions. The first was how frequently the suggestion came up in the Delphi survey, and the second was how highly these topics were ranked in importance by the Delphi interviewees.

Table 16.9  Delphi Review Scoping Questions (each question category is followed by the questions within it)

Digital technologies, radicalization, mobilization and political action

In what ways do digital technologies impact traditional forms of mobilization, collective action, and/or political participation? How have “negative” online behaviors (such as trolling and flaming) impacted on civic and political activity?

Digital technologies, emancipation, agency and control

How and in what ways are digital technologies challenging or reinforcing existing power relations? What are the impacts on our autonomy, agency, dignity and privacy?

Digital technologies and the disruption of current political institutions

How do new technologies disrupt and challenge incumbent political institutions? What are the opportunities and challenges facing democracy in an age of digital participation? How do social media affect the quality of democracy/citizenship? And what about non-democratic states?

Digital technologies, political identity, emotion and empowerment

Does access to digital technologies have a positive emotional impact on citizens, making them feel empowered, with a voice and potential influence?

Digital technologies and new forms of citizenship

How does technology enlarge or change our understanding of, and interaction with, citizens outside of our own national borders? What constitutes citizenship? Is it meaningful to talk about digital citizenship? Does digital expand the notion or simply provide a new space for the exercising of citizenship rights and duties? How are youth engaging with digital technologies and online politics?


Digital technologies and governance

How does technology improve governance (i.e., government’s responsiveness to citizen concerns and ability to effectively manage competing interests)? Does electronic governance transform relationships between states and citizens and the nature of politics?

Digital technologies, groups and elites

How do political elites use digital media? How do old and new parties use new technologies and with what consequences? Does new media promote populism? How do emerging media platforms impact the ongoing digital divide?

Digital technologies, political communication, debate and media

How do new ecosystems of information and delivery impact on political participation, opinion forming, and education? How do people perceive “success” in online political participation? How does digital media interact with traditional media in shaping public opinion?

Table 16.10  Delphi Review Scoping Questions Ranked by Number of Cases and by Importance
(format: question category: rank by number of cases; rank by importance, with percent in parentheses)

Digital technologies, radicalization, mobilization, and political action: 3; 1 (21)
Digital technologies and the disruption of current political institutions: 1; 2 (17)
Digital technologies and new forms of citizenship: 6; 3 (16)
Digital technologies, political communication, debate, and media: 2; 3 (16)
Digital technologies and governance: 8; 4 (12)
Digital technologies, emancipation, agency, and control: 4; 5 (10)
Digital technologies, political identity, emotion, and empowerment: 5; 6 (6)
Digital technologies, groups and elites: 7; 7 (1)

Table 16.10 shows the ranking of these categories by the number of questions allocated to the category and by their ranked importance from the confirmatory survey. It is important to note that the ranked importance is almost the same in both tables. As chapter 25 describes, there are a number of areas identified in the scoping questions and challenges that are cross-cutting, a key one of these being governance.

As a result, there are also some strong overlaps with the Governance and Security domain (see chapters 22 and 23) that will be addressed there.

Key Topics
If we turn next to specific topics that might cut across these questions, we find that the topics most commonly cited in the Delphi process were also those deemed most important in the confirmatory survey (see Table 16.11 and Table 16.12). These topics also closely match the proposed research and scoping questions. Given the number and detail of the scoping questions provided in the initial rounds of the Delphi process, this overlap was highly likely. One reason for this was that respondents interpreted differently the idea of there being two levels, one of overarching questions and then the topics within them, so that questions for some respondents were topics for others. This distinction appeared to be clearer for other domains, where the volume of responses was lower. Overall, though, this does provide reinforcing evidence, along with the broad support of the consultation workshop, for the relevance of the questions and topics. We do note, however, that a key comment made in the confirmatory workshop was that the literature and Delphi work had not really addressed the issue of digital media use in the context of major state control and censorship. We agree that this is a topic that appears not to have had as much attention in the literature we surveyed nor in the Delphi review. If we leave aside the governance issues for chapters 22 and 23, then it is clear that the Delphi panel, confirmatory survey, and workshop see the future focus for research to be on citizenship, participation, and engagement, with specific concerns about mobilization and radicalization.

Table 16.11  Key Topics Ranked by Percent of Delphi Survey Responses

Divides 8; Mobilization 8; Talk 7; Control 6; Data 6; Media 6; Other 6; Participation 6; Citizenship 5; Engagement 4; Governance 4; Privacy 4; Identity 3; Methods 3; Technologies 3; Civic 2; Commercial 2; Cultural 2; Direct democracy 2; Empowerment 2; Geopolitics 2; Policy 2; Trust 2; Young people 2; Contestation 1; Parties 1; Populism 1; State 1

Table 16.12  Key Topics Ranked by Importance from Delphi Survey
(values are percentages: very important / important / neutral / unimportant / very unimportant)

Governance in a digital age: 51.9 / 37.0 / 11.1 / 0.0 / 0.0
Political mobilization via digital media: 48.1 / 40.7 / 7.4 / 3.7 / 0.0
Digital and state control: 48.1 / 37.0 / 11.1 / 3.7 / 0.0
Citizenship in a digital age: 48.1 / 33.3 / 14.8 / 3.7 / 0.0
Data—big, small, and citizen: 44.4 / 37.0 / 14.8 / 3.7 / 0.0
Political participation and engagement: 44.4 / 37.0 / 14.8 / 3.7 / 0.0
Privacy in a digital age: 40.7 / 40.7 / 11.1 / 3.7 / 0.0
Political media, old and new: 29.6 / 44.4 / 18.5 / 7.4 / 0.0
Digital divides: 22.2 / 59.3 / 11.1 / 7.4 / 0.0
Political identity in a digital age: 22.2 / 48.1 / 29.6 / 0.0 / 0.0
Online debate and interaction: 18.5 / 70.4 / 11.1 / 0.0 / 0.0

The questions about empowerment, the public sphere, links to older media, and emancipation that are present in some of the earlier literature in this domain have moved down the list of priorities. Importantly, there is a growing concern for how digital technologies are disrupting politics and political institutions.

Research Challenges
Our final set of questions asked the Delphi panel about the challenges that may be faced in undertaking research in these areas. These were placed into 14 categories and ranked by the number of coded items (Table 16.13). None of the main challenges was deemed to be specific to any of the seven domains by the consultation workshop. Table 16.14 shows the ranking of these by the confirmatory survey. Such cross-cutting topics and challenges are discussed in the concluding chapter 25.

Conclusion
The concepts and topic mapping analyses generated very similar results, and these closely map onto the Delphi results. The close mapping of the Delphi and literature analyses potentially indicates that this is a well-developed domain of research with clear foci.

Table 16.13  Challenges Ranked by Percent of Cases

Methods 42; Theory 14; Big data 12; Epistemology/ontology 7; Ethics 6; Psychology 5; Technology 4; Exclusion 2; Education 1; Funding 1; Impact 1; Individualism 1; Policy 1; Training 1

Table 16.14  Challenges Ranked by Importance from Delphi Survey
(values are percentages: very important / important / neutral / unimportant / very unimportant)

Developing new theory: 55.6 / 37.0 / 7.4 / 0.0 / 0.0
Developing new methods: 44.4 / 33.3 / 18.5 / 3.7 / 0.0
Dealing with “big data”: 44.4 / 33.3 / 18.5 / 3.7 / 0.0
Ethics: 37.0 / 51.9 / 7.4 / 0.0 / 3.7
Epistemological and ontological issues: 37.0 / 25.9 / 25.9 / 7.4 / 3.7

The consensus around the consolidation of research questions in the consultation workshop reinforces this. There may be a number of clear reasons for this emphasis. Political communication and behavior are substantive aspects of both communication studies and political science. These are both areas that have been dramatically affected in very public ways by digital media, in contrast to the very real but less visible impacts of digital technologies on governance or public policy. There are also indications that the visibility of digital media—from the web to social media—has made (at least some) processes of political communication very visible and open to analysis. Given that the literature and the Delphi recommendations strongly overlap, the research has not identified any clear new topic gaps to highlight for future work. Rather, the Delphi work appears to confirm the patterns found in the literature, with a move away from the assessment of the potentials of digital media for deliberative politics, the development of the public sphere, and broad civic engagement, towards a clearer focus on the role of networks in political mobilization, influence, partisan politics, and more individualistic measures of engagement and political action.

This may be a reaction to political changes experienced over the last five years that have often been associated with the use of digital media—such as the Arab Spring, Brexit, and the election of Donald Trump. It might also reflect the nature of available data (e.g., Twitter and Facebook posts) or the nature of these media, which are more notably individualistic in form than either prior mass media or even older forms of digital media (e.g., Rheingold, 1993). The disruption caused by social media itself, or its use and misuse in politics (cf. the Cambridge Analytica scandal), is a further clear set of contemporary questions. Yet underneath this is both an empirical and theoretical need to fully understand what our current social media-based politics is telling us about citizens’ behavior and political processes, and vice versa. We would argue that for the health of democratic institutions, there is a need to empirically understand contemporary political behavior and participation in the context of digital technology use. As a word of caution, in the other domains we noticed a “platform focus” in many studies. In the case of politics, an example might be a focus on political uses of Twitter, in contrast to broader studies of the full range of digital media that citizens may utilize for political communication. Though there are examples of platform focus, it does not appear as pronounced in relation to political research as in other domains. As with the other domains, we believe that the complexity and variety of potential work warrants consideration of all the questions, topics, and challenges identified. Noting this, we would argue that the analysis here has identified four key areas for future research:

1. “Digital technologies,” radicalization, mobilization, and political action
2. “Digital technologies” and the disruption of current political institutions
3. “Digital technologies” and new forms of citizenship
4. “Digital technologies,” political communication, debate, and media

We note that the Governance and Security domain significantly addresses the issue of “Digital technologies and governance,” which was the top ranked topic in the confirmatory survey. The other key topics identified fit within these four scoping areas, except for Digital and state control. This topic, identified as well in comments at the consultation workshop, reminds us that not all politics are democratic and there is no necessary causal link between digital media use and open societies.

Note
1. As part of the review, The Digital Humanities Institute at the University of Sheffield applied concept modelling techniques to a curated corpus of 1,900 journal articles from the period 1968 to 2017. Concept modelling is a computational linguistic process that involves identifying the emergence of concepts, or key ideas, via lexical relationships. For the purposes of the review, lexical relationships were limited to high-frequency co-occurrences of terms as pairs and trios. The process is entirely data driven and resulted in 2 million rows of data.

The website https://www.dhi.ac.uk/waysofbeingdigital/ provides access to the top 50 most frequently occurring pairs and trios through a series of data visualizations. Click on View Data Visualisations at the top. Then check/submit which of the seven ESRC domains you are interested in (including all). Then choose the visualization. These show configurations across selected time frames. Choose bubble chart, tree map, zoomable pack layout, or network diagram, by individual subject or by all seven subjects combined, by document or concept frequency. You can similarly search the analyzed documents (all, by subject, author, concept, concept trio, and year) by clicking on Browse Articles at the top. Also, see https://waysofbeingdigital.com/literatureanalysis-interactive-results/ for interactive visualizations with mouse-overs of the main clusters of concepts within each domain and the relative frequency of concepts associated with each cluster.

References
Agarwal, S. D., Barthel, M. L., Rost, C., Borning, A., Bennett, W. L., & Johnson, C. N. (2014). Grassroots organizing in the digital age: Considering values and technology in Tea Party and Occupy Wall Street. Information, Communication & Society, 17(3), 326–341.
Bennett, L. (2012). The personalization of politics: Political identity, social media, and changing patterns of participation. Annals of the American Academy of Political and Social Science, 644(1), 20–39.
Bennett, W. L. & Segerberg, A. (2013). The logic of connective action: Digital media and the personalization of contentious politics. Cambridge, UK: Cambridge University Press.
Bimber, B., Cunill, M. C., Copeland, L., & Gibson, R. (2015). Digital media and political participation: The moderating role of political interest across acts and over time. Social Science Computer Review, 33(1), 21–42.
Borah, P. (2017). Emerging communication technology research: Theoretical and methodological variables in the last 16 years and future directions. New Media & Society, 19(4), 616–636.
Bruns, A. & Highfield, T. (2013). Political networks on Twitter. Information, Communication & Society, 16(5), 667–669.
Castells, M. (1996). The rise of the network society: The information age: Economy, society, and culture (Vol. I) (Information Age Series). London: Blackwell.
Castells, M. (2008). The new public sphere: Global civil society, communication networks, and global governance. Annals of the American Academy of Political and Social Science, 616(1), 78–93.
Chadwick, A. & Dennis, J. (2017). Social media, professional media and mobilisation in contemporary Britain: Explaining the strengths and weaknesses of the citizen’s movement 38 Degrees. Political Studies, 65(1), 931–950.
Chan, M. (2016). Social network sites and political engagement: Exploring the impact of Facebook connections and uses on political protest and participation. Mass Communication and Society, 19(4), 430–451.
Coleman, S. (2001a). Online campaigning. Parliamentary Affairs, 54, 679–688.
Coleman, S. (2001b). Click for democracy. The World Today, 57(5), 17–18.
Colleoni, E., Rozza, A., & Arvidsson, A. (2014). Echo chamber or public sphere? Predicting political orientation and measuring political homophily in Twitter using big data. Journal of Communication, 64, 317–332.
Dahlberg, L. (2001). The Habermasian public sphere encounters cyber-reality. The Public, 8(3), 83–96.
Dahlgren, P. (2005). The internet, public spheres, and political communication: Dispersion and deliberation. Political Communication, 22(2), 147–162.
Dahlgren, P. (2013). The political web: Media, participation and alternative democracy. Basingstoke: Palgrave MacMillan.
Foot, K. A. & Schneider, S. M. (2002). Online action in campaign 2000: An exploratory analysis of the US political Web sphere. Journal of Broadcasting & Electronic Media, 46(2), 222–244.
Frame, A. & Brachotte, G. (Eds.). (2015). Citizen participation and political communication in a digital world. Abingdon, UK: Routledge.
Gadekar, R. et al. (2011). Web sites for e-electioneering in Maharashtra and Gujarat, India. Internet Research, 21(4), 435–457.
Gruzd, A. et al. (2014). Geography of Twitter networks. Social Networks, 34, 37–81.
Habermas, J. (1991). The structural transformation of the public sphere: An inquiry into a category of bourgeois society. Cambridge, MA: The MIT Press.
Haythornthwaite, C. (2002). Strong, weak, and latent ties and the impact of new media. Information Society, 18(5), 385–401.
Kende, A., van Zomeren, M., Ujhelyi, A., & Lantos, N. A. (2016). The social affirmation use of social media as a motivator of collective action. Journal of Applied Social Psychology, 46(8), 453–469.
Kim, Y. & Chen, H.-T. (2016). Social media and online political participation: The mediating role of exposure to cross-cutting and like-minded perspectives. Telematics and Informatics, 33, 320–330.
Loader, B. D., Vromen, A., & Xenos, M. A. (2014). The networked young citizen: Social media, political participation and civic engagement. Information, Communication & Society, 17(2), 143–150.
Oxford Reuters Institute for the Study of Journalism. (2016). Reuters Institute Digital News Report. http://reutersinstitute.politics.ox.ac.uk/sites/default/files/research/files/Digital%20News%20Report%202016.pdf Retrieved July 3, 2018.
Papacharissi, Z. (2004). Democracy online: Civility, politeness, and the democratic potential of online political discussion groups. New Media & Society, 6(2), 259–283.
Papacharissi, Z. (2015). Affective publics: Sentiments, technology and politics. New York: Oxford University Press.
Rainie, L., Purcell, K., & Smith, A. (2011). The social side of the internet. Pew Research Center. http://www.pewinternet.org/2011/01/18/the-social-side-of-the-internet/ Retrieved July 3, 2018.
Rainie, L. & Wellman, B. (2012). The individual in a networked world: Two scenarios. The Futurist, July–August, 24–28.
Rheingold, H. (1993). The virtual community: Finding connection in a computerized world. Boston, MA: Addison-Wesley Longman.
Schumann, S. & Klein, O. (2015). Substitute or stepping stone? Assessing the impact of low-threshold online collective actions on offline participation. European Journal of Social Psychology, 45, 308–322.
Sunstein, C. (2001). Echo chambers: Bush v. Gore, impeachment and beyond. Princeton, NJ: Princeton University Press.
Theocharis, Y. & Lowe, W. (2016). Does Facebook increase political participation? Evidence from a field experiment. Information, Communication & Society, 19(10), 1465–1486.
Vromen, A. (2015). Campaign entrepreneurs in online collective action: GetUp! in Australia. Social Movement Studies, 14(2), 195–213.
Weller, K., Bruns, A., Burgess, J. E., Mahrt, M., & Puschmann, C. (2014). Twitter and society: An introduction. In K. Weller et al. (Eds.), Twitter and society (pp. xxix–xxxvii). New York: Peter Lang.
Yates, S. J. & Perrone, J. L. (1998). Politics on the Web. Assignation, 15(4), 49–53.

Chapter 17

Digital Ecology of Free Speech: Authenticity, Identity, and Self-Censorship

Yenn Lee and Alison Scott-Baumann

Introduction
The political climate around the world in recent years, especially with regard to national-level elections and referenda across Europe and in the United States, has been volatile and conflictual. Observers have expressed concerns about the rise of populism or even fascism (Allen, 2018; Merelli, 2016; O’Brien, 2018), and have attributed much of the phenomenon to the propagation of problematic online content, including deliberate disinformation (a.k.a. “fake news”), racist and misogynous remarks, and material designed to lead to religious or political radicalization (Beckett & Burke, 2017; Marwick & Lewis, 2017). Many have attempted to define “populism,” but as Canovan (1981) points out, the term has been used to describe such a wide variety of phenomena, left and right, democracy and dictatorship alike, that it seemingly evades a universal definition. Nevertheless, Mudde and Kaltwasser (2017) have captured one of its most defining characteristics: populism is a “thin” ideology, often “parasitic” on another ideology, and it operates on the basis of dividing society into two antagonistic groups, the honest population and a corrupt elite. Albertazzi and McDonnell (2008) have added a third group, “others,” stating that populism, as a political strategy, “pits a virtuous and homogeneous people against a set of elites and dangerous ‘others’ who are together depicted as depriving (or attempting to deprive) the sovereign people of their rights, values, prosperity, identity and voice” (p. 3). In this divisive process, one rhetorical tactic that all populist leaders commonly draw upon is to assert that politics is easy and that the elites and experts are lying when they claim otherwise. Promising simple solutions to complex social issues is at the heart of the populist appeal, which is why politicians “from India’s Narendra Modi to Turkey’s Recep Tayyip Erdoğan, from Hungary’s Viktor Orbán to Poland’s Jaroslaw Kaczynski, and from France’s Marine Le Pen to Italy’s Beppe Grillo sound surprisingly similar to each other despite their considerable ideological differences,” according to Mounk (2018, p. 38).

One of the dangers with such populist rhetoric is that it further alienates people who are already at the margins; that is, those who have typically been “othered” in a given society. On top of that, researchers such as Brooker and Barnett (2016), Kaján (2017), and Jane (2018) demonstrate the significant extent to which “othering” and hate speech have now gone digital. Recent discussions on problematic content tend to center on the young generation of “millennials” (e.g., Fernández-Armesto, 2016; Hurst, 2017; Poushter, 2015). For instance, it is argued that universities are experiencing a “free speech crisis” globally, despite there being no consensus yet on who or what is causing the crisis, if there indeed is one. Various explanations have been offered, including “no platforming” or “safe space” policies adopted by more and more universities (although this speculation has been somewhat countered by the UK’s Joint Committee on Human Rights, 2018), “political correctness gone mad” (as proclaimed by, for example, the so-called “Intellectual Dark Web”; see Weiss, 2018), the far right’s cyber harassment of academics (Cuevas, 2018), and donors’ influence (at least at American private universities; see Perry, 2018), to name a few. A key challenge in investigating this possible crisis is that in the digital age the boundaries of a university campus are becoming more and more “porous.” In other words, while being physically present on campus, students can also simultaneously interact with a broader world through various digital means. In this context, millennials, who are also referred to as “native” users of digital technology (Prensky, 2001), are often characterized by their irreducible reliance upon major social media services to express themselves. Increasingly marginalized in the traditional political arena (Sloam, 2014), young adults indeed seem to be taking their politics to “online third spaces” nowadays and seeking new ways to represent their interests and struggles (Graham, Jackson, & Wright, 2016; Wright, 2012). One of the distinct features of the contemporary political landscape, particularly after the 2008 global financial crisis, is that the institutions of representative democracy are going through a transformative shift worldwide. Europe, for example, has witnessed the mushrooming of “non-traditional” political parties and their electoral successes, such as the Five Star Movement in Italy (founded in 2009), the Pirate Party in Iceland (founded in 2012), and Podemos in Spain (founded in 2014). This shift to alternative forms of representation may not be an inherently positive or negative development. Regardless, the role that social media have played is significant (Gerbaudo, 2018)—not in the sense of being a replacement for traditional mass media, but of being an integral part of a wider information ecosystem where older and newer media interact, compete, collaborate, and learn from each other (Chadwick, 2013). What complicates matters is that in such a complex “hybrid” media environment there have emerged technologically sophisticated attempts to hijack people’s attention and manipulate public discourses.

According to a 2017 report by international human rights watchdog Freedom House (2017), bots and fake news outlets distorted elections in at least 18 countries between 2016 and 2017. The Oxford Internet Institute in the United Kingdom has also provided evidence of “cyber troops” and other formally organized social media manipulation campaigns in operation in 48 countries as of 2018 (Bradshaw & Howard, 2018). In short, the propagation of problematic online content needs to be examined and understood against the backdrop of the hybridization of mass and social media, the emergence of non-traditional forms of power and institutions, and the growing technological capacity for manipulating information. All of these compel us to revisit and update the existing understanding of speech acts, political participation, and public opinion formation. The present chapter provides a thematic mapping of the current literature on free speech in the digital age and points to where further research is needed. This review is concerned in particular with how and to what extent the contours of free speech are contested and redefined in today’s mesh of physical and digital worlds.

Methodology
In order to survey a field that is expanding and evolving rapidly, our methodology involves two key design decisions. First, for a more systematic and comprehensive monitoring of the latest developments, we have created and maintained a dynamic and hierarchical list of “buzzwords” in media and public discourses, presented in Table 17.1. The list has been developed with full awareness and caution that many of the words are contested terms, such as “counter-terrorism” and “radicalization,” which makes them very difficult to deal with online. Over 130 words or phrases have been added to the list between 2017 and 2018 (131 as of July 2018), including “algorithmic bias,” “alternative fact,” “attention merchant” (Wu, 2016), “computational propaganda” (Howard & Kollanyi, 2016), “content moderation,” “information disorder,” “micro-targeting,” and “meme.” These words and phrases have then served as multiple entry points to the literature, and also as tools for thematically organizing the material collected (an illustrative sketch of this kind of keyword-based tagging follows Table 17.1). A computer-assisted qualitative data analysis software package (NVivo 11 Pro for Windows) has been used for this process and the overall project. The thematic categories in the table have been updated and revised as the review has progressed. The other decision was to look beyond traditional research publications and pay equally close attention to journalistic accounts and grey literature, with an aim to integrate scholarly perspectives with media and public discourses. Many independent research institutes, think tanks, and NGOs have been prolific contributors to moving the field forward. These include Data & Society (a non-profit research institute, founded in New York in 2014, that explores the societal implications of data-driven technologies, https://datasociety.net/), Demos (a UK think tank with a dedicated research branch for social media analysis, https://www.demos.co.uk/), and the Electronic Frontier Foundation (an international NGO committed to defending Internet civil liberties, https://www.eff.org/).

Table 17.1  List of Keywords Used in the Process of Literature Review (a posteriori themes, each followed by its keywords)

Content regulation: moderation; curation; filter; censorship; self-censorship; algorithm; algorithmic bias; algorithmic transparency; platform; platform accountability; intermediary liability; free speech; hate speech; extreme speech; no-platforming; safe space; Generation Snowflake; quarantine; civility

Truth and authenticity: “fake news”; “alternative fact”; “post-truth”; “echo chamber”; “filter bubble”; disinformation; misinformation; malinformation; rumour; conspiracy theory; information disorder; information pollution; media literacy; information literacy; digital literacy; data literacy; coding education; digital forensics; anti-intellectualism; anti-rationalism; anti-establishment

Attention economy: “attention merchant”; clickbait; SEO; bot; shillbot; chatbot; troll; troll farm; Facebook “dark post”; Cambridge Analytica; micro-targeting; psychographic profiling; meme; viral; emoji; GIF; “digital native”; millennial; dual screening; “viewertariat”; influencer

Digital privacy and harassment: “networked privacy”; “context collapse”; “contextual integrity”; anonymity; doxxing; cyber bullying; digilantism; “online othering”; online misogyny; misogynoir; “e-bile”; revenge porn; men’s rights activist (MRA); manosphere; incel; Red Pill; Reddit; 4chan; “normie”; Intellectual Dark Web; cyber-Hindus

National laws and politics: “computational propaganda”; “cyber troop”; infowar; Astroturfing; “stealth campaign”; data ethics; data justice; data ownership; #BlackLivesMatter; #JeSuisCharlie; #MeToo; #MAGA; alt-right; gaslighting; dog-whistle politics; online radicalisation; counter-terrorism; sedition; net neutrality; right to be forgotten; right to explanation; mass surveillance; peer surveillance; “sousveillance”; Prevent (UK); GDPR (EU); Investigatory Powers (UK); Social Credit System (China); First Amendment (US); Section 230 of CDA (US)

State of the art: hybrid media; virtual reality; big data; blockchain; Internet of Things; AI; machine learning; gamification; gig economy; digital nomad; digital exile; darknet; deep web; Tor; encryption; hacking; DDoS; biometric data; GPS tracking

Other examples include Freedom House (an international watchdog on human rights including digital freedom of expression, https://freedomhouse.org/) and EU-funded research networks such as VOX-Pol (focused on tackling violent online political extremism, https://www.voxpol.eu/) and ONLINERPOL (focused on creating enabling online spaces of political expression, especially in the context of a multi-faith society such as India, http://www.fordigitaldignity.com/). Meanwhile, an increasing number of higher education institutions have established dedicated research centers, such as the University of Oxford’s Oxford Internet Institute (https://www.oii.ox.ac.uk/), Harvard University’s Berkman Klein Center (https://cyber.harvard.edu/) and First Draft (https://firstdraftnews.org/), New York University’s AI Now Institute (https://ainowinstitute.org/), and Cardiff University’s Data Justice Lab (https://datajusticelab.org/). Another noteworthy development is that Silicon Valley’s tech giants are also expanding their research presence, examples of which include Microsoft’s Social Media Collective (https://socialmediacollective.org/) and Facebook’s own research department (https://research.fb.com/).
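The keyword list in Table 17.1 was used both as a set of search entry points and as a coding frame applied in NVivo. As a complement to that qualitative workflow, the short sketch below shows one way such a hierarchical list could be used to tag collected items with themes automatically; the excerpted theme dictionary and the sample items are assumptions for illustration only, not part of the review's actual toolchain.

```python
# Minimal keyword-to-theme tagging, assuming a dictionary built from an excerpt of Table 17.1.
THEMES = {
    "Content regulation": ["moderation", "censorship", "algorithmic bias", "no-platforming"],
    "Truth and authenticity": ["fake news", "post-truth", "filter bubble", "disinformation"],
    "Attention economy": ["clickbait", "troll farm", "micro-targeting", "psychographic profiling"],
}


def tag_item(text):
    """Return the themes whose keywords appear in a collected item (title, abstract, or note).

    Naive substring matching is used here; a real pipeline would tokenize and normalize first.
    """
    lowered = text.lower()
    return sorted(theme for theme, keywords in THEMES.items()
                  if any(kw in lowered for kw in keywords))


items = [
    "Platform moderation and algorithmic bias in content takedowns",
    "Disinformation, troll farms and micro-targeting in election campaigns",
]
for item in items:
    print(item, "->", tag_item(item))
```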

As the literature review phase commenced, two observations were immediately apparent. First, the existing literature treats free speech in a discrete and contrastive fashion. Questions have been recurrently raised about whether one’s right to free speech should ever be limited, especially in case the content is deemed harmful to others (free speech versus hate speech); whether one truly has freedom of speech if there appears to be a need to hold anything back (free speech versus political correctness and self-censorship); and whether individuals are now freer, or less free, to speak with the prevalence of digital media (offline free speech versus online free speech). Second, the dominant discourse is that undesirable content ought to be “pinpointed,” avoided, removed, and institutionally intervened upon (Scrivens & Davies, 2018). Various government initiatives across Europe for the “prevention of online radicalization” epitomize this “pathological” approach (Conway et al., 2017; Home Affairs Committee, 2017). Van der Linden et al. (2017) extend the metaphor and suggest “inoculating the public against misinformation” (see also Roozenbeek & van der Linden, 2018, for information on a game rendition of this suggestion). The approach that pathologizes certain ideas and practices, however, has so far thrown up more questions than answers. Who decides what is intolerably extreme and what is acceptably moderate? Who designs and implements a mechanism that filters out extreme content? How can the public ensure the accountability of the filtering mechanism? These, to name a few, are by no means new questions, but are now further complicated by discrepancies between emerging digital practices and established theories of freedom of expression. The remainder of this chapter exhibits those discrepancies—in four groupings—in order to develop better understandings of the issues, and to invite further, empirically grounded research.

Findings
Beliefs, Opinions, and “Alternative Facts”
Freedom of speech traditionally refers to the right to express one’s ideas and opinions without fear of government censorship or punishment. However, opinions are now increasingly taking on the disguise of so-called “alternative facts.” Since it was first used by White House aide Kellyanne Conway in January 2017 in an attempt to override the media’s depiction of President Trump’s inauguration ceremony with a more flattering narrative of the event, the phrase “alternative facts” has become a shorthand for factually incorrect information that is nevertheless presented as if it were one of many equally valid options (Associated Press, 2018). This phenomenon is symptomatic of how anti-rationalism and anti-intellectualism have now peaked, most notably in—but not limited to—the United States.

As pointed out by numerous writers, including Isaac Asimov (“A cult of ignorance,” 1980), Al Gore (The assault on reason, 2007), Andrew Keen (The cult of the amateur, 2007), Susan Jacoby (The age of American unreason, 2009), Shaheed Nick Mohammed (The (dis)information age: The persistence of ignorance, 2012), and Tom Nichols (The death of expertise, 2017), popular unreason can be argued to be a manifestation of “inverted snobbery about educational privilege” (Townson, 2016), echoing the typical populist rhetoric against the elites and experts (boyd, 2017a; Cuevas, 2018; Daniels, 2018). Exemplified by “anti-vaxxers” (a group of campaigners against childhood vaccinations) and climate change sceptics (those who claim that the science behind global warming is nothing but a conspiracy), such anti-establishment views and attitudes are gaining ground fast (Townson, 2016). This is largely attributable to online connections that enable content to be distributed faster and wider than in pre-Internet times. However, another crucial development in recent years is that people who propagate questionable content are increasingly pre-empting criticism by rebranding falsehoods as “alternative facts” and emotions- and convictions-based politics as “post-truth politics” (Blackburn, 2017). To borrow Garland’s paraphrase (2018) of Alexander Nix, co-founder and CEO of the controversial political consultancy Cambridge Analytica, information “needn’t be true as long as it’s believable [ . . . ] in the era of ‘alternative facts.’ ” Furthermore, based on a study of 126,000 news stories circulated on Twitter between 2006 and 2017, Vosoughi et al. (2018) demonstrate that rumors spread faster than fact-checked stories on social media platforms. According to the researchers, rumors are more likely to get shared because of their “novelty appeal” and, contrary to popular assumption, it is humans rather than bots that are mainly responsible for such sharing. A multi-source study by Chadwick et al. (2018) adds that in the context of online debate individual users may see “sharing problematic news as a cultural norm, a practice that is simply part of ‘what it takes’ to engage politically on social media in order to attract attention and nudge others to take positions” (p. 15). Another neologism, “fake news,” popularly referring to deliberate disinformation, marks the other side of the coin of the heightened information disorder witnessed today. It has become a convenient tool for politicians to use to dismiss any negative press and public criticism against them. Many have pointed out that the term “fake news” discounts the seriousness of the situation at hand and is therefore unhelpful in addressing the rise of sophisticated operations to manipulate information flows and public opinion. Wardle and Derakhshan (2017), for example, argue that the word “fake” does not fully capture the complexity of the different types of misinformation, disinformation, and malinformation. Today’s problem is, so their argument goes, about the entire information ecosystem and its pollution with those various types of incorrect information (see also Jack, 2017; Wardle, 2017). Motivations to pollute the information ecosystem and “hack our attention” vary, ranging from financial to ideological, and in many cases such operations are also motivated by ego and pleasure (boyd, 2017b; Wu, 2016).
Regardless of the motivations, what “alternative facts,” “fake news,” and giant Internet intermediaries such as Facebook and Google, who profile their users and deliver targeted adverts accordingly, have together created is a world where individual citizens no longer know if they are seeing the same information that others are. Tufekci (2017) emphasizes that meaningful public debate and deliberation are not possible “without a common basis of information,” and she goes further to call our time “the democracy-poisoning golden age of free speech” (Tufekci, 2018, italics added). In order to tackle this problem, earlier literature has placed the onus on individual users to actively seek diverse and more “nutritious” sources of news and information, breaking out of the so-called “filter bubbles” and “echo chambers,” or what Lim (2017) terms “algorithmic enclaves,” created by customized web experiences (Johnson, 2012; Miller, 2008; Newton, 2011). However, Freelon (2018a, 2018b) urges a more nuanced approach to the issue, highlighting that there are “ideological asymmetries in the consumption, distribution, and acceptance of disinformation” (2018a, p. 40). To be more specific, conservatives engage with “opinion-reinforcing disinformation” online more substantially than their liberal counterparts, and that is, according to him, due to the conservatives’ long-held distrust in factual reporting by the mass media. What is increasingly observed is the weaponization of conspiracy theories against political opponents. A case in point is that Parkland school shooting survivor David Hogg has been faced with “the plethora of right-wing websites and social media accounts spreading conspiracy theories about him,” including one that he is in fact a hired “crisis actor,” after he emerged as one of the student leaders of the US gun control movement in 2018 (Wilson, 2018).

Content Sharing as a Speech Act
Content travels faster and reaches farther than ever in the digital environment. To theorize this, much attention has been paid to the ubiquitous and interconnected nature of online communication, but technological affordances do not fully explain the prevailing online culture of relaying content created by someone else (e.g., “sharing,” “retweeting,” “reblogging”). In this context, a growing body of literature has looked at Internet memes and virals. Shifman’s study (2012) of 30 prominent memetic videos on YouTube, for example, identified six common features of such videos: focus on ordinary people, flawed masculinity, humor, simplicity, repetitiveness, and whimsical content. Berger has written extensively on the psychological dimensions of viral phenomena, and one of his core findings is that content that produces greater emotional arousal—“making your heart race”—is more likely to get shared (Rees-Jones et al., 2015; see also Berger, 2013; Berger & Milkman, 2012; Milkman & Berger, 2014). Facebook’s in-house researcher Kramer and two co-authors at Cornell University have claimed, on the basis of a massive real-life experiment on the company’s platform, that “emotional contagion” occurs not only during in-person interaction, as previously assumed, but also on social media, even without direct exchanges or the assistance of nonverbal cues (Kramer et al., 2014).

Their paper is not about viral content per se, but it is nevertheless noteworthy that it highlights the important role that emotions play in digitally mediated interactions. Here, digitally mediated interactions do not refer only to exchanging lengthy, articulate messages. Frequently they involve nothing more than memes, hashtags, or even emojis, and those seemingly frivolous devices offer more than meets the eye. Wiggins and Bowers (2015, p. 1886) define memes as "remixed, iterated messages which are rapidly spread by members of participatory digital culture for the purpose of continuing a conversation." At one level, by sharing relatable memes, one may feel "less alone" (Tait, 2017). In many different cultures, memes are also often found to act as "a kind of moral police of the internet," enabling people to "express their values and disparage those of others in less direct and more acceptable ways" (Miller et al., 2016, p. xvi). In consequence, memes can be a powerful tool for popular mobilization. The "grass-mud horse" meme, which was born out of a subversive pun to mock online censors in China, is an apt example (Wines, 2009). Considering memes in Chinese cyberspace as self-expressions in an authoritarian climate, An (2013) calls them "the street art of the social web." However, while being remixed and passed on, memes and other viral content often take on a life of their own, as seen in the case of the online cartoon character Pepe the Frog. Its creator, Matt Furie, ended up deciding to kill the character off after it was appropriated as a symbol of white nationalism and the alt-right during the 2016 presidential election in the United States (BBC, 2017). The artist's efforts to reclaim the innocence of the character were to no avail. To put it another way, "a meme is never just a meme," in the words of Phillips and Milner (2017, italics added), writing with reference to Harvard's decision to rescind admission offers from ten prospective students for having posted rape-apologist, pedophilic, and violently racist memes on Facebook. A May 2018 court ruling in India, observing that forwarding a social media post is equal to endorsing it, also echoes the point that content sharing is a speech act in its own right (Ashok, 2018). A similar evolution has been observed with hashtags. The initial function of hashtags on Twitter was to facilitate the aggregation of related information within the confines of the platform through a crowdsourced tagging system. However, a recently growing trend is that users—especially when facing controversies, conflicts, and crises—choose a pithy phrase, such as #BlackLivesMatter and #JeSuisCharlie [I am Charlie], that serves as a "mini statement" in its own right (Giglietto & Lee, 2017). Papacharissi (2016) argues that hashtags are signifiers that allow crowds to be rendered into "networked publics that want to tell their story collaboratively and on their own terms" (p. 308), or as she also calls them, "affective publics" (see also Anstead & O'Loughlin, 2011; Bruns & Burgess, 2015). Although online users' readiness to pass appealing—and appalling—content on to others may be part of the human psyche, recent investigations into social media manipulation have also revealed that content shares are much more "engineered" than platform operators make them out to be.

For example, it has been widely alleged in the media that Russian agents injected divisive rhetoric into American voters' news feeds through paid adverts on Facebook to interfere with the 2016 US presidential campaign (Brandom, 2017). The field has seen an exponential growth of publications and commentary on content manipulation in both academic and non-academic outlets since 2016. Some have focused on who produces propagandist disinformation (e.g., Kirby, 2016; Upadhye, 2018), while others have examined the curious operations of bots and troll farms (e.g., Abeshouse, 2018; Albright, 2018; Confessore et al., 2018; Digital Forensic Research Lab, 2017; Freedom House, 2017; King et al., 2017; Lotan, 2014; Phillips, 2015). Criticism has also been levelled at platform operators, namely Facebook, Twitter, and YouTube, for the lack of algorithmic transparency and accountability (e.g., Ananny & Crawford, 2018; Bridle, 2017; Hill, 2017). It is interesting to note that much of the criticism has come from former employees and other tech insiders (e.g., Cadwalladr, 2018; Lewis, 2017; Thompson & Vogelstein, 2018).

Privatization of Censorship

Censorship has traditionally been understood as something exercised by governmental authorities in order to control information flows and to contain expressions of public dissent. However, digital content is increasingly "curated" and "moderated" by a multitude of private actors, including Facebook and other social media companies. Various metaphors have been invoked to describe this process. First, social media companies have carefully positioned themselves as "platforms" on which their users express themselves. Gillespie (2010, 2017, 2018) points out that this positioning has helped the companies demonstrate their commitment to individual users' freedom of expression, but simultaneously it has also helped them seek limited liability for what those users say. At least in the US legislative context, being categorized as a common carrier, rather than a publisher, proves to be vitally important. Section 230 of the Communications Decency Act (CDA) of 1996 protects online service providers from being held liable for content posted by their users, paving the way for Silicon Valley's social media giants today (Electronic Frontier Foundation, n.d.). Napoli and Caplan (2017) challenge the "common carrier" status of social media platforms and online content aggregators, arguing that their businesses in fact fit well within the established parameters of media organizations. A similar semantic tension is observed in various sectors other than media. The two authors present Uber as an encapsulating example, since the company insists that it be considered a digital service, not a transportation company. Europe's highest court ruled otherwise in December 2017 (Scott, 2017). It should also be borne in mind that the question of who should regulate social media content receives differing responses in different countries. Based on a small number of elite interviews, Miridjanian (2018) states that France and Germany are in favor of the platforms themselves doing the job, and Italy thinks it should be the police, while the United States maintains its stance that it is a First Amendment issue.

Silicon Valley companies, who shape digital practices around the world, are indeed routinely associated with libertarianism, although writers such as Morozov (interviewed by Summers, 2015) and Broockman et al. (2017) clarify that their philosophies do vary; one commonality among them, however, is that they explicitly oppose regulation. This anti-regulatory stance keeps coming up in various discussions on social media content moderation and freedom of expression, as illustrated in Murgia and Warrell (2017, regarding hate speech), Schulson (2018, regarding pseudoscience), and Swisher (2018, interviewing Facebook's Zuckerberg on "Holocaust denialism"). With or without legal liability, social media platforms are known to monitor and moderate user-generated content as per their own policies. Suzor (2018) points out that there has been very little information, if any, available to the public on what proportion of social media posts get moderated and why. This is where the metaphor of "community" is called on, as in Facebook's Community Standards. Given the sheer volume of content being generated every moment, moderation is largely done algorithmically, but it also draws on cheap labor forces in developing countries (Liptak, 2017; Roberts et al., 2017). The unpleasant and unappreciated work of removing inappropriate content—or what Roberts et al. (2017) describe as "scrubbing the net"—is increasingly outsourced to content moderators located in India or the Philippines, for example, and those moderators are usually left with no support for the mental toll that comes with the work. Additionally, Solon (2018) draws attention to a broader phenomenon of low-paid workers recruited for tasks that bots are supposed to perform; she calls the phenomenon "the rise of 'pseudo-AI.' " The opacity of the algorithms underlying how content is curated and regulated on search engines and social media platforms poses a threat to democratic functioning. As discussed earlier, individual citizens no longer have a common basis of information (Tufekci, 2017), and this problem cannot simply be resolved by opening up the black box of algorithms. Ananny and Crawford (2018) point out that algorithmic transparency does not automatically lead to fair and effective governance. They identify many limits of the "transparency ideal," the most notable of which is that it can "invoke neoliberal models of agency," shifting the burden onto ordinary users, who may have neither the expertise to scrutinize what they see nor the power to bring about changes where necessary. Another important point is raised by O'Neil (2017): despite the widely held belief that algorithms can make more objective and hence fairer decisions, they are in fact someone's "opinions embedded in code." It is therefore unsurprising that Facebook has repeatedly been met with public outcries that the site bans posts it should allow (e.g., breastfeeding images, on the ground of its no-nudity policy), while allowing posts it should ban (e.g., those glorifying violence against women, on the ground of free speech; Hern, 2013; see also Facebook's first-ever Community Standards Enforcement Report, Facebook, 2018). YouTube has also been subject to recurrent criticism for its "algorithmic blind spot," which has in particular left children vulnerable to predatory content (Kobie, 2017). In this context, the fundamental challenge is how to ensure checks and balances against the specific visions and cultures of the Silicon Valley companies shaping the rest of the world (Noble, 2018; Stark, 2018).
Platforms are not only becoming the arbiters of which content is worth seeing and sharing but also the facilitators of data-driven surveillance, often ending up benefiting corporate and political power holders. Facebook, for example, has been implicated in a series of privacy breach scandals over the past few years. The latest, and biggest to date, broke in March 2018, revealing that a London-based political data analytics firm called Cambridge Analytica had exploited a loophole in Facebook's API and "harvested" personal data from as many as 87 million users for "psychographic voter profiling" in the run-up to the 2016 US presidential election, reportedly in order to aid then-candidate Donald Trump. Kim et al. (2018) term this new, technologically enabled practice a "stealth campaign," warning that we are likely to observe more and more of it. The platform was also believed to have played a "determining role" in inciting violence and hatred against the Muslim Rohingya minority during a 2017 military crackdown in Myanmar, according to Marzuki Darusman, chairman of the UN Independent International Fact-Finding Mission on Myanmar (cited in Miles, 2018). Concerns have also been raised about the possible abuse of mobile phone location data for surveillance purposes, exemplified in Lee's (2011) case study of a South Korean corporation's alleged tracking of "targeted workers" through GPS monitoring. In parallel, China has launched a Social Credit System, an expansion of the concept of financial credit scoring. Under this system, each citizen is rated on all aspects of daily life, gaining points for good deeds such as charity work and recycling and losing points for socially undesirable behaviors or even relationships, both online and offline (Zeng, 2018). In this setting, opting out altogether is easier said than done. It is reported that not being online for a prolonged period is also considered a cause for suspicion by the authorities, especially when it comes to Muslim minorities (Sulaiman & Eckert, 2017). This system, combined with the Chinese government's keen investment in artificial intelligence and facial recognition technology, has been compared to popular culture's depictions of a dystopian future, ranging from William Gibson's iconic cyberpunk novel Neuromancer (1984) to the British sci-fi TV series Black Mirror (2011 to present) (Friend, 2018).

Deliberate Ambiguity, Voluntary Invisibility, and Self-Censorship

Given the wide array of complexities and new challenges outlined so far, a growing number of schools explicitly incorporate "information literacy" and "digital literacy" into their curricula (e.g., Horowitz, 2017; National Literacy Trust, 2018). Such efforts include advising students to be mindful of the possible consequences of whatever they post and share online, and in particular making them realize that on social media platforms their posts are not demarcated by their original purposes and intended audiences. Marwick and boyd (2011, 2014) refer to this phenomenon as "context collapse": social media technologies lump together otherwise distinct audiences and information norms, making it extremely difficult for individual users to manage their social presence and privacy (see also Litt, 2012).

According to a 2013 survey by the Pew Research Center (Hampton et al., 2014), people are indeed reluctant to express their opinions on controversial subjects on social media platforms, especially if they anticipate disagreement from their peers, and such reluctance is greater on social media than in face-to-face settings. This finding is important yet unsurprising, considering the highly networked and "public-by-default" nature of the digital communication environment. One of the tactics that Marwick and boyd (2014) have discovered being used among American teenagers to safeguard their online privacy is to embed private or sensitive information within what appears to be an ordinary message. This tactic of "hiding in plain sight," also known as steganography, predates the digital age, but what is remarkable in the social media context is young users' innovative application of it to create multi-layered access points for what they are really conveying in their online posts. In order to see past the surface content and unlock the full meaning of a post, members of the intended audience need to be able to recognize multiple referents (p. 22). In other words, it is "a sort of interpersonal encryption" achieved through use of "in-jokes, nicknames, code words, and 'subtweeting,' " to name a few (boyd, cited in Quart, 2014). It has been observed, for example, that alt-right trolls are replacing racist slurs with everyday words ("googles," "skypes," "yahoos," and "skittles") in their tweets to circumvent content moderation. In this lexicon, as Sonnad (2016) deciphers, "googles" means the n-word, "skypes" means Jews, "yahoos" means "spic," and "skittles" means Muslims and more specifically Muslim refugees, resulting in unintelligible posts such as "Chain the googles." Moreover, in cyberspace, self-expression increasingly takes place under a fictitious or even non-human identity, challenging the existing notion of personhood and authorship (O'Hehir, 2015). On the one hand, Facebook CEO Mark Zuckerberg once famously claimed that "having two identities for yourself is an example of a lack of integrity" (cited in Kirkpatrick, 2011). This remark resonates with another made by Eric Schmidt in 2009, when he was the CEO of Google: "if you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place" (cited in Esguerra, 2009). A viewpoint shared in these two quotes, as also summarized by van der Nagel and Frith (2015), is that each one of us has one singular identity and we are authentic and behave better when we are true to that identity. On the other hand, many have challenged this viewpoint. Poole, founder of 4chan, argues that "identity is prismatic. [ . . . ] Google and Facebook would have you believe that you are a mirror and that there is one reflection that you have [ . . . ], but in fact we are more like diamonds" (cited in O'Reilly Media, 2011). This statement reminds us of Gergen's (1991) concept of the "saturated self," which suggests that the self in the era of postmodernism and digital technology is no longer a bounded being; it has instead been expanded to consist of multiple, overlapping, and relational selves. So far, empirical studies on the correlation between anonymity and undesirable social behavior online have yielded mixed results.
Rowe (2015), for example, has found that the occurrence of uncivil reader comments was significantly higher on the Washington Post website, where users are able to maintain their anonymity, than on the Facebook page of the Washington Post, where commenters are identified. In contrast, Maia and Rezende's (2016) findings from comparing levels of profanity in user discussions on YouTube, blogs, and Facebook indicate that digital affordances enabling anonymity and homophily are not as significantly related to the expression of foul language as was previously assumed. In this confusing landscape of digital information and communication, when someone decides against speaking unreservedly, that decision may not automatically amount to a case of self-censorship. Giglietto and Lee's (2017) analysis of 74,047 tweets containing #JeNeSuisPasCharlie [I am not Charlie] in the wake of the shooting attack at the offices of the French satirical weekly Charlie Hebdo in January 2015 shows that online users mobilize a repertoire of strategies and methods for engaging with sensitive subjects while navigating the challenges specific to speaking in the social media environment. Such a repertoire often includes being deliberately ambiguous or even cryptic, especially when the users in question are a minority voice. The idea of someone opting for ambiguity, silence, or even invisibility online, however, sits at odds with the mainstream depiction of free speech as a non-negotiable right that one either possesses or does not. No law, including the First Amendment of the US Constitution, protects individuals from all negative consequences of their speech acts. Nevertheless, free speech advocates are extremely resistant to the notion of legal interventions against speech acts. The crux of their argument is encapsulated in Justice Harlan's epigram "one man's vulgarity is another's lyric" in a 1971 US Supreme Court case on freedom of speech (cited in Kannabiran, 2017). To put it another way, the power to decide what speech is socially acceptable should not be vested in the government or certain groups of people, a point also reiterated by Rowland (2018), an attorney with the American Civil Liberties Union. The age-old debate over the limits of free speech is now further complicated by disagreement on whether posts on social media are public or private. While many such posts are indeed publicly accessible, Zimmer (2010, 2016) cautions that whoever accesses them, especially researchers, should remember that the content is posted within a certain context, which carries with it certain norms and expectations of privacy—or what Nissenbaum (2004) terms "contextual integrity." Digitally mediated speech acts tend to attract particularly inconsistent interpretations of free speech. On the one hand, for example, an increasing number of universities in different parts of the world find themselves in a position where they need to take disciplinary action against students for posting hateful content on WhatsApp or Facebook (e.g., BBC, 2018a, 2018b), despite some arguing that the content was posted to a closed group and is therefore private. On the other hand, female victims of digital harassment are often told that much of the blame is on them because they posted certain information online in the first place. Hill (2012) aptly describes this tendency: " 'You're too public with your digital data, ladies' may be the new 'your skirt was too short and you had it coming.' " There is already a substantial literature on why people express themselves online (e.g., Barnes, 2018; Miller et al., 2016; Reagle, 2015), but why people choose not to is still a relatively underexplored question.
Baumer et al. (2015), Kania-Lundholm (2018), Mayer-Schönberger (2009), and Robards and Lincoln (2017), among others, have redirected attention to this question, highlighting the need for a more nuanced understanding of voluntary disconnection or deletion from online services. Again, this is an area where the existing "either/or" framework is not helpful and where further efforts at theoretical reconciliation are required. To this end, a useful insight can be drawn from Lim's (2014) statement that "the ability to be visible and invisible—present and hidden, alternating between online and offline spheres—creates a possibility for a movement to articulate its own view, identity, subjectivity and its own emancipation through online and offline practices" (p. 62, italics in original). This statement is made in the context of a study of social movements, but in a broader interpretation it reminds us that exercising individual agency in our time involves navigating both digital and physical spaces, and being able to be both visible (to intended audiences) and invisible (from surveillance and harassment).

Conclusion: Where from Here?

Central Implications

The purpose of this chapter has been to provide a synthesis of the latest developments in the field of freedom of expression and digital media. It has identified several "tension points" in the current literature, where emerging digital practices unsettle existing philosophies of truth, authenticity, and control. The environment in which we now communicate is marked by an unprecedented level of hybridity. It is filled with humans and artificial agents; assigned and chosen identities; physical and virtual spaces and presences; facts, emotions, and convictions; and grassroots as well as "Astroturf"—or even "stealth"—campaigns (Kim et al., 2018; Klotz, 2007; see also Table 17.2 for a summary of new free-speech challenges posed by digital technologies and practices). Consequently, nothing online can be taken at face value, but at the same time ordinary users cannot alone assume the burden of spotting and dismantling all manipulative operations. The strengthening of the public's "media literacy," "information literacy," and "digital literacy" has been, to borrow the words of Livingstone (2018), "everyone's favourite solution" to ensuring a healthy information and communication ecosystem. Despite the importance and usefulness of those concepts, however, attention has been focused too much on individual citizens not being outsmarted by the media and the establishment. According to boyd (2017a), that approach has now "backfired," deepening the already existing distrust of "liberal media" and experts and effectively contributing to the recent rise of popular unreason. One theme that runs across the present chapter is that pathologizing problematic online content is not suited to addressing today's heightened information disorder.

Table 17.2  Free Speech Challenges Posed by Digital Technologies and Practices

Existing assumption: Freedom of speech refers to the right to express one's ideas and opinions without fear of government censorship or punishment.
Digital complications: Digital content is easier to manipulate than non-digital content.

Existing assumption: Facts, falsehoods, and opinions are distinct categories.
Digital complications: Factually incorrect information is circulated as if it were one of many equally valid options (a.k.a. "alternative facts"). Rumors and disinformation travel faster and further than in pre-digital times. Cascading problematic content has been normalized and is now part of repertoires for online engagement (Chadwick et al., 2018). The label "fake news" has become a convenient tool for politicians and their supporters to use to dismiss any negative press and public criticism against them. Paid adverts are almost indistinguishable from organic content.

Existing assumption: A speech act entails composing a message with an audience in mind and delivering the message effectively to that audience.
Digital complications: Communications are increasingly taking place under fictitious or even non-human identities (e.g., bots). In social media and digital environments, intended audiences and eventual recipients of the content hardly match. Individual users cannot be sure how their social media posts are presented (or not presented) to others due to the opaqueness of content curation algorithms (Tufekci, 2017, 2018). Online users no longer know if they are seeing the same information that others are (Tufekci, 2017, 2018). Content forwarding is a speech act in its own right, and motivations vary. Content shares are much more "engineered" than platform operators make them out to be.

Existing assumption: Censorship is exercised by governmental authorities to control information flows.
Digital complications: Content is now predominantly regulated by digital service providers as per their own policies and the interests of their stakeholders. The process of content moderation is opaque and hence has scope for abuse. There are various and conflicting interpretations of what is public versus private space online. Digital media technologies are grounded in the specifics of local laws and cultures.

Existing assumption: Holding back what one wants to say to avoid sanctions is a case of self-censorship.
Digital complications: Users have developed new and innovative ways of managing their presence and privacy in a "context-collapsed" digital environment, including deliberate ambiguity and strategic invisibility.

As revealed through news reports of the Russian information warfare (allegedly during the Brexit referendum, the 2016 US presidential election, and the ongoing conflict in Ukraine) and the Cambridge Analytica scandal, state-of-the-art innovations in the engineering of attention and persuasion are bound to outpace policymakers, regulators, and the public. It will therefore be an almost unresolvable challenge for certain entities to define what content is acceptable for all, to design a mechanism that filters out the unacceptable, and to ensure that the mechanism is fair, inclusive, and accountable to the public it serves.

Avenues for Future Research

As one way of addressing the challenges just summarized, we suggest that some of the attention be redirected from functional literacy (e.g., detecting fake news and avoiding trolls) to the heuristic purposes of language. Several decades ago, Abercrombie (1969) developed, with medical students, a paradigm of the "anatomy of judgement," through which she demonstrated the power of preconceptions to distort how one evaluates evidence, and the common fear of changing one's position. Adapting this paradigm, we consider that free speech should be grounded in analyzing one's own decisions. In other words, free speech is not a static position but an ongoing process of reflection upon how one decides what one believes and how one knows what one can say in a given situation (Scott-Baumann, 2017, 2018). Building on this literature review chapter, our next step will be to investigate the ways in which university students understand and exercise their communicative agency in difficult social and political situations, both online and offline, including ones that involve charismatic speakers and the provocative speech typical of populism. We are looking at how university students in different parts of Europe and Asia navigate controversial topics of race, gender, religion, and politics, and what enhances or undermines their agency. The ultimate goal of our ongoing and long-term investigation is to reconnect young people to democratic processes and encourage them to realize their potential as "diffusers" of innovative knowledge and attitudes (Rogers, 2003) in the face of socioeconomic and gender inequality. We focus on university students because they are portrayed contradictorily. On the one hand, the media warn that they are, unlike their counterparts in the 1960s and 1970s, becoming indifferent and passive in the face of dangerous political developments around them (Schwarz, 2018) or, worse yet, uncritically drawn to "hipster fascism" (Allen, 2018; O'Brien, 2018). On the other hand, in many countries, they are often steered away from political matters through various legal and policy interventions. The UK government's Prevent agenda, part of the Counter-Terrorism and Security Act 2015, is one example of such interventions. Heath-Kelly (2017) shows that fears about students being radicalized into terrorism create the perception that universities are "pre-criminal space" (p. 312).

Born out of this perception, constraints are imposed upon the curriculum, upon the guest speakers who can be invited onto campus (Scott-Baumann, 2018, p. 242), and upon what student unions, as charities, may or may not comment upon (Charity Commission, 2018). To put it another way, a university campus is an ironic site of populism, as it is seen both as a symbol of the elitism and intellectualism that populists oppose and as a place where young and susceptible minds may be recruited into populist waves. Our review suggests that policies to shield students from dissent and "dangerous ideas" (e.g., Islam and Muslim culture on UK campuses) may inadvertently—but in some cases also intentionally—do a disservice to the students and render them unable to work their way through to balanced discourse. Being exposed to opinions that differ from the dominant narratives can help young citizens develop their capability to deconstruct some of the false and socially harmful arguments that they may encounter on campus as well as online, but more importantly such exposure can encourage them to anatomize their own expressions and judgements.

References

Abercrombie, M. L. J. (1969). The anatomy of judgement: Investigation into the processes of perception and reasoning. London: Free Association Books.
Abeshouse, B. (2018, February 8). Troll factories, bots and fake news: Inside the Wild West of social media. Al Jazeera. Retrieved from http://www.aljazeera.com/blogs/americas/2018/02/troll-factories-bots-fake-news-wild-west-social-media-180207061815575.html
Albertazzi, D. & McDonnell, D. (2008). Introduction: The sceptre and the spectre. In D. Albertazzi & D. McDonnell (Eds.), Twenty-first century populism: The spectre of Western European democracy (pp. 1–14). New York: Palgrave Macmillan.
Albright, J. (2018, February 15). Trolls on Twitter: How mainstream and local news outlets were used to drive a polarized news agenda. Berkman Klein Center Collection. Retrieved from https://medium.com/berkman-klein-center/trolls-on-twitter-how-mainstream-and-local-news-outlets-were-used-to-drive-a-polarized-news-agenda-e8b514e4a37a
Allen, C. (2018, July 30). "Hipster fascists": The normalization of the radical right isn't just happening in America. Rantt Media. Retrieved from https://rantt.com/hipster-fascists-the-normalization-of-the-radical-right-isnt-just-happening-in-america/
An, X. M. (2013). Memes, the street art of the internet. CreativeMornings. Retrieved from https://creativemornings.com/talks/an-xiao-mina/1 [video]
Ananny, M. & Crawford, K. (2018). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, 20(3), 973–989.
Anstead, N. & O'Loughlin, B. (2011). The emerging viewertariat and BBC Question Time: Television debate and real-time commenting online. The International Journal of Press/Politics, 16(4), 440–462.
Ashok, K. M. (2018, May 11). Forwarding social media posts equal to endorsing it: Madras HC denies anticipatory bail to BJP leader S. Ve Shekher. Live Law. Retrieved from http://www.livelaw.in/forwarding-social-media-posts-equal-to-endorsing-it-madras-hc-denies-anticipatory-bail-to-bjp-leader-s-ve-shekher/
Associated Press (2018, January 16). German linguists: "Alternative facts" the non-word of 2017. AP News. Retrieved from https://www.apnews.com/ea702667580744c2a53bf9475855b2c2/German-linguists:-%E2%80%99Alternative-facts%E2%80%99-the-non-word-of-2017

Barnes, R. (2018). Uncovering online commenting culture: Trolls, fanboys and lurkers. Basingstoke, UK: Palgrave Macmillan.
Baumer, E. P. S., Ames, M. G., Burrell, J., Brubaker, J. R., & Dourish, P. (2015). Why study technology non-use? First Monday, 20(11). Retrieved from http://firstmonday.org/ojs/index.php/fm/article/view/6310/5137
BBC (2017, May 8). "Hijacked" Pepe the Frog "killed off." BBC News. Retrieved from http://www.bbc.co.uk/news/world-us-canada-39843468
BBC (2018a, March 20). Exeter university students suspended over racism and rape claims. BBC News. Retrieved from https://www.bbc.co.uk/news/uk-england-devon-43473517
BBC (2018b, May 9). University of Warwick suspends 11 students over hate posts. BBC News. Retrieved from https://www.bbc.co.uk/news/uk-england-coventry-warwickshire-44052070
Beckett, L. & Burke, J. (2017, May 27). Pathway to extremism: What neo-Nazis and jihadis have in common. The Guardian. Retrieved from http://www.theguardian.com/us-news/2017/may/27/extremism-terrorism-far-right-neo-nazi-devon-arthurs
Berger, J. (2013). Contagious: Why things catch on. New York, NY: Simon & Schuster.
Berger, J. & Milkman, K. L. (2012). What makes online content viral? Journal of Marketing Research, 49(2), 192–205.
Blackburn, S. (2017, July). There's nothing new about post-truth politics. Prospect. Retrieved from https://www.prospectmagazine.co.uk/magazine/theres-nothing-new-about-post-truth-politics
boyd, d. (2017a, January 5). Did media literacy backfire? Data & Society: Points. Retrieved from https://points.datasociety.net/did-media-literacy-backfire-7418c084d88d
boyd, d. (2017b, January 5). Hacking the attention economy. Data & Society: Points. Retrieved from https://points.datasociety.net/hacking-the-attention-economy-9fa1daca7a37
Bradshaw, S. & Howard, P. N. (2018). Challenging truth and trust: A global inventory of organized social media manipulation. Oxford, UK: Oxford Internet Institute. Retrieved from http://comprop.oii.ox.ac.uk/wp-content/uploads/sites/93/2018/07/ct2018.pdf
Brandom, R. (2017, November 7). Why were the Russian Facebook ads so strange? The Verge. Retrieved from https://www.theverge.com/2017/11/7/16620380/russia-facebook-ad-bernie-sanders-swimsuit
Bridle, J. (2017, November 6). Something is wrong on the internet. James Bridle. Retrieved from https://medium.com/@jamesbridle/something-is-wrong-on-the-internet-c39c471271d2
Broockman, D., Ferenstein, G. F., & Malhotra, N. (2017). The political behaviour of wealthy Americans: Evidence from technology entrepreneurs. Stanford GSB Working Paper 3581. Retrieved from https://www.gsb.stanford.edu/gsb-cmis/gsb-cmis-download-auth/441556
Brooker, P. D. & Barnett, J. (2016, December 13). Where is the "alt-left" on social media? The Conversation. Retrieved from http://theconversation.com/where-is-the-alt-left-on-social-media-70290
Bruns, A. & Burgess, J. (2015). Twitter hashtags from ad hoc to calculated publics. In N. Rambukkana (Ed.), Hashtag publics: The power and politics of discursive networks (pp. 13–28). New York, NY: Peter Lang.
Cadwalladr, C. (2018, March 18). "I made Steve Bannon's psychological warfare tool": Meet the data war whistleblower. The Guardian. Retrieved from https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-faceook-nix-bannon-trump
Canovan, M. (1981). Populism. New York: Harcourt Brace Jovanovich.
Chadwick, A. (2013). The hybrid media system: Politics and power. Oxford: Oxford University Press.

Chadwick, A., Vaccari, C., & O'Loughlin, B. (2018). Do tabloids poison the well of social media? Explaining democratically dysfunctional news sharing. New Media & Society. Advance online publication. https://doi.org/10.1177/1461444818769689
Charity Commission. (2018). Charity Commission operational guidance: 48. Students' unions. Retrieved from http://ogs.charitycommission.gov.uk/g048a001.aspx
Confessore, N., Dance, G. J. X., Harris, R., & Hansen, M. (2018, January 27). The follower factory. The New York Times. Retrieved from https://www.nytimes.com/interactive/2018/01/27/technology/social-media-bots.html
Conway, M., Khawaja, M., Lakhani, S., Reffin, J., Robertson, A., & Weir, D. (2017). Disrupting Daesh: Measuring takedown of online terrorist material and its impacts. VOX-Pol Network of Excellence. Retrieved from http://www.voxpol.eu/download/vox-pol_publication/DCUJ5528-Disrupting-DAESH-1706-WEB-v2.pdf
Cuevas, J. A. (2018, January–February). A new reality? The far right's use of cyberharassment against academics. Academe. Retrieved from https://www.aaup.org/article/new-reality-far-rights-use-cyberharassment-against-academics#.WooQAxPFK8
Daniels, J. (2018, February). Far-right attacks on faculty hurt us all. Clarion. Retrieved from http://www.psc-cuny.org/clarion/february-2018/far-right-attacks-faculty-hurt-us-all
Digital Forensic Research Lab (2017, December 29). #BotSpot: How bot-makers decorate bots. DFR Lab. Retrieved from https://medium.com/dfrlab/botspot-how-bot-makers-decorate-bots-4d2ae35bdf26
Electronic Frontier Foundation (n.d.). Section 230 of the Communications Decency Act. Retrieved from https://www.eff.org/issues/cda230
Esguerra, R. (2009, December 10). Google CEO Eric Schmidt dismisses the importance of privacy. Electronic Frontier Foundation. Retrieved from https://www.eff.org/deeplinks/2009/12/google-ceo-eric-schmidt-dismisses-privacy
Facebook (2018). Community standards enforcement preliminary report. Retrieved from https://transparency.facebook.com/community-standards-enforcement
Fernández-Armesto, F. (2016, February 18). Taking offence: A new pandemic? Times Higher Education. Retrieved from https://www.timeshighereducation.com/comment/felipe-fernandez-armesto-taking-offence-a-new-pandemic
Freedom House (2017). Freedom on the net 2017: Manipulating social media to undermine democracy. Freedom House. Retrieved from https://freedomhouse.org/report/freedom-net/freedom-net-2017
Freelon, D. (2018a, March 8). Personalized information environments and their potential consequences for disinformation. In Annenberg School for Communication (Ed.), Understanding and addressing the disinformation ecosystem (pp. 38–44). Retrieved from https://firstdraftnews.org/wp-content/uploads/2018/03/The-Disinformation-Ecosystem-20180207-v2.pdf
Freelon, D. (2018b, May 2). Filter bubbles are only part of the problem. Trust, Media and Democracy. Retrieved from https://medium.com/trust-media-and-democracy/filter-bubbles-are-only-part-of-the-problem-d3add635651c
Friend, T. (2018, May 14). How frightened should we be of A.I.? The New Yorker. Retrieved from https://www.newyorker.com/magazine/2018/05/14/how-frightened-should-we-be-of-ai
Garland, C. (2018, April 3). "It needn't be true as long as it's believable": Manipulating data to strategize propaganda in the era of "alternative facts." Discover Society. Retrieved from https://discoversociety.org/2018/04/03/it-neednt-be-true-as-long-as-its-believable-manipulating-data-to-strategize-propaganda-in-the-era-of-alternative-facts/

Gerbaudo, P. (2018). Social media and populism: An elective affinity? Media, Culture & Society. Advance online publication. https://doi.org/10.1177/0163443718772192
Gergen, K. J. (1991). The saturated self: Dilemmas of identity in contemporary life. New York: Basic Books.
Giglietto, F. & Lee, Y. (2017). A hashtag worth a thousand words: Discursive strategies around #JeNeSuisPasCharlie after the 2015 Charlie Hebdo shooting. Social Media + Society, 3(1), 2056305116686992.
Gillespie, T. (2010). The politics of "platforms." New Media & Society, 12(3), 347–364.
Gillespie, T. (2017, August 24). The platform metaphor, revisited. Culture Digitally. Retrieved from http://culturedigitally.org/2017/08/platform-metaphor/
Gillespie, T. (2018). Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media. London: Yale University Press.
Graham, T., Jackson, D., & Wright, S. (2016). "We need to get together and make ourselves heard": Everyday online spaces as incubators of political action. Information, Communication & Society, 19(10), 1373–1389.
Hampton, K., Rainie, L., Lu, W., Dwyer, M., Shin, I., & Purcell, K. (2014). Social media and the "spiral of silence." Washington, DC: Pew Research Center. Retrieved from http://assets.pewresearch.org/wp-content/uploads/sites/14/2014/08/PI_Social-networks-and-debate_082614.pdf
Heath-Kelly, C. (2017). The geography of pre-criminal space: Epidemiological imaginations of radicalisation risk in the UK Prevent Strategy, 2007–2017. Critical Studies on Terrorism, 10, 297–310.
Hern, A. (2013, October 22). Facebook's changing standards: From beheading to breastfeeding images. The Guardian. Retrieved from https://www.theguardian.com/technology/2013/oct/22/facebook-standards-beheading-breastfeeding-social-networking
Hill, K. (2012, April 2). The reaction to "Girls Around Me" was far more disturbing than the "creepy" app itself. Forbes. Retrieved from https://www.forbes.com/sites/kashmirhill/2012/04/02/the-reaction-to-girls-around-me-was-far-more-disturbing-than-the-creepy-app-itself/#5efd768268d4
Hill, K. (2017, August 25). Facebook figured out my family secrets, and it won't tell me how. Gizmodo. Retrieved from http://gizmodo.com/facebook-figured-out-my-family-secrets-and-it-wont-tel-1797696163
Home Affairs Committee (2017). Hate crime: Abuse, hate and extremism online. House of Commons. Retrieved from https://publications.parliament.uk/pa/cm201617/cmselect/cmhaff/609/609.pdf
Horowitz, J. (2017, October 18). In Italian schools, reading, writing and recognizing fake news. New York Times. Retrieved from https://www.nytimes.com/2017/10/18/world/europe/italy-fake-news.html
Howard, P. N. & Kollanyi, B. (2016). Bots, #StrongerIn, and #Brexit: Computational propaganda during the UK–EU referendum. ArXiv:1606.06356 [cs.SI]. Retrieved from http://arxiv.org/abs/1606.06356
Hurst, G. (2017, February 23). Lecturers offer advice on "stifling" right-wing views. The Times. Retrieved from https://www.thetimes.co.uk/edition/news/lecturers-offer-advice-on-stifling-right-wing-views-9xh78vwcm?mc_cid=a8ee39dff5&mc_eid=7d3735eeeb

Jack, C. (2017). Lexicon of lies: Terms for problematic information. Data & Society. Retrieved from https://datasociety.net/pubs/oh/DataAndSociety_LexiconofLies.pdf
Jane, E. A. (2018). Gendered cyberhate as workplace harassment and economic vandalism. Feminist Media Studies. Advance online publication. https://doi.org/10.1080/14680777.2018.1447344
Johnson, C. A. (2012). The information diet: A case for conscious consumption. Sebastopol, CA: O'Reilly Media.
Joint Committee on Human Rights (2018). Freedom of speech in universities: Fourth report of session 2017–19. Retrieved from https://publications.parliament.uk/pa/jt201719/jtselect/jtrights/589/589.pdf
Kaján, E. (2017). Hate online: Anti-immigration rhetoric in Darknet. Nordia Geographical Publications, 46(3), 3–22.
Kania-Lundholm, M. (2018, May 31). Why online disconnection matters: Voluntary ICT non-use. WIAS Talk, University of Westminster, London, UK.
Kannabiran, K. (2017, July 5). "Hard words break no bones": Sedition, free speech, academic freedoms and sovereignty in India. Discover Society. Retrieved from https://discoversociety.org/2017/07/05/hard-words-break-no-bones-sedition-free-speech-academic-freedoms-and-sovereignty-in-india/
Kim, Y. M., Hsu, J., Neiman, D., Kou, C., Bankston, L., Kim, S. Y., . . . Raskutti, G. (2018). The stealth media? Groups and targets behind divisive issue campaigns on Facebook. Political Communication. Advance online publication. https://doi.org/10.1080/10584609.2018.1476425
King, G., Pan, J., & Roberts, M. E. (2017). How the Chinese government fabricates social media posts for strategic distraction, not engaged argument. American Political Science Review, 111(3), 484–501.
Kirby, E. J. (2016, December 5). The city getting rich from fake news. BBC News. Retrieved from http://www.bbc.co.uk/news/magazine-38168281
Kirkpatrick, D. (2011). The Facebook effect: The real inside story of Mark Zuckerberg and the world's fastest growing company. London: Virgin Books.
Klotz, R. J. (2007). Internet campaigning for grassroots and Astroturf support. Social Science Computer Review, 25(1), 3–12.
Kobie, N. (2017, November 24). YouTube, kids and Silicon Valley's recurring algorithmic blind spot. Wired. Retrieved from http://www.wired.co.uk/article/youtube-kids-moderation-google
Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788–8790.
Lee, K.-S. (2011). Interrogating "Digital Korea": Mobile phone tracking and the spatial expansion of labour control. Media International Australia, 141(1), 107–117.
Lewis, P. (2017, October 6). "Our minds can be hijacked": The tech insiders who fear a smartphone dystopia. The Guardian. Retrieved from http://www.theguardian.com/technology/2017/oct/05/smartphone-addiction-silicon-valley-dystopia
Lim, M. (2014). Seeing spatially: People, networks and movements in digital and urban spaces. International Development Planning Review, 36(1), 51–72.
Lim, M. (2017). Freedom to hate: Social media, algorithmic enclaves, and the rise of tribal nationalism in Indonesia. Critical Asian Studies, 49(3), 411–427.
Liptak, A. (2017, May 21). Leaked moderation guidelines reveal how Facebook approaches handling graphic content. The Verge. Retrieved from https://www.theverge.com/2017/5/21/15672242/facebook-files-moderation-guidelines-graphic-content-online

Litt, E. (2012). Knock, knock. Who's there? The imagined audience. Journal of Broadcasting & Electronic Media, 56(3), 330–345.
Livingstone, S. (2018, May 8). Media literacy—everyone's favourite solution to the problems of regulation. LSE Media Policy Project Blog. Retrieved from http://blogs.lse.ac.uk/mediapolicyproject/2018/05/08/media-literacy-everyones-favourite-solution-to-the-problems-of-regulation/
Lotan, G. (2014, June 5). (Fake) friends with (real) benefits. I ♥ data. Retrieved from https://medium.com/i-data/fake-friends-with-real-benefits-eec8c4693bd3
Maia, R. C. M. & Rezende, T. A. S. (2016). Respect and disrespect in deliberation across the networked media environment: Examining multiple paths of political talk. Journal of Computer-Mediated Communication, 21(2), 121–139.
Marwick, A. & boyd, d. (2011). I tweet honestly, I tweet passionately: Twitter users, context collapse, and the imagined audience. New Media & Society, 13(1), 114–133.
Marwick, A. & boyd, d. (2014). Networked privacy: How teenagers negotiate context in social media. New Media & Society, 16(7), 1051–1067.
Marwick, A. & Lewis, R. (2017). Media manipulation and disinformation online. Data & Society. Retrieved from https://datasociety.net/pubs/oh/DataAndSociety_MediaManipulationAndDisinformationOnline.pdf
Mayer-Schönberger, V. (2009). Delete: The virtue of forgetting in the digital age. Princeton, NJ: Princeton University Press.
Merelli, A. (2016, November 29). The key difference between populism and fascism. Quartz. Retrieved from https://qz.com/847040/the-key-difference-between-populism-and-fascism/
Miles, T. (2018, March 12). U.N. investigators cite Facebook role in Myanmar crisis. Reuters. Retrieved from https://uk.reuters.com/article/us-myanmar-rohingya-facebook/u-n-investigators-cite-facebook-role-in-myanmar-crisis-idUKKCN1GO2PN
Milkman, K. L. & Berger, J. (2014). The science of sharing and the sharing of science. Proceedings of the National Academy of Sciences, 111(Supplement 4), 13642–13649.
Miller, A. (2008, March). How the news distorts our worldview. TED. Retrieved from https://www.ted.com/talks/alisa_miller_shares_the_news_about_the_news [video]
Miller, D., Costa, E., Haynes, N., McDonald, T., Nicolescu, R., Sinanan, J., Spyer, J., . . . Wang, X. (2016). How the world changed social media. London: UCL Press. Retrieved from http://discovery.ucl.ac.uk/1474805/1/How-the-World-Changed-Social-Media.pdf
Miridjanian, A. (2018, January 22). Réseaux sociaux: Comment réguler sans (toujours) censurer? [Social networks: How to regulate without (always) censoring?]. Libération. Retrieved from http://www.liberation.fr/planete/2018/01/22/reseaux-sociaux-comment-reguler-sans-toujours-censurer_1622027
Mounk, Y. (2018). The people vs. democracy: Why our freedom is in danger and how to save it. Cambridge, MA: Harvard University Press.
Mudde, C. & Kaltwasser, C. R. (2017). Populism: A very short introduction. Oxford: Oxford University Press.
Murgia, M. & Warrell, H. (2017, April 8). Why tech companies struggle with hate speech. The Financial Times. Retrieved from https://www.ft.com/content/2aaa13b2-1ba2-11e7-bcac-6d03d067f81f
Napoli, P. M. & Caplan, R. (2017). Why media companies insist they're not media companies, why they're wrong, and why it matters. First Monday, 22(5). Retrieved from http://journals.uic.edu/ojs/index.php/fm/article/view/7051/6124
National Literacy Trust (2018, February 7). We launch call for evidence on fake news and critical literacy. National Literacy Trust. Retrieved from https://literacytrust.org.uk/news/we-launch-call-evidence-fake-news-and-critical-literacy/

Newton, E. (2011). How much comfort news is in your information diet? Searchlights and Sunglasses. Retrieved from http://www.searchlightsandsunglasses.org/how-much-comfort-news-is-in-your-information-diet/
Nissenbaum, H. (2004). Privacy as contextual integrity. Washington Law Review, 79(1), 119–158.
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York: NYU Press.
O'Brien, P. (2018, March 1). Fascism in Italy: The hipster fascists trying to bring Mussolini back into the mainstream. Channel 4 News. Retrieved from https://www.channel4.com/news/fascism-in-italy-the-hipster-fascists-trying-to-bring-mussolini-back-into-the-mainstream [video]
O'Hehir, A. (2015, July 23). "A Gay Girl in Damascus": Behind the twisted tale of a blogger who "catfished" the whole world. Salon. Retrieved from https://www.salon.com/2015/07/23/a_gay_girl_in_damascus_behind_the_twisted_tale_of_a_blogger_who_catfished_the_whole_world/
O'Neil, C. (2017, April). The era of blind faith in big data must end. TED. Retrieved from https://www.ted.com/talks/cathy_o_neil_the_era_of_blind_faith_in_big_data_must_end [video]
O'Reilly Media (2011, October 18). Chris Poole, "High Order Bit" talk. Web 2.0 Summit 2011. O'Reilly Media YouTube Channel. Retrieved from https://youtu.be/e3Zs74IH0mc [video]
Papacharissi, Z. (2016). Affective publics and structures of storytelling: Sentiment, events and mediality. Information, Communication & Society, 19(3), 307–324.
Perry, D. M. (2018, May 11). If you're worried about free speech on campus, don't fear students—fear the Koch brothers. Pacific Standard. Retrieved from https://psmag.com/education/koch-heads
Phillips, W. (2015). This is why we can't have nice things: Mapping the relationship between online trolling and mainstream culture. Cambridge, MA: MIT Press.
Phillips, W. & Milner, R. M. (2017, June 6). The Harvard case shows a meme is never "just" a meme. Motherboard. Retrieved from https://motherboard.vice.com/en_us/article/zmen4y/the-harvard-case-shows-a-meme-is-never-just-a-meme
Poushter, J. (2015, November 20). 40 percent of Millennials OK with limiting speech offensive to minorities. Pew Research Center. Retrieved from http://www.pewresearch.org/fact-tank/2015/11/20/40-of-millennials-ok-with-limiting-speech-offensive-to-minorities/
Prensky, M. (2001). Digital natives, digital immigrants: Part 1. On the Horizon, 9(5), 1–6.
Quart, A. (2014, April 25). Status update. The New York Times. Retrieved from https://www.nytimes.com/2014/04/27/books/review/its-complicated-by-danah-boyd.html
Reagle, J. M., Jr. (2015). Reading the comments: Likers, haters, and manipulators at the bottom of the web. Cambridge, MA: MIT Press.
Rees-Jones, L., Milkman, K. L., & Berger, J. (2015, April 14). The secret to online success: What makes content go viral. Scientific American. Retrieved from https://www.scientificamerican.com/article/the-secret-to-online-success-what-makes-content-go-viral/
Robards, B. & Lincoln, S. (2017). Uncovering longitudinal life narratives: Scrolling back on Facebook. Qualitative Research, 17(6), 715–730.
Roberts, S., Wells, B., Cassidy, C., & Howlader, S. (2017, May 28). Scrubbing the net: The content moderators. Al Jazeera. Retrieved from http://www.aljazeera.com/programmes/listeningpost/2017/05/scrubbing-net-content-moderators-170527124251892.html

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York: The Free Press.
Roozenbeek, J. & van der Linden, S. (2018, February 20). Now there's a game you can play to "vaccinate" yourself against fake news. The Conversation. Retrieved from https://theconversation.com/now-theres-a-game-you-can-play-to-vaccinate-yourself-against-fake-news-92074
Rowe, I. (2015). Civility 2.0: A comparative analysis of incivility in online political discussion. Information, Communication & Society, 18(2), 121–138.
Rowland, L. (2018, March 9). Free speech can be messy, but we need it. ACLU. Retrieved from https://www.aclu.org/blog/free-speech/free-speech-can-be-messy-we-need-it
Schulson, M. (2018, June 8). Should Google and Facebook censor the pseudoscience posted to their sites? Slate. Retrieved from https://slate.com/technology/2018/06/should-google-and-facebook-censor-pseudoscience.html
Schwarz, G. (2018, April 18). My family has a Nazi past. I see that ideology returning across Europe. The Guardian. Retrieved from https://www.theguardian.com/commentisfree/2018/apr/18/family-nazi-past-ideology-europe-germany-fascism-far-right
Scott, M. (2017, December 20). Uber is a transportation company, Europe's highest court rules. Politico. Retrieved from https://www.politico.eu/article/uber-ecj-ruling/
Scott-Baumann, A. (2017). Ideology, utopia and Islam on campus: How to free speech a little from its own terrors. Education, Citizenship and Social Justice, 12(2), 159–176.
Scott-Baumann, A. (2018). "Dual use research of concern" and "select agents": How researchers can use free speech to avoid "weaponising" academia. Journal of Muslims in Europe, 7(2), 237–261.
Scrivens, R. & Davies, G. (2018, February 14). Identifying radical content online. VOX-Pol Network of Excellence. Retrieved from http://www.voxpol.eu/identifying-radical-content-online/
Shifman, L. (2012). An anatomy of a YouTube meme. New Media & Society, 14(2), 187–203.
Sloam, J. (2014). New voice, less equal: The civic and political engagement of young people in the United States and Europe. Comparative Political Studies, 47(5), 663–688.
Solon, O. (2018, July 6). The rise of "pseudo-AI": How tech firms quietly use humans to do bots' work. The Guardian. Retrieved from https://www.theguardian.com/technology/2018/jul/06/artificial-intelligence-ai-humans-bots-tech-companies
Sonnad, N. (2016, October 1). Alt-right trolls are using these code words for racial slurs online. Quartz. Retrieved from https://qz.com/798305/alt-right-trolls-are-using-googles-yahoos-skittles-and-skypes-as-code-words-for-racial-slurs-on-twitter/
Stark, L. (2018, February 17). A lack of emotional intelligence is fueling misogyny and racism at Google—and across Silicon Valley. Berkman Klein Center Collection. Retrieved from https://medium.com/@lukestark/a-lack-of-emotional-intelligence-is-fueling-misogyny-and-racism-at-google-and-across-silicon-2d50f3b4c6db
Sulaiman, E. & Eckert, P. (2017, September 11). China runs region-wide re-education camps in Xinjiang for Uyghurs and other Muslims. Radio Free Asia. Retrieved from https://www.rfa.org/english/news/uyghur/training-camps-09112017154343.html
Summers, J. (2015, December 26). Evgeny Morozov on Uber's disruptive, libertarian lie: "Whatever the CEO of Uber might say about Ayn Rand, the company itself is interested only in producing regulation that works in its favor." Salon. Retrieved from https://www.salon.com/2015/12/26/evgeny_morozov_on_ubers_disruptive_libertarian_lie_whatever_the_ceo_of_uber_might_say_about_ayn_rand_the_company_itself_is_interested_only_in_producing_regulation_that_works_in_its_favor/

496   Yenn Lee and Alison Scott-Baumann Suzor, N. (2018, May 8). What proportion of social media posts get moderated, and why? Digital Social Contract. Retrieved from https://digitalsocialcontract.net/what-proportionof-social-media-posts-get-moderated-and-why-db54bf8b2d4a Swisher, K. (2018, July 18). Zuckerberg: The Recode interview. Recode. Retrieved from https:// www.recode.net/2018/7/18/17575156/mark-zuckerberg-interview-facebook-recodekara-swisher Tait, A. (2017, March 31). It me: How memes made us feel less alone. New Statesman. Retrieved from https://www.newstatesman.com/science-tech/internet/2017/03/it-me-how-memesmade-us-feel-less-alone Thompson, N. & Vogelstein, F. (2018, February 12). Inside Facebook’s two years of hell. Wired. Retrieved from https://www.wired.com/story/inside-facebook-mark-zuckerberg2-years-of-hell/ Townson, S. (2016, January 26). Why people fall for pseudoscience (and how academics can fight back). The Guardian. Retrieved from https://www.theguardian.com/higher-educationnetwork/2016/jan/26/why-people-fall-for-pseudoscience-and-how-academicscan-fight-back Tufekci, Z. (2017, September). We’re building a dystopia just to make people click on ads. TED. Retrieved from https://www.ted.com/talks/zeynep_tufekci_we_re_building_a_dystopia_ just_to_make_people_click_on_ads [video] Tufekci, Z. (2018). It’s the (democracy-poisoning) golden age of free speech. Wired. Retrieved from https://www.wired.com/story/free-speech-issue-tech-turmoil-new-censorship/ Upadhye, J. (2018, June 28). Meet Mexico’s king of fake news. BuzzFeed. Retrieved from https:// www.buzzfeed.com/watch/video/59338?utm_term=.doNP9YL1J [video] Van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. Advanced Science News, 1(2), 1–7. Van der Nagel, E. & Frith, J. (2015). Anonymity, pseudonymity, and the agency of online identity: Examining the social practices of r/Gonewild. First Monday, 20(3). Retrieved from http://firstmonday.org/ojs/index.php/fm/article/view/5615 Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. Wardle, C. (2017, February 16). Fake news. It’s complicated. First Draft. Retrieved from https:// firstdraftnews.org:443/fake-news-complicated/ Wardle, C. & Derakhshan, H. (2017). Information disorder: Towards an interdisciplinary framework for research and policy making. Council of Europe. Retrieved from https:// firstdraftnews.org/wp-content/uploads/2017/11/PREMS-162317-GBR-2018-Reportde%CC%81sinformation-1.pdf?x39968 Weiss, B. (2018, May 8). Meet the renegades of the Intellectual Dark Web. The New York Times. Retrieved from https://www.nytimes.com/2018/05/08/opinion/intellectual-dark-web.html Wiggins, B. E. & Bowers, G. B. (2015). Memes as genre: A structurational analysis of the memescape. New Media & Society, 17(11), 1886–1906. Wilson, J. (2018, February 21). Crisis actors, deep state, false flag: the rise of conspiracy theory code words. The Guardian. Retrieved from https://www.theguardian.com/us-news/2018/ feb/21/crisis-actors-deep-state-false-flag-the-rise-of-conspiracy-theory-code-words Wines, M. (2009, March 12). A dirty pun tweaks China’s online censors. New York Times. Retrieved from http://www.nytimes.com/2009/03/12/world/asia/12beast.html Wright, S. (2012). From “third place” to “third space”: Everyday political talk in non-political online spaces. Javnost—The Public, 19(3), 5–20.

Digital Ecology of Free Speech   497 Wu, T. (2016). The attention merchants: The epic scramble to get inside our heads. New York: Knopf Publishing Group. Zeng, M. J. (2018, January 23). China’s Social Credit System puts its people under pressure to be model citizens. The Conversation. Retrieved from http://theconversation.com/ chinas-social-credit-system-puts-its-people-under-pressure-to-be-model-citizens-89963 Zimmer, M. (2010). But the data is already public: On the ethics of research in Facebook. Ethics & Information Technology, 12(4), 313–325. Zimmer, M. (2016, May 14). OkCupid study reveals the perils of big-data science. Wired. Retrieved from https://www.wired.com/2016/05/okcupid-study-reveals-perils-big-data-science/

Section 6

CITIZENSHIP, POLITICS, AND PARTICIPATION

chapter 16

ESRC Review: Citizenship and Politics
Simeon J. Yates, Bridgette Wessels, Paul Hepburn, Alexander Frame, and Vishanth Weerakkody

Introduction

This chapter briefly explores the outcomes of the literature review and expert Delphi review process for the citizenship and politics domain. As with the other review chapters, the goal is not to work through a large number of examples from the literature. Instead, building on the methods described in chapter 2, we will first set out the results of the digital humanities-based analyses of the literature and the content analysis of methods and theory. We will highlight the major topics and concepts within the literature, providing a few general examples. These are not intended to be the "most important" examples from the literature but rather simply indicative of the types of work. This is then followed by the presentation of the content analysis that sought to identify the key theories and methods in use within the literature. Next, we outline the results from the Delphi review of experts. This section concludes with the key questions, topics, and challenges we identified, which we compare with the results from the literature work. In the last section, we present the recommendations for areas of future study. As a reminder, the initial scoping questions for this area of work were:
• How digital technology impacts on our autonomy, agency, and privacy, illustrated by the paradox of emancipation and control; and
• Whether and how our understanding of citizenship is evolving in the digital age, for example whether technology helps or hinders us in participating at individual and community levels.


Initial Comments

On one level this part of the project could not have taken place at a more interesting and challenging time, with both the Brexit referendum and the election of a social-media-active Donald Trump as US president. Behind both events lie very complex issues of political polarization, which raise deep questions about the role of media, especially digital media, in all levels of political activity. Unfortunately, this means that there has been a small explosion of research on this topic since the current analysis was completed. As we will discuss later, this issue and concern comes through in the Delphi work, and we reflect on next steps in regard to polarized political communication and digital media in the conclusion. Possibly reflecting this context, this domain had the greatest number of Delphi returns and of identified starting literature, in terms of both the number of responses and the extent and detail of those responses. As a result, a considerable amount of work was undertaken in the analysis, and the final consultation workshop focused on reducing the breadth of material gathered. This chapter therefore has a slightly different structure to the other six ESRC review chapters, as the consultation workshop materials are integrated rather than separately reported. The team reflected on the reasons for this much stronger response. As we noted earlier, the Delphi process took place just after the Brexit vote and during the US presidential election. It is possible that the issues around citizenship, politics, and digital media struck a chord with respondents at this time. We also noted that the project steering group had a number of members whose current or prior work has touched on this area. Thus, though we tried to ensure as balanced a response as we could, this may have biased the snowball sample or potentially motivated respondents in this area. Of course, both factors could have played together. We would also like to highlight something that comes through in the comparison of the concept maps from the 2000–2004 period with those of the 2012–2016 period (see Figures 16.1 and 16.2).1 In examining the visualizations of the data from these two periods we find that concept pairs such as public sphere and a focus on government and candidate Internet use are apparent in the earlier literature. These are replaced with foci around participation and engagement in the later literature. We feel this marks a transition from an initial focus on the potential for the Internet and digital media to facilitate public debate and enhance the public sphere, to one focused on the role of networks (social media) to support and enhance political engagement. As noted later, we may now be in a third stage where the focus is on the role of networks in creating "echo chambers" or "filter bubbles," therefore negating the potentials to enhance the public sphere or disrupt political institutions. Scholars, particularly in political studies and media studies, have noted the rise of the use of social media in civic and political spheres. In broad terms, attention has focused on the ways in which social media facilitates engagement and participation in politics (Dahlgren, 2013) and the characteristics of that communication and of the relationship between citizens and politics (Papacharissi, 2015). There is a general consensus that a number of factors need to be addressed to realize the potential of digital communication to enhance


Figure 16.1  Citizenship 2000–2004: Most frequent concept pairs. Note: Bubble chart showing frequency of the top 50 concept pairs, based on concept modeling (described in Chapter 2) within the Domain for 2000–2004. The diameter of each circle reflects the frequency of the concept pairs.

participation. A continuing issue of inequality remains, with social and political inequality adding to any existing digital divides. Further, questions of how to develop open and deliberative participation using online communication remain difficult to address. There is a better understanding about the threats that digital communication poses in terms of filter bubbles (Sunstein, 2001) and the personalization of news and other information that might limit an open debate. However, this is an area that requires more research.

Literature Analysis

The literature analysis was designed to create two analytic outcomes. First, the goal was to identify key topics within the existing literature. This would allow the comparison


Figure 16.2  Citizenship 2012–2016: Most frequent concept pairs. Note: Bubble chart showing frequency of the top 50 concept pairs, based on concept modeling (described in Chapter 2) within the Domain for 2012–2016. The diameter of each circle reflects the frequency of the concept pairs.

with areas of future importance identified by the Delphi review. Second, we conducted a content analysis of the literature to explore the predominance of specific theories, methods, and approaches. As noted in chapter 2, the literature data were subjected to two analyses. The first round of collected literature was analyzed to create concept pairs and trios, and then the combined first and second rounds of literature were analyzed to identify key topic clusters. The results of these two approaches were then compared. Table 16.1 lists the most common concepts identified from the first round of literature; these represent the concepts covering 2% or more of the identified cases. Table 16.2 lists the concept pairs within these groups. In Table 16.2 the first element of each pairing (the main concept) is listed with its related secondary concepts and their frequencies. Unsurprisingly, the concepts of citizen, action, and network were ranked top. This reinforces the point that much of the underlying conceptual base of the literature on digital media and politics is focused on the three-way interface between: citizens; political action and engagement; and the role or impacts of networks (digital or otherwise).
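The concept-pair frequencies reported in Tables 16.1 and 16.2 are, at heart, counts of terms that co-occur within the same article. As a rough illustration only (the actual pipeline used the Digital Humanities Institute's concept-modelling tools described in chapter 2 and the chapter note), the following Python sketch counts within-document pairs of pre-extracted concept terms; the per-article concept lists shown here are placeholders, not the project corpus.

```python
from collections import Counter
from itertools import combinations

def top_concept_pairs(docs, n=50):
    """docs: list of lists of lemmatized concept terms, one list per article.
    Returns the n most frequent within-document concept pairs, with the
    percentage of all counted pairs that each represents (cf. Tables 16.1-16.2)."""
    pair_counts = Counter()
    for concepts in docs:
        # count each unordered pair once per document
        for a, b in combinations(sorted(set(concepts)), 2):
            pair_counts[(a, b)] += 1
    total = sum(pair_counts.values())
    return [(pair, count, 100.0 * count / total)
            for pair, count in pair_counts.most_common(n)]

# Illustrative input: three "articles" reduced to their extracted concepts.
docs = [
    ["citizen", "engagement", "network", "participation"],
    ["citizen", "government", "network"],
    ["campaign", "citizen", "election", "network"],
]
for (a, b), count, pct in top_concept_pairs(docs, n=5):
    print(f"{a}/{b}: {count} ({pct:.1f}%)")
```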

Table 16.1  Analysis Concepts Ranked

Concept: Percent
Citizen: 7.56
Action: 7.32
Network: 6.21
Campaign: 5.35
Citizenship: 4.35
Channel: 4.08
Access: 3.46
Engagement: 3.35
Government: 2.92
Participation: 2.81
Information: 2.59
Link: 2.43
Delivery: 2.40

Table 16.2  Concept Pairings—Main and Secondary Concepts

Citizen (13.79): democracy 2.96; engagement 2.91; government 4.43; participatory 1.58; perception 1.92
Action (13.35): activism 1.87; campaign 1.82; frame 3.20; membership 1.18; protest 4.29; talk 0.99
Network (11.33): power 6.65; recognition 2.36; strength 1.18; transformation 1.13
Campaign (9.75): candidate 2.02; election 2.91; movement 1.03; party 2.86; practice 0.94
Citizenship (7.93): engagement 2.02; people 2.17; phenomenon 0.89; study 1.43; youth 1.43
Channel (7.44): citizen 2.17; consumer 0.99; phone 1.43; service 2.86
Access (6.31): citizenship 0.44; latino 1.67; percentage 1.33; survey 1.77; white 1.08
Engagement (6.11): norm 1.13; participation 2.46; use 2.51
Government (5.32): latino 1.13; responsiveness 2.27; stage 1.92
Participation (5.12): participatory 2.86; protest 2.27
Information (4.73): literacy 1.43; overload 0.49; protest 2.81
Link (4.43): pattern 1.13; site 2.41; twitter 0.89
Delivery (4.38): perception 1.48; phone 1.38; value 1.53

Note: the main concept (percentage in parentheses) is followed by its related secondary concepts and their percentages.

Our second approach to the analysis of the literature explored the extraction of topics using a different methodology, based on a factor analysis of salience and relevance measures. We utilized both custom-developed tools and the WordStat software. Unlike the concept mapping, which pulled out some of the underlying ontological links, the identification of topics produced groups that more overtly fitted theory and methods in the literature. (This was the case for all of the literature analyses.) Table 16.3 presents the 15 topics identified by WordStat, and Table 16.4 maps these to the concepts analysis.
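The WordStat procedure itself is proprietary, so the sketch below is only a loose open-source analogue under stated assumptions: it factors a TF-IDF document-term matrix (here with scikit-learn's TruncatedSVD) and reports, for each factor, high-loading keywords, an explained-variance figure standing in for the eigenvalue column of Table 16.3, and the share of documents for which that factor is dominant. The corpus, preprocessing, and parameter values are placeholders, and a reasonably sized corpus is assumed.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

def extract_topics(abstracts, n_topics=15, n_keywords=8):
    """Approximate topic extraction: factor a TF-IDF document-term matrix and
    report, per factor, the top-loading keywords, the explained variance
    (a stand-in for an eigenvalue column), and the percentage of documents
    ("cases") for which that factor has the largest absolute score."""
    vectorizer = TfidfVectorizer(stop_words="english", min_df=2)
    X = vectorizer.fit_transform(abstracts)
    svd = TruncatedSVD(n_components=n_topics, random_state=0)
    doc_scores = svd.fit_transform(X)          # documents x topics
    terms = np.array(vectorizer.get_feature_names_out())
    topics = []
    for k in range(n_topics):
        keywords = terms[np.argsort(svd.components_[k])[::-1][:n_keywords]]
        dominant = (np.abs(doc_scores).argmax(axis=1) == k).mean() * 100
        topics.append({
            "keywords": "; ".join(keywords),
            "explained_variance": float(svd.explained_variance_[k]),
            "pct_cases_dominant": float(dominant),
        })
    return topics
```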

Table 16.3  WordStat Analysis of Topics

Public sphere (eigenvalue 10.5; freq 29,329; cases 486; 98.0% of cases): SPHERE; DELIB; HABERMA; DEMOCRACI; DELIBER; DEMOCRAT; PUBLIC; DEBAT; DISCOURS; FORUM; POLIT
Measurement (eigenvalue 3.19; freq 18,205; cases 474; 95.6% of cases): VARIABL; REGRESS; STATIST; TEST; TABL; MODEL; MEASUR; PREDICT; ESTIM; SIGNI; SAMPL; CORREL
Social network analysis (eigenvalue 2.77; freq 2,144; cases 313; 63.1% of cases): INFECT; NODE; CONTAGION; NEIGHBOR; THRESHOLD; TI
Protest and activism (eigenvalue 2.69; freq 12,940; cases 473; 95.4% of cases): MOVEMENT; PROTEST; ACTION; COLLECT; ORGAN; ACTIVIST; OCCUPI
Governance (eigenvalue 2.52; freq 20,565; cases 490; 98.8% of cases): GOVERN; SERVIC; POLICI; PUBLIC; SECTOR; ADMINISTR; MANAG
Elections (eigenvalue 2.37; freq 11,159; cases 407; 82.1% of cases): ELECT; PARTI; VOTER; CAMPAIGN; CANDID; ELECTOR; VOTE
Cyber hate crime (eigenvalue 2.14; freq 2,632; cases 317; 63.9% of cases): CRIME; VICTIM; HATE; GUARDIANSHIP; CYBER; POLIC; SECUR
Partisan politics (eigenvalue 2.01; freq 5,060; cases 429; 86.5% of cases): EXPOSUR; PARTISAN; POLAR; ATTITUD; ATTITUDIN; PERCEIV; OPINION
Web and social media (eigenvalue 1.92; freq 14,607; cases 470; 94.8% of cases): SITE; WEB; PAGE; USER; BLOG; SEARCH; LINK; GOOGL; FACEBOOK
Gender and ethnicity (eigenvalue 1.83; freq 4,741; cases 400; 80.7% of cases): GENDER; WOMEN; EDUC; FEMAL; ETHNIC
Civic engagement (eigenvalue 1.81; freq 8,650; cases 455; 91.7% of cases): CIVIC; ENGAG; CITIZENSHIP; YOUTH; LEARN
Mobile (eigenvalue 1.72; freq 3,746; cases 395; 79.6% of cases): PHONE; MOBIL; SM; CHANNEL
Political online fora (eigenvalue 1.65; freq 2,255; cases 325; 65.5% of cases): FORUM; THREAD; TALK
Homophily (eigenvalue 1.63; freq 2,044; cases 315; 63.5% of cases): HOMOPHILI; NOIS; AGENT; NEIGHBOR; INFLUENC
Twitter (eigenvalue 1.57; freq 2,267; cases 181; 36.5% of cases): TWITTER; TWEET; HASHTAG

Table 16.4  Comparison between Concepts and WordStat Topics
Note: Cross-tabulation matching the ranked concepts of Table 16.1 (Citizen, Action, Network, Campaign, Citizenship, Channel, Access, Engagement, Government, Participation, Information, Link, Delivery) against the WordStat topics of Table 16.3 (Twitter, Social network analysis, Homophily, Cyber hate crime, Political online fora, Mobile, Gender and ethnicity, Elections, Partisan politics, Civic engagement, Web and social media, Protest and activism, Measurement, Public sphere, Governance), with an X marking each concept-topic correspondence.


Topics

The eight key areas emerging from the analysis (see Table 16.3) are Public sphere, Measurement, Social network analysis, Protest and activism, Governance, Elections, Cyber hate crime, and Partisan politics. The category of measurement reflects the greater proportion of work in this domain that employs statistical analysis (see the following section on methods). Although the issue and topic of governance is important in this domain, there is a separate full section of the book (see Section 7) dedicated to this topic. In order of importance, the first topic reflects the broader question of civil society and the public sphere, whereas the others focus on specific actions or contexts such as election campaigns—though of course ideally these issues should strongly intersect. From the analysis in Table 16.3 it is clear that the idea of the public sphere is a key topic in the academic debate around the impact of digital media on politics. This is clearly articulated in works by highly influential authors (e.g., Castells, 2008) but also in many individual studies. As we noted previously, one of the general findings from the content analysis was the utilization of very wide-ranging ideas or publications as "scene setters"—such as the idea from Castells (1996) of the "network society"—but without the detail of this work being substantively engaged with. This appears to be the case with the idea of the public sphere (Habermas, 1991), where the Habermasian concept is repeatedly pointed to without the full theoretical model being employed. As we noted earlier, the longitudinal view over the last two decades points to a shift from the focus on the public sphere to one on more individualistic issues of participation and engagement. There also appears to be a shift away from the earlier literature that tended to focus on the potentials for digital media to support the public sphere and deliberative democracy. Papers exploring the idea of the public sphere were not uncritical but utilized the concept to examine the potentials for deliberative democracy through digital media either theoretically (e.g., Dahlberg, 2001; Dahlgren, 2005) or through the analysis of interactions (e.g., Papacharissi, 2004). The focus has since shifted towards the analysis of actual network interactions (often via social media) and the extent to which political engagement, influence, and action are developed. In the time between the literature analysis undertaken by the ESRC project and the current publication, the focus has shifted somewhat to the failings of the public sphere and the rise of "echo chambers" and "mini publics" (see Frame & Brachotte, 2015). This concern clearly comes through in the Delphi analysis of experts reported later in the chapter. As with several of the other domains, Twitter and social network analysis are two prominent and linked topics. The intersection of Twitter, politics, and citizenship is fraught with challenges, especially as both the technology itself and its uses have continued to change over the last decade (Bimber et al., 2015). As a result, it may not always be helpful to draw comparisons with traditional political behavior. Within the actual study of the use of Twitter in politics and political action there is a focus on collective action. For example, Kende, van Zomeren, Ujhelyi, and Lantos (2016) proposed and tested how the social affirmation use of social media motivates individuals for collective action to achieve social change.
In this frame of work, a good number of analyses are focused on the nature of

social capital, or psychological group membership measures, as routes to understanding political social action undertaken through or supported by Twitter. Much of this focuses on young citizens, but often has a strong element of networked individualism (e.g., Rainie & Wellman, 2012), with the examination of the potential for platforms such as Facebook, Twitter, and YouTube to influence political stances and civic engagement (see Loader et al., 2014). A reasonably comprehensive overview can be found in Weller et al. (2014). There are many papers that take on board the theory or orientation of social network analysis (SNA) in their approach, such as the ideas of weak and strong ties or network power, or reference SNA-based studies, though they do not necessarily use specific SNA methods. Examples include political participation (e.g., Bennett, 2012; Chan, 2016), campaigning (Bruns & Highfield, 2013), and influence (e.g., Gruzd et al., 2014). The specific use of SNA methods was in fact very limited (see Table 16.6) and was to be found in papers with a strong methodological focus. These may not take on politics and citizenship directly but elucidate how influence may spread in both digital and non-digital communications networks and how these interact (e.g., Haythornthwaite, 2002). More pragmatically, the literature focuses on actual practices and online behaviors. Activism and protest appear in more recent literature, with authors focusing on the role of social and networked media in the engagement and organization of politics. For example, Agarwal et al. (2014) compare the use of digital media by two very different political groups: the Republican Tea Party movement and the Occupy Wall Street movement. Other studies attempt to analyze the links between the types of social media used, contexts of interaction with similar or other groups, and likely political participation (e.g., Kim & Chen, 2016). Some studies try to assess the extent to which online activity leads to other forms of political action (digital or not), from voting to collective action (e.g., Schumann & Klein, 2015; Theocharis & Lowe, 2016). In regard to elections this work goes back to the mid-1990s (e.g., Yates & Perrone, 1998) and early 2000s (e.g., Coleman, 2001a, 2001b), with a strong focus on the United Kingdom and the United States (e.g., Foot & Schneider, 2002). The breadth and variety of this work has grown extensively over the past decade to include a wider variety of nations and forms of electoral process (e.g., Gadekar et al., 2011; Vromen, 2015). The issue of "second screen" communication in the electoral context has also been receiving increasing attention, for example during important televised campaign debates in various countries. Furthermore, questions of online hate or partisan interaction are at this time a key issue. Studies range from analyses of homophily in political group membership (e.g., Colleoni et al., 2014) through arguments that social and digital media use have gone hand in hand with more personalized politics (e.g., Bennett, 2012). This then bleeds over into issues of digital governance and online crime, be it terrorism or hate crime.
We would argue therefore that there appears to be a general, though not universal, shift in the literature over the last two decades, from ideas of the potential role of digital media in the broader public sphere to much more focused, analytics-based assessments of the specifics of network dynamics in regard to political action, engagement, and participation. This situation is seen in a range of areas, including political communication and news, connective action, and media hybridity. In the area of political communication,

research has identified that people access their news using both social media and mainstream (whether public or commercial) media (Rainie et al., 2011; Oxford Reuters Institute for the Study of Journalism, 2016). The wider media environment for political communication has resulted in the development of hybrid media systems in political communication (Chadwick & Dennis, 2017). What Chadwick and Dennis argue is that both traditional media communication and social media communication are configured in different ways to reach different social groups. There is an organizational dimension to this that draws on the networked logic of social media; here, both connective action and collective action are mobilized (Bennett & Segerberg, 2013).

Theory, Method, and Approach

As chapter 2 describes, the content analysis builds on Borah's (2017) approach to analyzing a set of communications and media literature in regard to digital media use. Table 16.5 provides the results with regard to the empirical approach taken in the literature. The majority of the papers (45%) undertook primary data collection, with 33% being theoretical synthesis of current or prior work. The main disciplines from which theory was used or for which theory was developed were: politics and public administration (48.6%), sociology (28.0%), and communication and media (14.3%). It is important to note that only actual use of theory for the purposes of design, synthesis, or analysis was coded. General references to prior work and theory, such as broad reference to the "network society" (Castells) or "public sphere" (Habermas), were not coded. This distinction is important, as it highlights the use of theory to design and analyze data or synthesize materials, as distinct from more general discussion. There was considerable variety in the specific theories applied from these disciplines, with no clear preference. Ideas of the public sphere (6%) and political participation (5%) were the most common in the political science literature. The main research methods were literature reviews (33%), surveys (29%), content analysis (8%), and interviews (7%; see Table 16.6). The majority of the empirical work focused on specific groups (e.g., Facebook users) with a limited number of general population studies (see Table 16.7). Where new data was analyzed, the majority (53%) of the analyses were qualitative, though the methods varied, the remainder being statistical (see Table 16.8). Only one study overtly stated that it was using a "big data" approach. Compared to the other domains, there is a stronger emphasis on both empirical data collection and quantitative analysis in the literature analyzed here. As with all the domains, both big data and SNA are so far only used to a limited extent.
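The percentages in Tables 16.5 through 16.8 are simple frequency tabulations over the coded papers. A minimal sketch, assuming a hypothetical coding sheet with one row per paper and one column per coded variable (the column names and values below are illustrative, not the project's actual codebook or data):

```python
import pandas as pd

# Hypothetical coding sheet: one row per reviewed paper, one column per coded
# variable in a Borah-style scheme (values are illustrative only).
coded = pd.DataFrame({
    "empirical_approach": ["Primary empirical", "Theoretical", "Primary empirical",
                           "Discursive/descriptive", "Secondary empirical"],
    "research_method":    ["Survey", "Literature review", "Content analysis",
                           "Literature review", "Survey"],
    "analytic_approach":  ["Statistical", "Qualitative", "Qualitative",
                           "None", "Statistical"],
})

# Percentage breakdowns of the kind reported in Tables 16.5-16.8.
for column in coded.columns:
    pct = coded[column].value_counts(normalize=True).mul(100).round(1)
    print(f"\n{column}\n{pct.to_string()}")
```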

Delphi Review

The literature review analysis explored the themes found within recent research publications. The following sections detail the results of the Delphi process for the Citizenship

Table 16.5  Empirical Approach
Primary empirical (data collected and analyzed): 45.1%
Theoretical (synthesis of current or prior work): 33.3%
Discursive/descriptive (no new data or theory): 13.7%
Secondary empirical (analysis of existing data): 7.8%

Table 16.6  Research Method
Literature review (general or narrative): 32.7%
Survey: 28.6%
Content analysis: 7.8%
Interview(s): 6.9%
Theory building: 6.9%
Other: 4.2%
Experiment: 3.2%
Ethnography: 3.2%
Focus groups: 2.8%
Social network analysis: 1.8%
Textual (linguistic-discourse analysis): 0.9%
Meta-analysis or systematic review: 0.9%

Table 16.7  Study Population
Specific group: 48.8%
General population: 33.7%
Case study(ies): 17.4%

Table 16.8  Analytic Approach
Qualitative (textual-non-discourse): 53.5%
Statistical (numerical): 32.2%
None: 8.3%
Not applicable: 5.2%
Discourse (textual-linguistic-discourse): 0.9%

and Politics domain. There were three parts to the Delphi review: an initial survey, a confirmatory questionnaire to address the findings from the survey, and a confirmatory workshop. The goal of the Delphi process was to identify and prioritize areas for future research. These might include areas already covered by the literature but also new concerns, or the need for a tighter focus on a specific issue. The process sought to identify suggested future scoping or research questions, key topics to address within these questions, and key challenges that might be encountered when researching these questions.

Future Research and Scoping Questions

Given the amount of input to the Delphi process for this domain, the suggestions for scoping and research questions were coded into eight categories and 36 specific questions, which were then grouped into 21 questions, as detailed in Table 16.9. The process used two measures to assess the importance of these questions. The first was how

Table 16.9  Delphi Review Scoping Questions

Digital technologies, radicalization, mobilization, and political action: In what ways do digital technologies impact traditional forms of mobilization, collective action, and/or political participation? How have "negative" online behaviors (such as trolling and flaming) impacted on civic and political activity?

Digital technologies, emancipation, agency, and control: How and in what ways are digital technologies challenging or reinforcing existing power relations? What are the impacts on our autonomy, agency, dignity, and privacy?

Digital technologies and the disruption of current political institutions: How do new technologies disrupt and challenge incumbent political institutions? What are the opportunities and challenges facing democracy in an age of digital participation? How do social media affect the quality of democracy/citizenship? And what about non-democratic states?

Digital technologies, political identity, emotion, and empowerment: Does access to digital technologies have a positive emotional impact on citizens, making them feel empowered, with a voice and potential influence?

Digital technologies and new forms of citizenship: How does technology enlarge or change our understanding of, and interaction with, citizens outside of our own national borders? What constitutes citizenship? Is it meaningful to talk about digital citizenship? Does digital expand the notion or simply provide a new space for the exercising of citizenship rights and duties? How are youth engaging with digital technologies and online politics?

Digital technologies and governance: How does technology improve governance (i.e., government's responsiveness to citizen concerns and ability to effectively manage competing interests)? Does electronic governance transform relationships between states and citizens and the nature of politics?

Digital technologies, groups, and elites: How do political elites use digital media? How do old and new parties use new technologies and with what consequences? Does new media promote populism? How do emerging media platforms impact the ongoing digital divide?

Digital technologies, political communication, debate, and media: How do new ecosystems of information and delivery impact on political participation, opinion forming, and education? How do people perceive "success" in online political participation? How does digital media interact with traditional media in shaping public opinion?

Table 16.10  Delphi Review Scoping Questions Ranked by Number of Cases and by Importance
(Question category: rank by number of cases / rank by importance, with percent)

Digital technologies, radicalization, mobilization, and political action: 3 / 1 (21)
Digital technologies and the disruption of current political institutions: 1 / 2 (17)
Digital technologies and new forms of citizenship: 6 / 3 (16)
Digital technologies, political communication, debate, and media: 2 / 3 (16)
Digital technologies and governance: 8 / 4 (12)
Digital technologies, emancipation, agency, and control: 4 / 5 (10)
Digital technologies, political identity, emotion, and empowerment: 5 / 6 (6)
Digital technologies, groups and elites: 7 / 7 (1)

frequently the suggestion came up in the Delphi survey, and the second was how highly the Delphi interviewees ranked the importance of these topics. Table 16.10 shows the ranking of these categories by the number of questions allocated to the category and by their ranked importance from the confirmatory survey. It is important to note that ranked importance is almost the same in both tables. As chapter 25 describes, there are a number of

areas identified in the scoping questions and challenges that are cross-cutting, a key one of these being governance. As a result, there are also some strong overlaps with the Governance and Security domain (see chapters 22 and 23) that will be addressed there.
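The two Delphi measures described above reduce to straightforward counting: how often a category was raised in the open round, and how it was rated in the confirmatory survey. A minimal sketch with hypothetical category labels and ratings (not the project's data):

```python
from collections import Counter

# Hypothetical Delphi data: coded suggestions (one category per suggestion) and
# confirmatory-survey ratings on a 5-point scale (5 = very important).
suggestions = ["mobilization", "institutions", "citizenship", "mobilization",
               "communication", "institutions", "mobilization"]
ratings = {"mobilization": [5, 5, 4], "institutions": [5, 4, 4],
           "citizenship": [4, 4, 3], "communication": [5, 3, 3]}

# Rank 1: how frequently each category came up in the open Delphi round.
by_cases = Counter(suggestions).most_common()

# Rank 2: share of "very important" (5) ratings in the confirmatory survey,
# mirroring the ordering used in Tables 16.10 and 16.12.
by_importance = sorted(
    ((cat, 100 * sum(r == 5 for r in rs) / len(rs)) for cat, rs in ratings.items()),
    key=lambda item: item[1], reverse=True)

print("By number of cases:", by_cases)
print("By % rated very important:", [(c, round(p, 1)) for c, p in by_importance])
```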

Key Topics

If we turn next to specific topics that might cross-cut these questions, we find that the topics most commonly cited in the Delphi process were also those deemed most important in the confirmatory survey (see Table 16.11 and Table 16.12). These topics also closely match the proposed research and scoping questions. Given the number and detail of the scoping questions provided in the initial rounds of the Delphi process, this overlap was highly likely. One reason for this was that respondents interpreted the two-level structure of overarching questions and the topics within them differently, so that what were questions for some respondents were topics for others. This distinction appeared to be clearer for other domains, where the volume of responses was lower. Overall, though, this does provide reinforcing evidence, along with the broad support of the consultation workshop, for the relevance of the questions and topics. We do note, however, that a key comment made in the confirmatory workshop was that the literature and Delphi work had not really addressed the issue of digital media use in the context of major state control and censorship. We agree that this is a topic that appears not to have had as much attention in either the literature we surveyed or the Delphi review. If we leave aside the governance issues for chapters 22 and 23, then it is clear that the Delphi panel, confirmatory survey, and the workshop see the future focus for research

Table 16.11  Key Topics Ranked by Percent of Delphi Survey Responses

Divides 8; Mobilization 8; Talk 7; Control 6; Data 6; Media 6; Other 6; Participation 6; Citizenship 5; Engagement 4; Governance 4; Privacy 4; Identity 3; Methods 3; Technologies 3; Civic 2; Commercial 2; Cultural 2; Direct democracy 2; Empowerment 2; Geopolitics 2; Policy 2; Trust 2; Young people 2; Contestation 1; Parties 1; Populism 1; State 1

Table 16.12  Key Topics Ranked by Importance from Delphi Survey
(Very important / Important / Neutral / Unimportant / Very unimportant)

Governance in a digital age: 51.9% / 37.0% / 11.1% / 0.0% / 0.0%
Political mobilization via digital media: 48.1% / 40.7% / 7.4% / 3.7% / 0.0%
Digital and state control: 48.1% / 37.0% / 11.1% / 3.7% / 0.0%
Citizenship in a digital age: 48.1% / 33.3% / 14.8% / 3.7% / 0.0%
Data—big, small, and citizen: 44.4% / 37.0% / 14.8% / 3.7% / 0.0%
Political participation and engagement: 44.4% / 37.0% / 14.8% / 3.7% / 0.0%
Privacy in a digital age: 40.7% / 40.7% / 11.1% / 3.7% / 0.0%
Political media, old and new: 29.6% / 44.4% / 18.5% / 7.4% / 0.0%
Digital divides: 22.2% / 59.3% / 11.1% / 7.4% / 0.0%
Political identity in a digital age: 22.2% / 48.1% / 29.6% / 0.0% / 0.0%
Online debate and interaction: 18.5% / 70.4% / 11.1% / 0.0% / 0.0%

to be focused on citizenship, participation, and engagement, with specific concerns about mobilization and radicalization. The questions about empowerment, the public sphere, links to older media, and emancipation that are present in some of the earlier literature in this domain have moved down the list of priorities. Importantly, there is a growing concern about how digital technologies are disrupting politics and political institutions.

Research Challenges

Our final set of questions asked the Delphi panel about the challenges that may be faced in undertaking research in these areas. These were placed into 14 categories and ranked by the number of coded items (Table 16.13). None of the main challenges was deemed to be specific to any of the seven domains by the consultation workshop. Table 16.14 shows the ranking of these by the confirmatory survey. Such cross-cutting topics and challenges are discussed in chapter 25, the concluding chapter.

Conclusion

The concept and topic mapping analyses generated very similar results, and these closely map onto the Delphi results. The close mapping of the Delphi and literature analyses

Table 16.13  Challenges Ranked by Percent of Cases

Methods 42; Theory 14; Big data 12; Epistemology/ontology 7; Ethics 6; Psychology 5; Technology 4; Exclusion 2; Education 1; Funding 1; Impact 1; Individualism 1; Policy 1; Training 1

Table 16.14  Challenges Ranked by Importance from Delphi Survey
(Very important / Important / Neutral / Unimportant / Very unimportant)

Developing new theory: 55.6% / 37.0% / 7.4% / 0.0% / 0.0%
Developing new methods: 44.4% / 33.3% / 18.5% / 3.7% / 0.0%
Dealing with "big data": 44.4% / 33.3% / 18.5% / 3.7% / 0.0%
Ethics: 37.0% / 51.9% / 7.4% / 0.0% / 3.7%
Epistemological and ontological issues: 37.0% / 25.9% / 25.9% / 7.4% / 3.7%

potentially indicates that this is a well-developed domain of research with clear foci. The consensus around the consolidation of research questions in the consultation workshop reinforces this. There may be a number of clear reasons for this emphasis. Political communication and behavior are substantive aspects of both communication studies and political science. These are both areas that have been dramatically affected in very public ways by digital media, in contrast to the very real but less visible impacts of digital technologies on governance or public policy. There are also indications that the visibility of digital media—from the web to social media—has made (at least some) processes of political communication very visible and open to analysis. Given that the literature and the Delphi recommendations strongly overlap, the research has not identified any clear new topic gaps to highlight for future work. Rather, the Delphi work appears to confirm the patterns found in the literature, with a move away from assessing the potential of digital media for deliberative politics, development of the public sphere, and broad civic engagement, towards a clearer focus on the role of networks in political mobilization, influence, partisan politics, and more

individualistic measures of engagement and political action. This may be a reaction to political changes experienced over the last five years that have often been associated with the use of digital media—such as the Arab Spring, Brexit, and the election of Donald Trump. It might also reflect the nature of available data (e.g., Twitter and Facebook posts) or the nature of these media, which are more notably individualistic in form than either prior mass media or even older forms of digital media (e.g., Rheingold, 1993). The disruption caused by social media itself, and by its use and misuse in politics (cf. the Cambridge Analytica scandal), raises a further clear set of contemporary questions. Yet underneath this is both an empirical and theoretical need to fully understand what our current social media-based politics is telling us about citizens' behavior and political processes, and vice versa. We would argue that for the health of democratic institutions, there is a need to empirically understand contemporary political behavior and participation in the context of digital technology use. As a word of caution, in the other domains we noticed a "platform focus" in many studies. In the case of politics, an example might be a focus on political uses of Twitter, in contrast to broader studies of the full range of digital media that citizens may utilize for political communication. Though there are examples of platform focus, it does not appear as pronounced in relation to political research as in other domains. As with the other domains, we believe that the complexity and variety of potential work warrants consideration of all the questions, topics, and challenges identified. Noting this, we would argue that the analysis here has identified four key areas for future research:

1. "Digital technologies," radicalization, mobilization, and political action
2. "Digital technologies" and the disruption of current political institutions
3. "Digital technologies" and new forms of citizenship
4. "Digital technologies," political communication, debate, and media

We note that the Governance and Security domain significantly addresses the issue of “Digital technologies and governance,” which was the top ranked topic in the confirmatory survey. The other key topics identified fit within these four scoping areas, except for Digital and state control. This topic, identified as well in comments at the consultation workshop, reminds us that not all politics are democratic and there is no necessary causal link between digital media use and open societies.

Note 1. As part of the review, The Digital Humanities Institute at the University of Sheffield applied concept modelling techniques to a curated corpus of 1,900 journal articles from the period 1968 to 2017. Concept modelling is a computational linguistic process that involves identifying the emergence of concepts, or key ideas, via lexical relationships. For the purposes of the review, lexical relationships were limited to high-frequency ­co-occurrences of terms as pairs and trios. The process is entirely data driven and resulted

in 2 million rows of data. The website https://www.dhi.ac.uk/waysofbeingdigital/ provides access to the top 50 most frequently occurring pairs and trios through a series of data visualizations. Click on View Data Visualisations at the top. Then check/submit which of the seven ESRC domains you are interested in (including all). Then choose the visualization. These show configurations across selected time frames. Choose bubble chart, tree map, zoomable pack layout, or network diagram, by individual subject or by all seven subjects combined, by document or concept frequency. You can similarly search the analyzed documents (all, by subject, author, concept, concept trio, and year) by clicking on Browse Articles at the top. Also, see https://waysofbeingdigital.com/literatureanalysis-interactive-results/ for interactive visualizations with mouse-overs of the main clusters of concepts within each domain and the relative frequency of concepts associated with each cluster.

References Agarwal, S. D., Barthel, M. L., Rost, C., Borning, A., Bennett, W. L., & Johnson, C. N. (2014). Grassroots organizing in the digital age: Considering values and technology in Tea Party and Occupy Wall Street. Information, Communication & Society, 17(3), 326–341. Bennett, L. (2012). The personalization of politics: Political identity, social media, and changing patterns of participation. Annals of the American Academy of Political and Social Science, 644(1), 20–39. Bennett, W. L. & Segerberg, A. (2013). The logic of connective action: Digital media and the personalization of contentious politics. Cambridge, UK: Cambridge University Press. Bimber, B., Cunill, M. C., Copeland, L., & Gibson, R. (2015). Digital media and political participation: The moderating role of political interest across acts and over time. Social Science Computer Review, 33(1), 21–42. Borah, P. (2017). Emerging communication technology research: Theoretical and methodological variables in the last 16 years and future directions. New Media & Society, 19(4), 616–636. Bruns, A. & Highfield, T. (2013). Political networks on Twitter. Information, Communication & Society, 16(5), 667–669. Castells, M. (1996). The rise of the network society: The information age: Economy, society, and culture (Vol. I) (Information Age Series). London: Blackwell. Castells, M. (2008). The new public sphere: Global civil society, communication networks, and global governance. Annals of the American Academy of Political and Social Science, 616(1), 78–93. Chadwick, A. & Dennis, J. (2017). Social media, professional media and mobilisation in contemporary Britain: Explaining the strengths and weaknesses of the citizen’s movement 38 Degrees. Political Studies, 65(1), 931–950. Chan, M. (2016). Social networks sites and political engagement: Exploring the impact of Facebook connections and uses on political protest and participation. Mass Communication and Society, 19(4), 430–451. Coleman, S. (2001a). Online campaigning. Parliamentary Affairs, 54, 679–688. Coleman, S. (2001b). Click for democracy. The World Today, 57(5), 17–18.

Colleoni, E., Rozza, A., & Arvidsson, A. (2014). Echo public or public sphere? Predicting political orientation and measuring political homophily in Twitter using big data. Journal of Communication, 64, 317–332.
Dahlberg, L. (2001). The Habermasian public sphere encounters cyber-reality. The Public, 8(3), 83–96.
Dahlgren, P. (2005). The internet, public spheres, and political communication: Dispersion and deliberation. Political Communication, 22(2), 147–162.
Dahlgren, P. (2013). The political web: Media, participation and alternative democracy. Basingstoke: Palgrave MacMillan.
Foot, K. A. & Schneider, S. M. (2002). Online action in campaign 2000: An exploratory analysis of the US political Web sphere. Journal of Broadcasting & Electronic Media, 46(2), 222–244.
Frame, A. & Brachotte, G. (Eds.). (2015). Citizen participation and political communication in a digital world. Abingdon, UK: Routledge.
Gadekar, R. et al. (2011). Web sites for e-electioneering in Maharashtra and Gujarat, India. Internet Research, 21(4), 435–457.
Gruzd, A. et al. (2014). Geography of Twitter networks. Social Networks, 34, 37–81.
Habermas, J. (1991). The structural transformation of the public sphere: An inquiry into a category of bourgeois society. Cambridge, MA: The MIT Press.
Haythornthwaite, C. (2002). Strong, weak, and latent ties and the impact of new media. Information Society, 18(5), 385–401.
Kende, A., van Zomeren, M., Ujhelyi, A., & Lantos, N. A. (2016). The social affirmation use of social media as a motivator of collective action. Journal of Applied Social Psychology, 46(8), 453–469.
Kim, Y. & Chen, H.-T. (2016). Social media and online political participation: The mediating role of exposure to cross-cutting and like-minded perspectives. Telematics and Informatics, 33, 320–330.
Loader, B. D., Vromen, A., & Xenos, M. A. (2014). The networked young citizen: Social media, political participation and civic engagement. Information, Communication & Society, 17(2), 143–150.
Oxford Reuters Institute for the Study of Journalism. (2016). Reuters Institute Digital News Report. http://reutersinstitute.politics.ox.ac.uk/sites/default/files/research/files/Digital%2520News%2520Report%25202016.pdf Retrieved July 3, 2018.
Papacharissi, Z. (2004). Democracy online: Civility, politeness, and the democratic potential of online political discussion groups. New Media & Society, 6(2), 259–283.
Papacharissi, Z. (2015). Affective publics: Sentiments, technology and politics. Oxford, NY: Oxford University Press.
Rainie, L., Purcell, K., & Smith, A. (2011). The social side of the Internet. Pew Research Center. http://www.pewinternet.org/2011/01/18/the-social-side-of-the-internet/ Retrieved July 3, 2018.
Rainie, L. & Wellman, B. (2012). The individual in a networked world: Two scenarios. The Futurist, July–August, 24–28.
Rheingold, H. (1993). The virtual community: Finding connection in a computerized world. Boston, MA: Addison-Wesley Longman Publishing Co., Inc.
Schumann, S. & Klein, O. (2015). Substitute or stepping stone? Assessing the impact of low-threshold online collective actions on offline participation. European Journal of Social Psychology, 45, 308–322.

Sunstein, C. (2001). Echo chambers: Bush v. Gore, impeachment and beyond. Princeton, NJ: Princeton University Press.
Theocharis, Y. & Lowe, W. (2016). Does Facebook increase political participation? Evidence from a field experiment. Information, Communication & Society, 19(10), 1465–1486.
Vromen, A. (2015). Campaign entrepreneurs in online collective action: GetUp! in Australia. Social Movement Studies, 14(2), 195–213.
Weller, K., Bruns, A., Burgess, J. E., Mahrt, M., & Puschmann, C. (2014). Twitter and society: An introduction. In K. Weller et al. (Eds.), Twitter and society (pp. xxix–xxxvii). New York: Peter Lang.
Yates, S. J. & Perrone, J. L. (1998). Politics on the Web. Assignation, 15(4), 49–53.

Table 18.11  Key Topics Ranked by Percent of Delphi Survey Responses

Social impacts 20; Privacy and surveillance 18; Citizens/everyday life 16; Open data/algorithm transparency/accountability 16; Exclusion/inclusion/divides 12; Data visualization/social construction 6; Methods 6; Digital identity 4; Economics 4

Table 18.12  Key Topics Ranked by Importance from Delphi Survey
(Very important / Important / Neutral / Unimportant / Very unimportant)

Social impacts of data: 86.7% / 13.3% / 0.0% / 0.0% / 0.0%
Privacy and surveillance: 60.0% / 33.3% / 6.7% / 0.0% / 0.0%
Citizens/everyday life experiences and uses of data: 53.3% / 33.3% / 13.3% / 0.0% / 0.0%
Understanding open data/algorithm transparency/accountability: 53.3% / 33.3% / 13.3% / 0.0% / 0.0%
Data exclusion/inclusion/divides: 40.0% / 53.3% / 6.7% / 0.0% / 0.0%
Digital identity and data: 40.0% / 33.3% / 20.0% / 6.7% / 0.0%
Data visualization/representation/social construction of data: 40.0% / 13.3% / 46.7% / 0.0% / 0.0%
Research methods: 26.7% / 33.3% / 33.3% / 6.7% / 0.0%
Economic impacts: 20.0% / 66.7% / 0.0% / 13.3% / 0.0%

The Delphi panel identified eight categories of challenges in undertaking research in this domain: methods; social theory and social questions; access to data; data literacy; education, ethics; inequality/exclusion/inclusion/divides; and interdisciplinarity (see Table 18.14). Over half of the cases involved methods issues, so this category has been further divided into analytics and measurement; combining old and new social research methods; concepts; social measures; and understanding and developing new research methods. Over half of the participants ranked the three challenges of ethics; data inequality/exclusion/inclusion/divides; and interdisciplinary working as “very important” (Table 18.15). Although the ethics and inequality challenges rankings do not match with their percentages in Table 18.14, these are, nonetheless, key cross-cutting issues. The challenges identified point towards specific concerns in working across the social

Table 18.13  Data-focused Topics and Challenges

Datafication: Ownership, exploitation, rights, boundaries, new sources; How is data being stored and by whom?; Data bias: inequity and stereotypes in the data?; Archiving: tools, algorithms, and processes

Data literacies: Making data and processes visible; Domain and general literacy; People who do not want to/cannot be "datafied"

Privacy, security, and trust: Needs to know more about the difference between personal and machine data; Access and permissions; Citizen choice in data creation and use; Unintended consequences

The future?: Need to think beyond the current data environment

Data interpretation: Beyond data to meaning; AI and IoT and how they use data; Algorithms and meaning; Data semantic gap; Accountability, social values, and transparency

Table 18.14  Challenges Ranked by Percent of Cases

Methods: 57.9
  Analytics and measurement: 7.9
  Combining old and new social research methods: 7.9
  Concepts: 15.8
  Social measures: 5.3
  Understanding and developing new research methods: 21.1
Social theory and social questions: 7.9
Access to data: 5.3
Data literacy: 5.3
Education: 5.3
Ethics: 7.9
Inequality/exclusion/inclusion/divides: 5.3
Interdisciplinarity: 5.3

sciences, information studies, and computer science disciplines, as the tools and methods being used often originate in computer science and information studies, but must be integrated with or translated into social science. This was the only area where there was explicit comment on the need to provide higher education support to develop and train both students and researchers in new methods and deeper data literacy.

Table 18.15  Challenges Ranked by Importance from Delphi Survey
(Very important / Important / Neutral / Unimportant / Very unimportant)

Ethics: 66.7% / 26.7% / 6.7% / 0.0% / 0.0%
Data inequality/exclusion/inclusion/divides: 53.3% / 40.0% / 6.7% / 0.0% / 0.0%
Interdisciplinary working (computing and social science): 53.3% / 26.7% / 13.3% / 6.7% / 0.0%
Methods—combining old and new social research methods: 46.7% / 26.7% / 20.0% / 6.7% / 0.0%
Social theory and social questions: 40.0% / 53.3% / 6.7% / 0.0% / 0.0%
Methods—concepts: 40.0% / 33.3% / 26.7% / 0.0% / 0.0%
Higher education and training: 40.0% / 20.0% / 40.0% / 0.0% / 0.0%
Access to data: 20.0% / 60.0% / 20.0% / 0.0% / 0.0%
Methods—analytics and measurement: 20.0% / 53.3% / 26.7% / 0.0% / 0.0%
Methods—social measures: 20.0% / 53.3% / 20.0% / 6.7% / 0.0%
Data literacy: 20.0% / 46.7% / 33.3% / 0.0% / 0.0%

To conclude, this domain clearly separated out a set of social science research questions and areas, with topics that mixed both research and methods issues. Challenges were predominantly around methods.

Conclusion

Contemporary research in the Data and Representation domain studied here appears to have focused on: data methods (science and methods, big data, and Google), data sources (social media and mobile), areas of focus (global and urban culture, consumer services, health, law and hate speech, gender, Twitter and politics, governance, and cybercrime), and other topics (ethics and impact). These areas closely match the areas identified by the Delphi process. These include social research questions (citizen and community use of data, citizen interaction with data and algorithms, data literacy, power and accountability for data and algorithms, social construction of data and algorithms, and social implications of data and automation). They also involve social research topics and challenges (social impacts of data, privacy and surveillance, citizens/everyday life experiences and uses of data, understanding open data/algorithm transparency/accountability, data exclusion/inclusion/divides, digital identity and data, data visualization/representation/social construction of data, and economic impacts). And they include methods challenges (interdisciplinarity, analytics and measurement, combining old and new social research methods, concepts, social measures, and understanding and developing new research methods).

Missing from this domain are substantive empirical studies of either the research questions or the implementation of digital methods. We would argue that this domain therefore needs to develop a set of robust case studies addressing the key research questions identified by the Delphi process.

Note 1. As part of the review, The Digital Humanities Institute at the University of Sheffield applied concept modelling techniques to a curated corpus of 1,900 journal articles from the period 1968 to 2017. Concept modelling is a computational linguistic process that involves identifying the emergence of concepts, or key ideas, via lexical relationships. For the purposes of the review, lexical relationships were limited to high-frequency co-occurrences of terms as pairs and trios. The process is entirely data driven and resulted in 2 million rows of data. The website https://www.dhi.ac.uk/waysofbeingdigital/ provides access to the top 50 most frequently occurring pairs and trios through a series of data visualizations. Click on View Data Visualisations at the top. Then check/submit which of the seven ESRC domains you are interested in (including all). Then choose the visualization. These show configurations across selected time frames. Choose bubble chart, tree map, zoomable pack layout, or network diagram, by individual subject or by all seven subjects combined, by document or concept frequency. You can similarly search the analyzed documents (all,  by subject, author, concept, concept trio, and year) by clicking on Browse Articles at the top. Also, see https://waysofbeingdigital.com/literature-analysis-interactive-results/ for interactive visualizations with mouse-overs of the main clusters of concepts within each Domain, and the relative frequency of concepts associated with each cluster.

References Ajunwa, I., Crawford, K., & Ford, J. S. (2016). Health and big data: An ethical framework for health information collection by corporate wellness programs. Journal of Law, Medicine & Ethics, 44(3), 474–480. Borah, P. (2017). Emerging communication technology research: Theoretical and methodological variables in the last 16 years and future directions. New Media & Society, 19(4), 616–636. boyd, d. (2015). Making sense of teen life: Strategies for capturing ethnographic data in a networked era. In E. Hargittai & C. Sandvig (Eds.), Digital research confidential: The secrets of studying behavior online (pp. 79–102). Cambridge, MA: The MIT Press. boyd, d. & Ellison, N.  B. (2007). Social network sites: Definition, history, and scholarship. Journal of Computer-Mediated Communication, 13(1), 210–230. boyd, d. & Crawford, K. (2012). Critical questions for big data: Provocations for a cultural, technological, and scholarly phenomenon. Information, Communication & Society, 15(5), 662–679. Crampton, J.  W., Graham, M., Poorthuis, A., Shelton, T., Stephens, M., Wilson, M.  W., & Zook, M. (2013). Beyond the geotag: Situating “big data” and leveraging the potential of the geoweb. Cartography and Geographic Information Science, 40(2), 130–139. Crawford, K., Gray, M. L., & Miltner, K. (2014). Big Data| Critiquing Big Data: Politics, ethics, epistemology| Special section introduction. International Journal of Communication, 8, 10.

ESRC Review: Data and Representation   525 Elmer, G. (2013). Live research: Twittering an election debate. New Media & Society, 15(1), 18–30. Elwood, S. (2006a). Negotiating knowledge production: The everyday inclusions, exclusions, and contradictions of participatory GIS research. The Professional Geographer, 58(2), 197–208. Elwood, S. (2006b). Beyond cooptation or resistance: Urban spatial politics, community ­organizations, and GIS-based spatial narratives. Annals of the Association of American Geographers, 96(2), 323–341. Kennedy, H., Hill, R. L., Aiello, G., & Allen, W. (2016). The work that visualisation conventions do. Information, Communication & Society, 19(6), 715–735. Latour, B. (2010). Tarde’s idea of quantification. In M. Candea (Ed.), The social after Gabriel Tarde: Debates and assessments (pp. 129–144). London: Routledge. Latour, B. & Woolgar, S. (1986). Laboratory life: The construction of scientific facts. Princeton, NJ: Princeton University Press. McKee, H.  A. (2011). Policy matters now and in the future: Net neutrality, corporate data ­mining, and government surveillance. Computers and Composition, 28(4), 276–291. Mordini, E., Wright, D., Wadhwa, K., De Hert, P., Mantovani, E., Thestrup, J., . . . Vater, I. (2009). Senior citizens and the ethics of e-inclusion. Ethics and Information Technology, 11(3), 203–220. Negroponte, N. (1995). Being digital. New York: Vintage. Rice, R. E. & Hoffman, Z. T. (2018). Attention in business press to the diffusion of attention technologies, 1990–2017. International Journal of Communication, 12. http://ijoc.org/index. php/ijoc/article/view/8250/2424 Ruppert, E., Law, J., & Savage, M. (2013). Reassembling social science methods: The challenge of digital devices. Theory, Culture & Society, 30(4), 22–46. Summerfield, P. (1985). Mass-Observation: Social research or social movement? Journal of Contemporary History, 20(3), 439–452. Tapscott, D. (1999). Growing up digital: The rise of the net generation. New York: McGraw-Hill. Wright, D. (2011). A framework for the ethical impact assessment of information technology. Ethics and Information Technology, 13(3), 199–226.

chapter 19

Digita l Citizenship i n the Age of Data fication Arne Hintz

Introduction We increasingly interact with our social and political environment through digital infrastructures. Digital tools and platforms have become essential for us to participate in society. Political engagement takes place through social media debates and online campaigns; access to services requires online interactions between citizens and the state or commercial providers; and a plethora of digital traces affect and condition human activity, from crossing borders to operating a phone. Everyday cultural practices involve social media exchanges, and business transactions are mediated by the platform economy. In these (and many more) ways, we increasingly enter the sphere of civic activity— and develop agency—through digital media. Digital citizenship has emerged as a concept to describe this condition. It has acquired increasing popularity although, or perhaps because, it lacks a narrow and commonly agreed definition, and it is used for various purposes and to understand different practices. It has been applied to describe the centrality of digital infrastructure in contemporary social interactions, the implications for people’s identities and forms of belonging, and necessary skillsets and conditions. Yet in most iterations it tries to understand the relation between the digital and the political, and thus the role of the digital subject as political subject. While this may include being a subject to an authority, such as the state, most accounts of digital citizenship have been interested in the digital citizen as a subject of his or her own making. They have thus departed from classic understandings of the citizen as defined through membership of a nation-state, and have focused instead on the self-creation and self-assertion of citizens as active participants in society through digital acts. Such acts may include, for example, the use of digital tools for

Digital Citizenship in the Age of Datafication   527 citizen journalism, online campaigns and digital community-building, and the making of rights claims in digital environments. The notion of digital citizenship thus implies a focus on citizens’ agency and the empowerment derived from the use of digital tools. However, the emerging condition of datafication complicates this perspective. The pervasive collection of data through online platforms and smart devices, its analysis by corporate actors and the state, and the profiling of people through an ever-increasing amount of data points all question the empowering nature of digital environments. While digital tools continue to help ­citizens develop their own position in society, they also enhance the opportunities for governmental and commercial institutions to assign – and restrict – citizenship roles through data analysis. In this chapter, I will explore these conflicting dynamics and trace the consequences of diverse power shifts between the citizen and the state. The goal is to develop an understanding of digital citizenship that responds to the challenges of the “age of datafication.” To start with, I will review key themes and trends in citizenship studies. Then I will explore the notion of digital citizenship as the self-constitution of subjects in society through digital acts. Finally, I will discuss how datafication challenges previous notions of digital citizenship and explore the possibilities of agency in a datafied age. This, I hope, will enhance our understanding of the changes and transformations of citizenship in societies that are increasingly governed through “big data” analysis.1

Citizenship Definitions of Citizenship According to its classic understanding, citizenship denotes the formal relation between a person and a nation-state. It is a notion of belonging and of membership to a political community, formalized through documents such as the passport and typically received at birth. As a concept it is closely connected to territory—the territory where someone is born or chooses to live—but also to power, resources, and community. It is conferred by a powerful actor, the state; it determines an individual’s access to and share in the state’s collective resources; and it strongly affects questions of belonging and identity (Turner, 2009). Citizenship ties people to one another within the jurisdiction of the state and binds them to the state in specific ways (Tilly, 1997). This is underpinned by the rights (such as voting) and obligations (such as taxation) of the individual in relation to the state. The existence of, and demand for, civic, social, and political rights that ensure citizens’ equality and protection is thus a key marker of citizenship (Marshall, 1950). Furthermore, and to varying degrees, the concept entails the participation of citizens in the community’s political, social, and economic processes (Bellamy, 2008). Historically, different strands of citizenship have developed that foreground different characteristics. The Greek (or republican) model emphasized the participation of

528   Arne Hintz c­ itizens in public affairs and in the institutions of public administration, and thus in the creation of the rules, laws, and decisions that govern them. Active involvement in the affairs of the polis was not just a right but an obligation, and membership was defined by strong social bonds (and social exclusion according to land ownership, gender, race, and class). In contrast, the Roman (or liberal) model emphasized the legal status of citizens, their freedom to pursue their private interests, and the protection they enjoyed from interference by state authorities, other powers, and other individuals. According to this model, “a citizen is one who enjoys the common liberty and protection of authority” (Walzer, 1970, p. 205). Since the modern period, citizenship has increasingly become part of state-building efforts and the construction of national consciousness, and has been intertwined with the emergence and protection of a wider set of civic, political and social rights (Bellamy, 2008). These past accounts have shaped how we think about citizenship. They point to a division in its conception between active and passive forms (Turner, 1990). Active citizens are socially and politically engaged with their environment, while passive citizens focus on their private matters in a given framework of rights, obligations, and protection. While modern states typically combine political and legal citizenship, contemporary nation-states with mass societies may reflect more prominently the Roman approach of providing state protection across large territories. As citizens in contemporary societies, we may mostly operate as passive recipients of rights, but active engagement is occasionally encouraged or even required, and more substantial participation in political life is certainly demanded and enacted by civil society groups and social movements. So both forms of citizenship are part of how the concept is understood today.

Challenges and Alternatives These traditional approaches to citizenship have been challenged from different directions. In particular, critics have highlighted the exclusionary nature of citizenship as well as its alleged equality and universality. Feminist and LGBTQ rights groups, as well as ethno-cultural communities, have argued that citizenship rights are not distributed in an ­equitable manner, and have pointed out biases in citizenship conceptions towards white,  male, and property-owning individuals (Fraser & Gordon,  1994; Lister,  1997; Young, 1989). The territorial focus of citizenship has been challenged by processes of globalization, internationalization, and transnationalization, which affect the autonomy of the nation state and its ability to confer rights. The rise of global corporations and civil society organizations, together with the growing role of both global and regional institutions such as the UN and the EU, point to transnational forms of authority that convey rights and generate obligations more widely (Urry,  1999). National rule-making has “become embedded within more expansive sets of interregional relations and networks of power” (Held & McGrew, 2003, p. 3) involving cities, provinces, states, and regional as well as transnational institutions. Similarly, forms of belonging and affiliation have spread from the national to local and regional communities as well as to transnational

Digital Citizenship in the Age of Datafication   529 diasporas and non-territorial associations, such as cultural communities and communities of practice (McNevin, 2011). Further, a (related) challenge to classic forms of citizenship emerged with the rise of neoliberalism in the late 20th century and the increasing primacy of the economic domain over the political (Mouffe, 2000). This has advanced the importance of consumption in contemporary societies, de-emphasized political participation, and weakened democratic institutions (Crouch, 2004). It has led to the emergence of a “citizen-consumer” (Clarke et al., 2007) who is politically passive and interacts with society mainly as a consumer of privatized goods and service (Turner, 2017). Each of these critiques addresses different dynamics, but together they show that classic reference points that organize belonging are being challenged, political communities are witnessing transformations, and new claims for inclusion are being made. These challenges have led to diverse attempts to address the changing social, political, and economic situation through new conceptions of citizenship. “Feminist citizenship” (Lister, 1997), for example, has been proposed as a pluralist conception of the citizen that is rooted in difference, rather than exclusionary homogeneity. “Postnational citizenship” (Soysal, 1994) responds to the growing reality of migration and is “based on universal personhood rather than national belonging” (p. 1). While maintaining the classic focus of citizenship models on participation in a political community, it proposes forms of inclusion that are not bound to historical, cultural, or ethnic ties to that community. “Transnational” (Bauböck,  1994) and “cosmopolitan” (Linklater,  2002) forms of ­citizenship move the emphasis further away from the nation-state and respond to the prominent role of trans-border communities, global civil society, and international law. According to this school of thought, rights and obligations are increasingly determined and implemented by international agreements and institutions, leading to “the emergence of an international law of citizenship” (Spiro,  2011, p. 696). Without formal acknowledgment, transnational citizenship largely remains speculative and aspirational, but it demonstrates a diversity of affiliations that are interacting with, and potentially superseding, the nation-state. As Isin and Ruppert (2015) note, “the subject is a composite of multiple forces, identifications, affiliations, and associations” (p. 21). They point to NSA whistleblower Edward Snowden as a representative of a new form of international active citizenship that responds to an emerging set of international responsibilities and corresponds with the transnational circulation (and global collection) of data (Isin & Ruppert, 2017). Arguably, the most important shift in citizenship studies has come with an increased focus on the acts of citizens in developing their own position in society. This perspective emphasizes the practice, rather than the status, of citizenship, and views the latter as “the expression of agency” (Lister, 1997, p. 38). Citizenship scholars have drawn from theories of performativity to investigate the ways in which people constitute themselves as citizens (Isin, 2012) and in which they “do citizenship,” for example by claiming new rights and implementing existing rights through their own acts (Zivi, 2012). 
As in the republican model, the focus here is on participation, but it goes beyond the mechanisms of receiving citizenship and instead highlights practices of achieving citizenship by doing, enacting, or performing citizen acts and thereby bringing citizen subjects into being.

530   Arne Hintz This understanding of citizenship begins with the citizen as an active figure, not with the  nation-state or another form of belonging (Clarke et al.,  2014). Citizenship thus becomes a site of contestation and social struggle. Citizen engagement in civil society associations and social movements—from neighborhood groups to NGOs to protest mobilizations—are important spaces for “doing citizenship,” and claiming new rights becomes a key feature of performing and enacting one’s own role as a citizen. This does not mean that belonging and obedience to the nation-state are magically replaced by the citizen as a sovereign figure that enacts itself as a subject of power. We are still members of national jurisdictions, with the rights and obligations assigned to them. According to Isin and Ruppert (2015), legality remains a core force of subjectivation, as does the imaginary aspect of national belonging, but these are complemented by performativity as a key pillar for understanding citizenship. Citizen agency and citizen acts are thus placed in the forefront of our discussion of citizenship.

From Digital Acts to Digital Citizenship Digital Acts In societies that are increasingly mediated through digital technologies, digital acts become important means through which citizens create, enact, and perform their role in society. And just as citizens have traditionally re-asserted their position in relation to the state by claiming human and civil rights, they are now “making rights claims” (Isin & Ruppert, 2015, p. 4) in the digital environment. Digital acts may include the wide range of activities that we are regularly involved with on the Internet, such as emailing, messaging, blogging, collaborating, coding, friending, liking, posting, tweeting, and uploading. These and similar acts help create our role and position in society, and they serve as claims to the right to act freely in digital environments (Isin & Ruppert, 2017). Yet among the broad range of digital actions and interactions, we can identify a number of more specific acts that demonstrate how citizens assert themselves. Digital protest and online campaigns have been, arguably, the most prominent examples, and experiences which have been widely discussed range from the Anonymous online protests to the use of social media in uprisings such as the Arab Spring, the role of hashtags such as #MeToo, and the proliferation of campaign platforms. While there are controversies regarding the extent to which digital tools have enabled recent waves of protest and activism (e.g., Dencik & Leistert, 2015; Morozov, 2011), the Internet has undoubtedly offered a significant domain for people to raise their concerns, mobilize for action, generate recognition, and (re)claim rights. A plethora of practices have emerged from what some have enthusiastically called “liberation technology” (Diamond, 2010)—including online mobilizations, whistleblowing initiatives such as

Digital Citizenship in the Age of Datafication   531 WikiLeaks (Brevini, Hintz, & McCurdy, 2013) and Xnet (Siapera, 2016), “sousveillance” by citizens watching authorities and exposing wrongdoing (Mann et al.,  2003), and many others. Communication tools and platforms are also applied for more direct forms of state-citizen communication and citizen participation in public administration, for example for public consultations, citizen-based policy reviews, and participatory budgeting (Simon et al., 2017). People have thus used a diverse set of tools and practices to intervene into political debate and interact with their political environment. Furthermore, the Internet has enabled assertive practices far beyond immediate political interventions. Citizen journalism has advanced people’s ability to contribute their voice and expertise to questions of public concern, and it has challenged established professional media (Allan, 2013). Networks such as Indymedia and organizations such as Global Voices have developed citizen-based platforms for news and debate, while groups like Riseup.net have set up alternative technical infrastructure and offered online services (such as web space and mailing lists) to fellow activists (Hintz & Milan,  2013; Milan,  2013). Digital storytelling has advanced processes of narrative exchange and offered channels for citizen-based knowledges to enter the public realm (Hartley & McWilliam, 2009). Opportunities for digital subjects to engage with their social and political environment have opened up, moreover, as forms of social organization have come to be more fluid and spontaneous in a digital context. “Smart mobs” (Rheingold, 2002) and “connective action” practices (Bennett & Segerberg, 2014) have reduced access barriers for the individual to purposeful collective action and have challenged classic organizational models from the pre-digital age. Similarly, participatory modes of production have allowed for enhanced roles for digitally active subjects in the cultural and economic realm. Digital acts may include the development of fan fiction, which has increasingly complemented classic cultural production (Jenkins,  2008), the contribution of one’s skills to free software development or one’s knowledge to Wikipedia (Shirky, 2010), and a range of other practices of online collaboration, networked production, and voluntary contributions to larger projects (Benkler, 2006).

Digital Citizenship These features of digital activity have been integrated—to varying degrees and with varying priorities—in different iterations of the concept of digital citizenship. Outlining an early version of the notion in the late 1990s, Katz (1997) defined digital citizens according to the extent of their Internet use but, at the same time, observed “a new political ethos” and “a new political sensibility” among this emerging group. In his view, they were civic-minded, shared common values, revered civil liberties, and were committed to openness. Digital citizens, according to Katz, regarded the Internet as a “tool for individual expression, democratization, economic opportunity, community and education,” and technology as “a force for good.” This perspective was certainly grounded in the optimism of cyber-libertarian approaches of the time, which regarded the Internet as an

532   Arne Hintz autonomous space where citizens (and private business) were freed from coercion by governments. Just a year before, John Perry Barlow had formulated the promises of the Internet, in his “Declaration of Independence of Cyberspace,” as a new space “where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity” (Barlow, 1996). The idea of cyberspace as a new “agora” of free and democratic discussion was widely shared and restrictions to this freedom were hardly imaginable. Katz thereby set the tone for an understanding of dig­ ital citizenship that is closely related to notions of democratization and empowerment. Mossberger, Tolbert and McNeal (2007) refined the concept by focusing on the effectiveness of Internet use and on the consequences of an individual’s participation in ­society. They adopt a similar starting-point by defining as “digital citizens” those who use the Internet frequently but highlight technical competence and information literacy skills for effective use along with the conditions of access. They thereby respond to classic citizenship discourses around inclusion and exclusion, and to the conditions of membership in a community. Yet they also pick up the positive perspective on the Internet as a force that facilitates democratic participation and, thus, regard digital citizenship as the ability to participate in society. Their approach has been supported by, for example, Lindgren (2017), who notes that debates on digital citizenship address “the opportunities and resources that a person has to participate online in society and politics” (p. 147). Questions of access and competence, and inequalities in the capacity to use digital tools, come to the forefront here. More often, however, contemporary investigations into digital citizenship focus on how specific “digital acts” (Isin & Ruppert, 2015) and “acts of citizenship” (Isin, 2008) constitute people’s role in society and generate their position as active citizens. Couldry et al. (2014), for example, investigate digital storytelling as a practice that forms citizenship. Using Dahlgren’s (2003) notion of civic culture which examines the conditions for people’s participation in the public sphere and thereby identifies “the possibilities of people acting in the role of citizens” (pp. 154–155), they explore the implications of digital acts for participation and, by extension, the building of citizenship. An important aspect for them is the notion of recognition, as “digital media and digital infrastructures provide the means to recognize people in new ways as active narrators of their individual lives” (Couldry et al., 2014, p. 615). This resonates with observations on social media uses of young people who “write themselves and their communities into being” (boyd, 2007, p. 14). Extensive case studies have documented how digital acts can enact and perform citizenship (McCosker, Vivienne, & Johns, 2016) as a form of do-it-yourself citizenship (Ratto & Boler, 2014). Taking matters into one’s own hands and becoming producers allows protagonists to (re)construct identities, develop agency, and intervene into systems of authority and power. Ratto and Boler reaffirm Hartley’s claim of “self-determination as the foundation to citizenship” (Hartley, 1999, p. 178), but add that this act is politically transformative as it leads to “new modalities of political participation” (p. 3). 
Their broader understanding of “DIY citizenship” is grounded in what they call “critical making,” which encompasses both online and offline practices, including non-digital forms of media and diverse forms of crafting, hacking and making. This perspective connects

Digital Citizenship in the Age of Datafication   533 with previous analyses of how participation in older, non-digital media, such as ­community radio and video activism, can generate recognition, self-actuation and community-building. Rodriguez (2001), for example, conceptualized such practices and their implications in terms of active “citizenship,” tracing how power relations in communities are changed as a result, authority is questioned, and new identities (or even new communities) are formed. As the notion of “citizenship” is increasingly applied across studies that focus on practices of participation, activism, or DIY culture, the contours of the concept may lose some of their sharpness. Isin and Ruppert (2015) have addressed this problem by reconnecting digital citizenship with the classic idea that rights claims against an authority are at the basis of the (self-)creation of citizens. Digital citizens, in their view, do not just receive existing rights, but also make claims to rights that may not yet exist. They do not just use the Internet to conduct established forms of citizenship, but to develop new forms and claim new rights. These go beyond established civic, political, and social rights and address the technological context of humanity’s future. Further, such rights are not necessarily limited to the territory of the nation-state but refer to a different kind of environment—cyberspace—where people meet and interact. The result is, as Siapera (2017) notes, a new ontology of the citizen, brought into being through digital acts.

Empowerment and New Affiliations What connects many of these accounts of digital citizenship is a focus on citizens’ agency and the progressive social change that (may) result from it. At the core of most approaches is an interest in how people enact themselves as subjects of power (Isin & Ruppert, 2015). Their self-organized acts are often related to social movements, countercultural and political activism (Ratto & Boler, 2014). Effective use of the affordances of digital, mobile and social media, it is argued, can enhance participation in society (Mossberger, Tolbert, & McNeal, 2007) and generate innovation, social change, possibilities for contest, capacity for creative cultures of practice, and public good—in short, “democratize civic and political participation and facilitate social inclusion” (Vivienne, McCosker, & Johns, 2016, p. 8). The concept of digital citizenship therefore has an intrinsic connection with citizen empowerment. Digital media, it is claimed (explicitly or implicitly) have allowed us to raise our voice, be heard in social and public debate, and construct our role in society. This implies a democratizing trend in state-citizen relations. It does not mean, though, that scholars follow uncritically the cyberlibertarian dreams of the earlier phase of Internet development. Enacting ourselves and performing our own citizenship does not negate the existence of traditional authorities that convey citizenship rights. Isin and Ruppert (2015) acknowledge that digital citizens still submit to governments in which they are implicated. Social media activists, similarly, submit to new regulatory regimes by platform providers. Vivienne, McCosker, and Johns (2016, p. 8) recognize that digital media environments may also affect user experiences “in ways that limit

534   Arne Hintz users’ agency and capacity to shape decisions that govern their lives.” But despite these qualifications, the overarching focus in studies of digital citizenship is on user actions and citizen agency, on claims and contestations, and on the assertion of rights and voice. The active digital citizen operates in an environment of changing affiliations and social configurations. As noted earlier, critical citizenship studies have pointed to a ­fragmentation of the public sphere into multiple publics and a loss of cohesion based on traditional bonds. Classic reference points for citizenship—such as national borders and formal organizations—have lost some of the key roles they played in traditional nation-state societies and are complemented by a wider range of often looser and more fluid affiliations. The debate over digital citizenship has advanced our understanding of these developments as the Internet has given rise to new forms of online communities and networked publics. The individual of the digital age relates to a variety of cultural, social, political and geographic affiliations, which overlay the membership in a nationstate. As already noted, the digital subject is “a composite” (Isin & Ruppert, 2015, p. 12) of multiple forces, identifications and associations. Social and political belongings become less stable and classic organizational forms less relevant as the collective “we” is increasingly replaced by temporary collaborations of “I”s (Milan, 2013). With concepts such as “networked individualism” (Rainie & Wellman,  2012) scholars have tried to capture these changing social formations in and through digital society. As individuals adopt a variety of subjectivities and identities, form new kinds of associations and meet in new spaces (beyond the national), the public sphere becomes divided into diverse sets of what Dean (2001) calls “cybersalons.” Digital citizens participate in the “salons” that relate to their interests, tastes, political affiliations or geographic communities, but not necessarily to their national membership. For Papacharissi (2010), the digital realm has led to the emergence of a “private sphere” in which public and private activities become intertwined and hard to separate. Digital citizens, she argues, often interact with their social and political environment (e.g., via social media) from the private space of, for example, their bedroom (and, we may add, via the privately owned medium of a commercial online platform). Seemingly private acts in online communities may have public political effects, and civic engagement is often shared with entertainment and a wider set of motivations—we may think of memes or of the trolling actions of the Anonymous “smart mobs.” Papacharissi calls this “private citizenship,” and it resonates with the fluidity of actions and belongings mentioned earlier. As noted by other observers, this form of citizenship comes along with a transformation of social structures from masses and collectivities to “a variety of atomized actions” (Papacharissi, 2010, p. 131). These actions are collaborative and interactive but they lack the stability of life-long national affiliation or other traditional or­gan­i­za­tional processes. Understanding digital citizenship as performative enactment of citizenship, in a context of fluid affiliations and networked individual acts, highlights the empowerment of the citizen through digital acts and the democratizing effects derived from this. 
As a consequence, this perspective suggests a power shift from the state and corporate sector

Digital Citizenship in the Age of Datafication   535 to the citizen. It points to challenges to the authority of the state and its governance of the citizenry.

Digital Restrictions However, with the evolution of technological infrastructure and the changing politicaleconomic context of its development and use, the conditions for digital citizenship have changed. In the early phase of the popularization of the Internet (the 1990s), cyberspace had challenged the law’s traditional reliance on territorial borders, and thus questioned government’s ability to control citizens’ behavior. As Internet users moved freely across the global network, cyberspace was widely seen as distinct from physical space, with a distinct set of rules and doctrines (Johnson & Post, 1996). However, and contrary to  Barlow’s earlier claims that governments “have no sovereignty where we gather” (Barlow, 1996), states have managed to gain significant influence over technical infrastructures and digital spaces of human interaction since the start of the new millennium. By exerting control over important nodes of the network and by providing regulatory frameworks for infrastructure and content providers, they have introduced an increasing range of restrictions and established virtual borders around state territories (Goldsmith & Wu, 2006). This has restricted content flows across ­territorial borders, most prominently exemplified by the “Great Firewall of China.” During times of protests and uprisings, governments are now routinely closing down online services in their country or in specific locations, or interrupting connections to the outside world. Inside most countries, filtering and blocking content that transcends moral, religious or political limits set by governments has become common practice (e.g., Deibert et al., 2008). These forms of restrictions are increasingly complemented by other forms of interventions into content, such as the strategic dissemination and manipulation of information (Deibert et al., 2012). The open environment for the creation and exchange of citizen acts is thus gradually transforming into a tightly controlled space in which DIY citizen activity is allowed within limits, rather than enabled naturally by the technological infrastructure. Similarly, the possibilities to monitor citizens’ interactions, exchanges, locations and movements have increased. The Snowden revelations demonstrated the extent to which state agencies such as the NSA and the GCHQ have intercepted and surveilled much of our online communication and have hacked into personal devices as well as larger telecommunications services (Greenwald, 2014). A range of laws and regulations allow governments to monitor citizens’ communication and require telecommunications operators and Internet service providers to store (and provide access to) detailed communications data (Hintz & Brown, 2017). Whereas a cartoon from the 1990s famously declared that “on the Internet, nobody knows you’re a dog,” our identity, movements and interests are now closely monitored. Activists and dissidents have suffered in particular ways from governmental surveillance, as have ethnic and religious minorities.

536   Arne Hintz State authorities have used Facebook and other social media platforms to scrape the user data of alleged dissidents and have distributed malware that installs spying software onto the infected computer, for example to capture webcam activity (Villeneuve, 2012). In many countries, including the democracies of the West, the policing of protests and other civic activities now includes the monitoring of social media activity and the identification of “threats” through the analysis of social media traffic (Dencik, Hintz, & Carey, 2017). The role of social media in state strategies points us to the political-economic context of contemporary digital citizenship. The tools that we use to enact and perform our citizenship are increasingly a small set of commercial platforms, provided by a highly concentrated business sector. We use these platforms to produce, submit, and share content, connect with fellow users and engage with our environment, but this engagement takes place in a context of commodification and monetization, and its boundaries are set accordingly (Fuchs, 2014). Social media companies, “sharing” platforms, app stores and other providers of crucial infrastructure regulate what is allowed on their sites, and they thus act as gatekeepers that both allow and restrict the activities of digital citizens (Hintz, 2015). Moreover, their core business model is the extraction, analysis and monetization of personal data. The “data mine” (Andrejevic, 2012) of online platforms enables the detailed monitoring and analysis of Internet users that is at the core of the emerging mode of “platform capitalism” (Srnicek, 2016) and of current surveillance trends (Lyon, 2015). This brief sketch of digital restrictions suggests two complications for the performative and active (self-)construction of digital citizens. First, the tools, platforms and infrastructure that digital citizens use are subject to limitations (in terms of their or­gan­i­ za­tional structures as well as interventions by states and the private sector) to the digital acts that are allowed and enabled. Second, digital citizens are not only constituted through their actions, but also because everything they do leaves data traces which are analyzed and used in a variety of ways—not by digital citizens but by platform companies, data brokers, and the state. The first of these two trends provides problems for dig­ ital citizens but does not question, as such, the concept’s focus on self-enactment through digital acts and rights claims. The second, however, complicates contemporary notions of digital citizenship more profoundly. It suggests that digital citizenship is not only self-constructed and self-defined, but equally—if not more substantially—constructed by the governmental and business realm.

Digital Citizenship and Datafication The Datafication of Life and Governance Citizens’ active engagement with their environment through digital networks produces data that is collected, processed and analyzed and can uncover a wide range of preferences, personal traits and social networks. The increasingly close integration of digital

Digital Citizenship in the Age of Datafication   537 technologies in our everyday lives means that we create data traces when we connect with our friends on apps; share intimate information about our personal lives via chats; vote, protest and campaign on platforms; and conduct business interactions that allow us to go about our everyday life—from online banking to ordering food, transport and lodging. Moreover, much of what previously would have been called “offline” activities, such as moving in the geographical space of a “smart city” that is filled with sensors or watching a movie on a “smart TV” in a “smart home,” can now produce detailed profiles about us. Communication devices, which hold a vast range of information about our personal and professional lives, now extend beyond computers and mobile phones (and, thus, the tools through which digital citizens have traditionally enacted their role in cyberspace) to light switches, cars and waste bins. Such data tracking renders ordinary everyday lives increasingly transparent to large organizations whilst at the same time those who collect and use the data remain invisible to those whose data are garnered and used (i.e., citizens; Lyon, 2015). The technical ability to turn our behavior and social activities into data points that can be collected and analyzed has become the basis of the digital economy. From the platform economy to the expanding data broker industry, an ever-growing range of data-driven businesses develop revenue from the aggregation, re-packaging and circulation of people’s data (Rieke et al., 2016; Srnicek, 2016), using a wide array of attention technologies (Rice & Hoffman,  2018). The data-focused logic of accumulation that emerges here has been called “surveillance capitalism” (Zuboff, 2015) as it relies on the processing of data about people and their activities. Governments, too, increasingly adopt practices of profiling, sorting and categorizing populations to allocate and manage public services (Ansorge,  2016). Government departments and state agencies in many countries now rely on the mass collection of data on people to inform security measures, provide services and devise policy in areas such as education, child welfare and housing. Categorizations based on citizens’ consumption habits, political preferences, ethnic backgrounds or geographic location affect the level of services they get, the ways in which they are regarded as “risks,” or the ease with which they may cross borders. In predictive policing, for example, data on neighborhood crime rates, previous arrests, personal living conditions, and a range of other characteristics, are combined to create categories of potential future criminals and affect sentencing (Angwin et al., 2016; Trottier, 2015). Predictive risk models are also used to assign child protective services, decide eligibility for health services, and “vet” migrants and refugees (Crawford, 2016; Eubanks, 2018; Redden et al., 2019). To that end, citizens are increasingly assigned data scores, not just by commercial services such as financial institutions and insurance agencies, but also by government and public administration (Chin & Wong, 2016; Dencik et al., 2019). 
For the public sector, data-based decision-making promises more efficient delivery of public services, better allocation of resources, and better responses to social problems and a variety of “risks.” It provides a seemingly “scientific” method for tackling uncertainty by rendering it perceptible and justifying responses, even if the “science” on which it is based is often questionable (Amoore and Piotukh, 2016). It enables proactive forms

538   Arne Hintz of governance that utilize the predictive capacities of algorithmic data processing and allow for pre-emptive measures. The “logic of preemption” (Massumi, 2015) of databased governance holds particular attraction for public administration in the context of security challenges and resource shortages.

Implications of Datafication The logic of data-based governance requires the collection of the widest-possible range of data and is therefore inherently in conflict with privacy concerns. Moreover, it establishes trust in the processes, rationalities, techniques and outcomes of algorithmic decision making (Aradou & Blanke, 2015). It relies on the assumption that data processing delivers “objective” results. The root of “dataism,” as Van Dijck (2014) calls it, is thus an ideological belief in particular forms of knowledge and social order. As a consequence, algorithms adjudicate more and more decisions in our lives and become “new power brokers in society” (Diakopolous 2013, p. 2). As noted before, practices of categorization, classification, segmentation and selection of populations lie at the core of data-based governance. People are increasingly subject to what Lyon (2015) has described as “social sorting,” with significant consequences for their economic, social and political opportunities as well as their civic engagement (see also Cheney-Lippold,  2017; Vagle,  2017). While the segmentation of population groups according to information gathering is not new, emerging forms social sorting based on “big data” accumulate a much broader range of personal data, affect a wider reach of both private and public life, and establish new categories, based on the traces of digital life. These new forms of social sorting operate, in part, outside established categories of civic rights, are often created without our knowledge and may be based on criteria that do not correspond to lived experience. As Couldry and Hepp (2017) note, “when governments’ actions, whatever their democratic intent, become routinely dependent on processes of automated categorization, a dislocation is threatened between citizens’ experience and the data trajectory on the basis of which they are judged” (p. 212). The ways in which people are “judged” through data lead us to, arguably, the most significant implication of datafication for digital citizenship—the changes in societal power dynamics. With data-based sorting, scoring and categorizing, the emerging power relations of the datafied age are between those who provide personal data (digital citizens) and those who own, trade and control it (typically, large Internet companies and the state). While datafication is by no means the first instance of the state using information processing to expand its influence over citizens (Mattelart, 2003), it provides vastly enhanced possibilities to understand, predict and control citizen activities. This empowers the state, particularly, in a context of dispersed and individualized social structures that provide challenges for a comprehensive regulation of the citizenry by state authorities. Monitoring and profiling the “atomized actions” of populations allows the state to address a fragmented reality and create a new and governable collectivity.

Digital Citizenship in the Age of Datafication   539 This points us to the original (French) meaning of the word surveillance: supervision. The digital citizen is, at the same time, an active citizen and a supervised citizen.

Datafication and Citizenship So while established understandings of digital citizenship have emphasized the performative and empowering character of people’s self-construction of their own role in society through digital acts, data-based governance moves the perspective in the opposite direction by assigning people their societal position based on data traces. This has implications for the participatory and “active” forms of citizenship, but it may also affect the more traditional understandings of citizenship as nationality, which the following example demonstrates. As the Snowden revelations documented, the NSA is allowed to monitor the communication of foreigners, but not US citizens, and it establishes whether a piece of online communication belongs to a foreigner or a US citizen by analyzing the infrastructure that is used (e.g., phone number, IP address) as well as the data that is produced (e.g., language and degree of interaction with people inside and outside the US). National citizenship thus becomes dependent on communications data. Cheney-Lippold refers to these designations as jus algoritmi, thus contrasting it with classic legacies of jus sanguinis (family-based citizenship) and jus soli (location of birth). Jus algoritmi is “a formal, state-sanctioned enaction of citizenship that distributes political rights according to the NSA’s interpretations of data” (Cheney-Lippold, 2016, p. 1729). As a consequence, it “functionally abandons citizenship in terms of national identity in order to privilege citizenship in terms of provisional interpretations of data” (p. 1738). It sometimes aligns with a citizen’s formal nationality, and sometimes becomes detached from it. Like other data categorizations used in algorithmic governance, it is an identity we are assigned through data analysis, not necessarily one that we identify with or create for ourselves. Datafication and algorithmic decision making thus change the parameters of citizenship, but not necessarily in the ways foreseen by scholars of ­digital citizenship. What agency, then, does the datafied citizen have? Datafied subjects enact their role in the digital world, in part, by challenging, mediating and negotiating data collection and analysis. This has included, for example, practices of technological self-defense to restrict data gathering. Forums to provide secure digital infrastructures for active digital citizens have proliferated, with “numerous digital rights and Internet freedom initiatives seizing the moment to propose new communication methods for activists (and everyday citizens) that are strengthened through encryption” (Aouragh et al., 2015, p. 213). These have included privacy-enhancing tools such as the TOR browser, the GPG email encryption system and the encrypted phone and text messaging software Signal. Privacy guides such as the Electronic Frontier Foundation’s “Surveillance Self-Defense” (https:// ssd.eff.org/en) and the Tactical Tech Collective’s “Security in a Box” (https://tacticaltech. org/projects/security-box) explain the use of privacy-enhancing tools and offer advice on secure online communication. “Crypto-parties” have brought necessary training in

540   Arne Hintz such tools to towns and cities worldwide. Digital rights advocacy, moreover, has offered a promising avenue for claiming rights in datafied spaces and address related challenges. In the UK, for example, organizations such as Privacy International, the Open Rights Group, Big Brother Watch, Article 19 and Liberty have regularly issued statements regarding their concerns about data collection organized public debates and lobbied legislators. Further, “data activism” has applied alternative forms of data use for social and political purposes. Initiatives such as Occupy Data have supported social movements through data gathering, analysis and visualization—both of public datasets that concern the movements’ agenda for social change, and of data collected by the movement itself, for example on police violence. Open Data campaigns have leveraged the availability of public data to support campaigning efforts (Renzi and Langlois, 2015). Using data in empowering ways and as a means for social and political change, data activism aims to “bring democratic agency back into the analysis of how big data affect contemporary society” (Milan, 2017, p. 153). Data activism scholars suggest that it can empower users of digital infrastructures to be more critical about data collection and thereby alter the relationship between citizens and data collectors. (Milan & van der Velden,  2016). Revisiting and updating the practices of digital citizenship, it re-claims the agency of citizens that is at risk in datafied societies. However, it does not change the underlying challenge of data collection and analysis that is now inherent in every digital act. Data activism can, in this sense, add bottom-up sousveillance and dissident data uses to ­existing top-down surveillance, but it may expose citizens to the same mechanisms of data collection and profiling.

Digital Citizenship between Empowerment and Control The performative character of digital citizenship is thus retained in a context of datafication but, as I argue, is severely limited. We can observe a complex set of processes in which digital citizenship is constituted both through the acts of digital subjects and through the collection and analysis of their data by state and corporate actors. The construction of digital citizenship is thus the result of a combination of acts by the individual, commercial platform providers, and the state. Many scholars of digital citizenship recognize the multiple forces that are at play—between performative enactment and techno-political limitations, and between empowerment and disempowerment. Isin and Ruppert (2015) acknowledge that the digital citizen is not only a self-enacted subject but also subject to forms of control which include “harvesting, assembling, and storing data about things we say and do through the Internet” (Isin & Ruppert 2015, p. 1) by states and corporations. Digital citizens, they note, have to submit to code as a form of power that, for the most part, they cannot influence. Vivienne et al. (2016), similarly, acknowledge the different forms of control and constraints that are wielded by states, institutions and platform providers and affect the performative and constitutive acts of digital citizens. Yet the focus by these accounts remains on the agency of digital citizens

Digital Citizenship in the Age of Datafication   541 and their self-enactment, even as these citizens negotiate restrictions. In contrast, and as I argue here, the governance model of datafication with its components of social sorting, profiling and data-based categorizations points to different sets of power dynamics and questions established notions of active and empowering digital citizenship.

Conclusion Digital citizenship has become an important framework for understanding people’s position as political subjects in digitized environments. The concept typically focuses on the performative self-enactment of citizens and incorporates a wider set of affiliations and belongings. It builds on digital acts, the use of digital tools, and digital rights claims, and conceptualizes the resulting self-construction of people’s active role in society. The digital realm has offered many possibilities for these acts and has promised citizen empowerment. However, as we have seen, the technological, political and economic context of digital life may restrict digital acts significantly. Particularly, the key role of data collection and analysis in contemporary societies complicates an overly optimistic perspective on citizen agency in online environments. Digital citizens are traced and tracked, scored and profiled, assessed and monitored, and integrated into new categories of citizenship according to the data that is collected about them. They are thus ­constituted and managed, at least in part, through data analysis by new (the platform economy) and traditional (the state) institutions. This questions key coordinates of the concept. It limits the agency of digital citizens as  well as the empowering characteristics of digital acts. Rather than advancing the ­self-enactment of citizens, it suggests a power shift in the opposite direction—from the citizen to the state. Whereas the concept of digital citizenship departed from classic understandings in which citizenship is conferred by a powerful actor, “datafied citizenship” returns the focus towards traditional state-citizen relations. Digital or datafied ­citizens do have new practices at their disposal, including techno-legal responses to datafication, claims to privacy rights and to social justice in datafied environments, and new forms of data activism that apply the opportunities of data analysis constructively. However these take place in the context of pervasive data collection and a governance model of data-based profiling and social sorting that is outside the citizen’s reach. Digital citizenship thus intersects the empowered and controlled subject, and traverses the active and monitored dimensions of citizenship. What, then, are necessary conditions for digital citizens’ self-determination in ­datafied societies? Research (Hintz et al., 2017) suggests they would need to include a trustworthy technical infrastructure that accommodates user rights, such as privacy, in its design, its physical construction, and its code. Further, they require a supportive legal and regulatory framework that limits the collection, analysis and use of personal data; adequate public knowledge of datafication; and an informed use of the relevant platforms and applications. Underpinning such approaches, this chapter argues that we

542   Arne Hintz need a comprehensive understanding of the changes and challenges to core principles of citizenship that have emerged in times of datafication.

Note 1. The chapter is informed by a two-year project of empirical research on the consequences of datafication and, particularly, increasing surveillance on digital citizenship (Hintz, Dencik, & Wahl-Jorgensen, 2017), and by the work of the Data Justice Lab at Cardiff University. The argument presented here has been elaborated in more detail in the book Digital citizenship in a datafied society (Hintz, Dencik, & Wahl-Jorgensen, 2018).

References Allan, S. (2013). Citizen witnessing: Revisioning journalism in times of crisis. Cambridge: Polity. Amoore, L. & Piotukh, V. (2016). Introduction. In L. Amoore & V. Piotukh (Eds.), Algorithmic life: Calculative devices in the age of big data (pp. 1–18). New York: Routledge. Andrejevic, M. (2012). Exploitation in the data mine. In C. Fuchs, K. Boersma, A. Albrechtslund, & M. Sandoval (Eds.), Internet and surveillance: The challenges of Web 2.0 and social media (pp. 71–88). Abingdon: Routledge. Angwin, J., Larson, J., Mattu, S. and Kirchner, L. (2016, May 23). Machine Bias. Propublica. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminalsentencing Ansorge, J. (2016). Identify & sort: How digital power changed world politics. London, UK: Hurt & Company. Aouragh, M., Gürses, S., & Rocha, J. (2015). Let’s first get things done! On division of labour and techno-political practices of delegation in times of crisis. The Fibreculture Journal, 26, 208–235. Aradou, C. & Blanke, T. (2015). The (Big) Data-security assemblage: Knowledge and critique. Big Data & Society, 2(2), 1–12. Barlow, J.  P. (1996). A declaration of the independence of cyberspace. Electronic Freedom Foundation. http://homes.eff.org/~barlow/Declaration-Final.html Bauböck, R. (1994). Transnational citizenship. London: Edward Elgar. Bellamy, R. (2008). Citizenship: A very short introduction. Oxford: Oxford University Press. Benkler, Y. (2006). The wealth of networks: How social production transforms markets and freedom. New Haven and London: Yale University Press. Bennett, L., & Segerberg, A. (2014). The logic of connective action: Digital media and the ­personalization of contentious politics. Cambridge: Cambridge University Press. boyd, d. (2007). Why youth (heart) social network sites: The role of networked publics in ­teenage social life. In D. Buckingham (Ed.), Youth, identity, and digital media (pp. 119–142). MacArthur Foundation Series on Digital Learning. Cambridge, MA: MIT Press. Brevini, B., Hintz, A., & McCurdy, P. (Eds.) (2013). Beyond WikiLeaks: Implications for the future of communications, journalism and society. Basingstoke: Palgrave MacMillan. Cheney-Lippold, J. (2016). Jus Algoritmi: How the national security agency remade citizenship. International Journal of Communication, 10, 1721–1742. http://ijoc.org/index.php/ijoc/article/ view/4480

Cheney-Lippold, J. (2017). We are data. New York: New York University Press.
Chin, J., & Wong, G. (2016, November 28). China's new tool for social control: A credit rating for everything. Wall Street Journal. http://www.wsj.com/articles/chinas-new-tool-for-social-control-a-credit-rating-for-everything-1480351590
Clarke, J., Newman, J., Smith, N., Vidler, E., & Westmoreland, L. (2007). Creating citizen-consumers: Changing publics and changing public services. London: Sage.
Clarke, J., Coll, K. M., Dagnino, E., & Neveu, C. (2014). Disputing citizenship. Bristol: Policy Press.
Couldry, N., Stephanson, H., Fotopoulou, A., MacDonald, R., Clark, W., & Dickens, L. (2014). Digital citizenship? Narrative exchange and the changing terms of civic culture. Citizenship Studies, 18(6–7), 615–629.
Couldry, N. & Hepp, A. (2017). The mediated construction of reality. Cambridge: Polity Press.
Crawford, K. (2016). Know your terrorist credit score! Presentation at Re:publica, May, Berlin.
Crouch, C. (2004). Post-democracy. Cambridge: Polity Press.
Dahlgren, P. (2003). Reconfiguring civic culture in the new media milieu. In J. Corner & D. Pels (Eds.), Media and the restyling of politics (pp. 151–170). London: Sage.
Dean, J. (2001). Cybersalons and civil society: Rethinking the public sphere in transnational technoculture. Public Culture, 13(2), 243–266.
Deibert, R. J., Palfrey, J. G., Rohozinski, R., & Zittrain, J. (2008). Access denied: The practice and policy of global Internet filtering. Cambridge: MIT Press.
Deibert, R., Palfrey, J. G., Rohozinski, R., & Zittrain, J. (2012). Access contested: Towards the fourth phase of cyberspace controls. In R. Deibert, J. G. Palfrey, R. Rohozinski, & J. Zittrain (Eds.), Access contested: Security, identity and resistance in Asian cyberspace (pp. 3–20). Cambridge: MIT Press.
Dencik, L. & Leistert, O. (Eds.) (2015). Critical perspectives on social media and protest: Between control and emancipation. Lanham, MD: Rowman and Littlefield.
Dencik, L., Hintz, A., & Carey, Z. (2017). Prediction, pre-emption and limits to dissent: Social media and big data uses for policing protests in the United Kingdom. New Media & Society, 20(4), 1433–1450.
Dencik, L., Hintz, A., Redden, J., & Warne, H. (2019). The 'golden view': Data-driven governance in the scoring society. Internet Policy Review, 8(2). https://policyreview.info/articles/analysis/golden-view-data-driven-governance-scoring-society
Diakopoulos, N. (2013). Algorithmic accountability reporting: On the investigation of black boxes. A Tow/Knight Brief. Tow Center for Digital Journalism, Columbia Journalism School, New York.
Diamond, L. (2010). Liberation technology. Journal of Democracy, 21(3), 69–83.
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. New York: St. Martin's Press.
Fraser, N. & Gordon, L. (1994). A genealogy of dependency: Tracing a keyword of the US welfare state. Signs: Journal of Women in Culture and Society, 19(2), 309–336.
Fuchs, C. (2014). Social media: A critical introduction. London: Sage.
Goldsmith, J. & Wu, T. (2006). Who controls the Internet? Illusions of a borderless world. Oxford: Oxford University Press.
Greenwald, G. (2014). No place to hide: Edward Snowden, the NSA and the surveillance state. London: Hamish Hamilton Press.
Hartley, J. (1999). The uses of television. London: Routledge.
Hartley, J. & McWilliam, K. (Eds.) (2009). Story circle: Digital storytelling around the world. Oxford: Wiley-Blackwell.

Held, D. & McGrew, A. C. (2003). The great globalization debate. In D. Held & A. G. McGrew (Eds.), The global transformations reader (pp. 1–50). Cambridge: Polity Press.
Hintz, A. & Milan, S. (2013). Networked collective action and the institutionalised policy debate: Bringing cyberactivism to the policy arena? Policy & Internet, 5(1), 7–26.
Hintz, A. (2015). Social media censorship, privatised regulation, and new restrictions to protest and dissent. In L. Dencik & O. Leistert (Eds.), Critical perspectives on social media and protest: Between control and emancipation (pp. 109–126). Lanham, MD: Rowman and Littlefield.
Hintz, A., Dencik, L., & Wahl-Jorgensen, K. (2017). Digital citizenship and surveillance society. International Journal of Communication, 11, 731–739.
Hintz, A. & Brown, I. (2017). Enabling digital citizenship? The reshaping of surveillance policy after Snowden. International Journal of Communication, 11, 782–801.
Hintz, A., Dencik, L., & Wahl-Jorgensen, K. (2018). Digital citizenship in a datafied society. Cambridge: Polity Press.
Isin, E. (2008). Theorizing acts of citizenship. In E. Isin & G. Nielsen (Eds.), Acts of citizenship (pp. 15–43). London: Zed.
Isin, E. (2012). Citizens without frontiers. London, UK: Bloomsbury.
Isin, E. & Ruppert, E. (2015). Becoming digital citizens. Lanham, MD: Rowman & Littlefield.
Isin, E. & Ruppert, E. (2017). Citizen Snowden. International Journal of Communication, 11, 843–857.
Jenkins, H. (2008). Convergence culture: Where old and new media collide. New York: New York University Press.
Johnson, D. R. & Post, D. (1996). Law and borders: The rise of law in cyberspace. Stanford Law Review, 48(5), 1367–1402.
Katz, J. (1997). The digital citizen. Wired, January 12, 1997. https://www.wired.com/1997/12/netizen-29/
Lindgren, S. (2017). Digital media & society. London: Sage.
Linklater, A. (2002). Cosmopolitan citizenship. In E. Isin & B. S. Turner (Eds.), Handbook of citizenship studies (pp. 317–332). London: Sage.
Lister, R. (1997). Citizenship: Feminist perspectives. Basingstoke: MacMillan.
Lyon, D. (2015). Surveillance after Snowden. Cambridge, UK: Polity Press.
Mann, S., Nolan, J., & Wellman, B. (2003). Sousveillance: Inventing and using wearable computing devices for data collection in surveillance environments. Surveillance & Society, 1(3), 331–355.
Marshall, T. H. (1950). Citizenship and social class, and other essays. Cambridge: Cambridge University Press.
Massumi, B. (2015). Ontopower: War, powers, and the state of perception. Durham, NC: Duke University Press.
Mattelart, A. (2003). The information society. London: Sage.
McCosker, A., Vivienne, S., & Johns, A. (Eds.) (2016). Negotiating digital citizenship: Control, contest and culture. London, UK: Rowman & Littlefield.
McNevin, A. (2011). Contesting citizenship: Irregular migrants and new frontiers of the political. New York: Columbia University Press.
Milan, S. (2013). Social movements and their technologies: Wiring social change. Basingstoke: Palgrave MacMillan.
Milan, S. (2017). Data activism as the new frontier of media activism. In G. Yang & V. Pickard (Eds.), Media activism in the digital age (pp. 151–163). London: Routledge.

Milan, S., & van der Velden, L. (2016). The alternative epistemologies of data activism. Digital Culture & Society, 2(2), 57–74.
Morozov, E. (2011). The net delusion: The dark side of Internet freedom. New York: Public Affairs.
Mossberger, K., Tolbert, C., & McNeal, R. S. (2007). Digital citizenship: The Internet, society, and participation. Cambridge, MA: MIT Press.
Mouffe, C. (2000). The democratic paradox. London: Verso Books.
Papacharissi, Z. (2010). A private sphere: Democracy in a digital age. Cambridge, UK: Polity Press.
Rainie, L. & Wellman, B. (2012). Networked: The new social operating system. Cambridge, MA: MIT Press.
Ratto, M. & Boler, M. (Eds.) (2014). DIY citizenship: Critical making and social media. Cambridge: MIT Press.
Renzi, A. & Langlois, G. (2015). Data activism. In G. Langlois, J. Redden, & G. Elmer (Eds.), Compromised data: From social media to big data (pp. 202–225). New York: Bloomsbury.
Rheingold, H. (2002). Smart mobs: The next social revolution. Basic Books.
Rice, R. E. & Hoffman, Z. T. (2018). Attention in business press to the diffusion of attention technologies, 1990–2017. International Journal of Communication, 12.
Rieke, A., Yu, H., Robinson, D., & van Hoboken, J. (2016). Data brokers in an open society. London: Open Society Foundation.
Rodriguez, C. (2001). Fissures in the mediascape: An international study of citizens' media. Mahwah, NJ: Hampton Press.
Shirky, C. (2010). Cognitive surplus: Creativity and generosity in a connected age. London, UK: Penguin.
Siapera, E. (2016). Digital citizen X. In A. McCosker, S. Vivienne, & A. Johns (Eds.), Negotiating digital citizenship: Control, contest and culture (pp. 97–114). London: Rowman and Littlefield International.
Siapera, E. (2017). Reclaiming citizenship in the post-democratic condition. Journal of Citizenship and Globalisation Studies, 1(1), 24–35.
Simon, J., Bass, T., Boelman, V., & Mulgan, G. (2017). Digital democracy: The tools transforming digital engagement. NESTA Working Paper. http://www.nesta.org.uk/publications/digital-democracy-tools-transforming-political-engagement
Soysal, Y. N. (1994). Limits of citizenship: Migrants and postnational membership in Europe. Chicago, IL: The University of Chicago Press.
Spiro, P. J. (2011). A new international law of citizenship. The American Journal of International Law, 105(4), 694–746.
Srnicek, N. (2016). Platform capitalism. Cambridge: Polity Press.
Tilly, C. (1997). A primer on citizenship. Theory and Society, 26(4), 599–602.
Trottier, D. (2015). Open source intelligence, social media and law enforcement: Visions, constraints and critiques. European Journal of Cultural Studies, 18(4–5), 530–547.
Turner, B. S. (1990). Outline of a theory of citizenship. Sociology, 24(2), 189–217.
Turner, B. S. (2009). TH Marshall, social rights and English national identity. Citizenship Studies, 13(1), 65–73.
Turner, B. S. (2017). Contemporary citizenship: Four types. Journal of Citizenship and Globalisation Studies, 1(1), 10–23.
Urry, J. (1999). Globalization and citizenship. Journal of World-Systems Research, 5(2), 310–324.

Vagle, J. L. (2017). The history, means, and effects of structural surveillance. Northeastern University Law Review, 9(1), 103–149.
Van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208.
Villeneuve, N. (2012). Fake Skype encryption software cloaks DarkComet trojan. Trend Micro Malware Blog, 20 April 2012. http://blog.trendmicro.com/fake-skype-encryption-software-cloaks-darkcomet-trojan/
Vivienne, S., McCosker, A., & Johns, A. (2016). Digital citizenship as fluid interface: Between control, contest and culture. In A. McCosker, S. Vivienne, & A. Johns (Eds.), Negotiating digital citizenship: Control, contest and culture (pp. 1–18). London: Rowman & Littlefield.
Walzer, M. (1970). Obligations: Essays on disobedience, war, and citizenship. Boston, MA: Harvard University Press.
Young, I. M. (1989). Polity and group difference: A critique of the ideal of universal citizenship. Ethics, 99(2), 250–274.
Zivi, K. (2012). Making rights claims: A practice of democratic citizenship. Oxford, UK: Oxford University Press.
Zuboff, S. (2015). Big other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30, 75–89.

Chapter 20

Digitizing Cultural Complexity: Representing Rich Cultural Data in a Big Data Environment
Georgina Nugent-Folan and Jennifer Edmond

Introduction One of the major terminological forces driving information and communication technology (ICT) integration in research today is "big data." While the characteristics often defined by the multiple "V's" of big data—volume, velocity, variety, veracity, and value (BDIOT, 2018)—may make big data sound inclusive and integrative, such approaches are in fact highly selective, excluding input that cannot be effectively structured, represented, or digitized. Data of this complex sort is precisely the kind that human activity tends to produce, but the technological imperative to enhance signal through the reduction of noise does not accommodate this richness. Data and the computational approaches that facilitate "big data" have acquired a perceived objectivity that belies their curated, malleable, reactive, and performative nature. In an input environment where anything can be "data" once it is entered and accepted into the system as "data," data cleaning and processing, together with the metadata and information architectures that structure and facilitate our cultural archives, acquire an endemically under-acknowledged, and often outright ignored, capacity to delimit what counts as data. This engenders a process of simplification with major implications for future innovation in research environments that depend on rich material yet are increasingly mediated by digital technologies, technologies that delimit, simplify, and even hide or render latent potentially important facets of this material.

The objective of this chapter is to explore the impact of bias in digital approaches to knowledge creation by investigating the delimiting effect digital mediation and datafication can have on rich, complex cultural data. Cultural data refers to the rich, complex data used in analyses of culture (such as anthropological field notes) or by humanities researchers (manuscripts, photographs, diaries). The data may often refer to objects that are rare or even unique. These objects may have complex histories, particularly when it comes to cultural data or objects that may have passed through hundreds of hands. These objects may have been subject to rebinding, restoration, damage, losses, or intentional edits (Edmond & Nugent-Folan, 2017). As Edmond notes, cultural data are particularly difficult to incorporate into a digital research environment such as a database precisely because such complexities are at odds with the very functionality of the database and the findability of the material held within it: How is this level of uncertainty, irregularity and richness to be captured and integrated, without hiding it "like with like" alongside archival runs with much less convoluted narratives of discovery? Who is to say what [. . .] is "signal" and what [is] "noise"? Who can judge what critical pieces of information are still missing?  (Edmond, 2016, p. 98)

Cultural data can be contrasted with the comparably "simple" data output of population counts or environmental sensor readings that may be immediately recognizable as data, and comforting in their regularity and quantitative nature. Yet such readings accommodate partial representations rather than fully representing the people or climate that has been sampled; full representation is, in any case, a near-impossible task. If rich or complex data prove difficult to fully represent on a small-scale level, then in the transition to a big data environment, we run the risk of losing much of what makes this material useful or interesting in the first place. We will begin by reviewing some of the existing implicit definitions of data that underlie ICT-driven research. In doing so we will draw attention to the heterogeneity of definitions of data, identify the key terms associated with data demarcation and data use, and then expand on the implications of this heterogeneity. By establishing a clear taxonomy of existing theories and definitions of data, we provide a foundation that will serve to underpin a more applied understanding of the term in both humanistic and technical contexts, while also working to understand why these approaches to defining data are often presented in counterpoint, as mutually exclusive, or antagonistic. Our taxonomy of data definitions is backed up by the findings of a data mining exercise wherein we examined usage of the terms "data" and "big data" in articles from a major international big data journal, the Journal of Big Data, from its inception to the present day. The taxonomy outlines key priorities associated with each conceptualization of data, together with key (and under-acknowledged) points of divergence or differences in the use of these conceptualizations and in the acts of data cleaning and data processing. In addition to developing new perspectives on the rhetorical stakes and action implications of differing concepts of the term "data," the chapter moves towards a reconceptualization of what data are, and how best to go about speaking of and about them.

The first section of this chapter explores how the word "data" is used, or indeed misused, in scientific discourse, and examines the implications of these discursive practices. Following this, we will examine how data cleaning and the pre-processing of data influence our conceptualization of data proper. Section two explores how strategies for navigating the data deluge (e.g., metadata, keywords, search, and other algorithmic or organizational strategies) can restrict the interpretative potential of data, and make us susceptible to an implicit truth claim made on the basis of those strategies.

Defining Pre-Data and the Origins of Data In the lifecycle of data, there is a sort of spectrum or continuum from pre-data through to data proper (pre-data, native data, raw data, source data). These phases involve the addition or subtraction of different forms of context or structure. But closer inspection reveals these to mean, or have the potential to mean, vastly different things. Furthermore, "pre-data" can be seen to exhibit varying attributes, and if treated differently, the same "pre-data" can result in vastly different data in later phases. A key facet of this problem is that the term "data" is ubiquitous, but is consistently interpreted differently, used in different contexts, or used to refer to different things, leading to confusion and disorganization: indeed, there is even a lack of consensus as to whether "data" as a term is singular or plural. Throughout this chapter, for the sake of clarity and consistency, we will be referring to data in the plural.

Pre-Data Before we discuss data definitions, it is important to identify the difference between data and what we might term pre-data, that is, "data" before they are datafied and become data proper. Pre-data refers to objects and phenomena in the real world that might be, but have not yet been, subjected to datafication (with all of the losses and gains that such a process implies). "Pre-data" involves the material entity that undergoes datafication; it can be used to describe what data are like before they become, are captured as, or are transformed into data. In this sense, "pre-data" can be just about anything: a conversation (that we may record or transcribe), a paper document (that we may scan or digitally analyze), a natural phenomenon (that we may photograph, or seek to quantify). Electronic records such as social media posts, created without the intent of creating data, may already have the form of "data," and be managed by a company as "data," but they may not necessarily be used as "data" until, for example, a social researcher downloads and reuses the post as part of a corpus, or a third-party analytics firm uses the data to identify users for advertisers.

If data proper can only ever be considered a partial simulacrum of a real life happening, event, or entity—not the thing in and of itself, but a recording of the thing—then "pre-data" can either be the thing in and of itself, the thing in its totality, or the facets of the thing that are not entirely recordable through datafication, such as context or environment.

Native Data Native data are data as they exist, or existed, in their indigenous environment: the source context from which the native data were extracted. Native data are thus data plus their environmental or source context. So, for example, in the case of sensor data recording the surface temperature on a mountaintop, native data incorporate information about the environment from which the data were drawn. They contain all of the data readings in their entirety, not just the readings (maximum or minimum temperature) that researchers may be looking to target or identify. That this additional contextual data may be included as metadata illustrates just how difficult it can be to keep or maintain cognizance of the context of your data: where on the mountaintop the sensor was placed, for example. Such information only makes sense if one has a wider understanding of the environments or contexts beyond the immediate remit of the sensor: the geographic co-ordinates of the mountain, its elevation, the presence of nearby streams, wildlife, pollutants, and so on; such native context can often only be maintained in a supplemental manner. "Native data" can also mean, as Christine Borgman outlines in Big data, little data, no data, the context(s) the data acquire when they transition from a "native" to a "non-native" environment; in other words, the data that retain the traces of datafication, the data that are native to a specific software package or program (Borgman, 2017, p. 22). So, for example, if a manuscript is scanned and subjected to optical character recognition, the native data in this case are (depending upon the range of functions incorporated into the software in question) the output of this initial scanning: it may or may not have been cleaned, or corrected for errors or misreadings. It may also retain traces of or references to the scanning process, such as the inclusion of dust motes from the camera lens, or registration marks. "Native data" is a useful term because it retains emphasis on the importance of context apropos data. However, the term remains overwhelmingly underused.
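The distinction can be made concrete in a minimal sketch, assuming a hypothetical mountaintop temperature sensor (all field names and values below are invented for illustration, not drawn from any real dataset): the native data keep the full run of readings together with their source context, while the "raw data" passed on for analysis are often no more than the targeted values.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class NativeSensorData:
    # The readings plus the source context in which they were produced.
    readings_celsius: List[float]        # every reading, not just the extremes
    sensor_position: str                 # where on the mountaintop the sensor sat
    coordinates: Tuple[float, float]     # (latitude, longitude)
    elevation_m: float
    notes: str                           # nearby streams, wildlife, pollutants, ...

native = NativeSensorData(
    readings_celsius=[-3.2, -2.8, -4.1, -3.9],
    sensor_position="north ridge, c. 50 m below the summit",
    coordinates=(46.55, 8.56),
    elevation_m=3100.0,
    notes="sensor shaded after 14:00; stream roughly 200 m downslope",
)

# Extracting only the targeted values produces the stripped-down data most
# analyses actually see; everything else survives only if carried along
# separately, typically as metadata.
raw = {
    "max_temp_celsius": max(native.readings_celsius),
    "min_temp_celsius": min(native.readings_celsius),
}
print(raw)  # {'max_temp_celsius': -2.8, 'min_temp_celsius': -4.1}
```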

Raw Data Raw data supposedly refers to data that have not undergone any processing, or more specifically, any further processing, as the process of datafication is in and of itself a transformative process. No data are inherently raw, but this term is nevertheless used to refer to data we may consider to be relatively unprocessed. "Raw data" are distinct from "native data" in that they can be considered a step further along in terms of levels or degrees of exposure to datafication. "Native data," existing in their native environment, become "raw data" when extracted from that native environment. A photograph of a yard full of

wildflowers can be considered the "native data" that allow a person to later study that photo and identify and list all the flowers they can see in it. This list is their "raw data," while the photograph, in displaying the totality of the environment—the location of the flowers, the time of day the image was captured, perhaps even the soil type, and so on—can be considered the "native data." Despite the existence of a relation between the terms "native data"—that is, data in their "rawest" or "native" state—and "raw data," even "native data" have been processed to a degree (the photographic image is itself a processed entity), and so one can also argue, as Borgman does, that "Identifying the most raw form of data may be an infinite regress to epistemological choices about what knowledge might be sought" (Borgman, 2017, p. 27). Ribes and Jackson illuminate the processes "native data" undergo to achieve "raw" status and, further still, the work that goes into preserving data, observing that "we often think of raw data as following straight and commonsense pathways from collection to database. Sometimes this is true [ . . . ] however the more common story [ . . . ] [sees] data moving through complex, multi-institutional networks" (Ribes & Jackson, 2013, p. 149). The so-called "raw data" we are provided with or acquire as researchers have already undergone extensive, often undocumented cleaning to get them into a state where they are recognizable as data in the first place. And furthermore, what is "raw data" for one researcher may not be "raw data" for another. For example, Person A collects "raw data" in the form of a survey of the flowers in the aforementioned yard. This is their "raw data" as they endeavor to create a very precise description of their garden. Person B takes this data and adds it to five other such descriptions they have collected from other yards in the area. This is their "raw data" as they work to develop a floral atlas of the locality. Another person (Person C) takes that atlas and extracts only information about purple flowers. This is their "raw data" for a study of the attractiveness of purple flowers to pollinators. The pollinators study becomes a part of the "raw data" basis for a study of agriculture, and so on. Thus, as Borgman notes, "raw is a relative term" (Borgman, 2017, p. 26). While the concept of "raw data" is increasingly acknowledged as something of a misnomer or oxymoron (in his contribution to the aptly titled collection of essays "Raw Data" is an oxymoron, Matthew Stanley puts it nicely with the observation "raw data is not so raw"; Stanley, 2013, p. 77), it is still widely used. Specifically at issue is the relativity of the rawness, with Borgman noting that "What is 'raw' depends on where the inquiry begins" (Borgman, 2017, p. 26), and the variability of rawness may also be discipline-specific, project-specific, or researcher-specific. In addition, in certain cases, "Only a small portion of what comes to be considered 'raw data' is actually generated in the field" (Ribes & Jackson, 2013, p. 161). These so-called "raw" data are often subject to extensive cleaning and modification that is not always acknowledged, fully accounted for, or consistent.
The result is that inter-disciplinary projects or conversations that make use of these frustratingly abstract, idiosyncratic, and variable terms often run the risk of encountering counter-productive “context collapse” (Marwick & boyd, 2010), to borrow a term from research into social media in which it is identities rather than discourses that collapse. Unless the referents of these terms are agreed on and maintained throughout the course of the project, scholarly discourse may continue but without mutual understanding.
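The relativity of "raw" in the Person A, B, and C example can be sketched in a few lines of Python (the flower names and record fields are invented for illustration): each researcher's "raw data" is simply the previous stage's output.

```python
# Person A: a survey of one yard -- their "raw data".
yard_survey_a = [
    {"flower": "clover", "colour": "purple"},
    {"flower": "daisy", "colour": "white"},
]

# Person B: aggregates several such surveys into a local floral atlas;
# the surveys are now *their* raw data.
other_surveys = [
    [{"flower": "thistle", "colour": "purple"}],
    [{"flower": "buttercup", "colour": "yellow"}],
]
floral_atlas = [record for survey in [yard_survey_a, *other_surveys] for record in survey]

# Person C: extracts only purple flowers for a pollinator study;
# the atlas is now *their* raw data.
purple_flowers = [r for r in floral_atlas if r["colour"] == "purple"]

print(purple_flowers)
# [{'flower': 'clover', 'colour': 'purple'}, {'flower': 'thistle', 'colour': 'purple'}]
# "Raw" names a starting point in an inquiry, not a property of the data themselves.
```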


Source Data Source data indicates the source of the data for any given research project. For some researchers, their source data may not be "native," or even "raw"; it may already be data proper and have undergone extensive processing. When understood as the data that are necessary to conduct a research project, "source data" brings with it a number of important issues, and these arise well before it has undergone even the most preliminary analysis or testing. Particularly in a digital environment, where the contents of online news sites, for example, change multiple times over the course of an hour, what is available to you on one day may change, or not even be there, the next day: it may not be locatable in exactly the same place, or even in the same format; the hyperlink that allowed you to access this material may not function indefinitely. A very basic example of this is the ever-changing content interfaces of digitally available newspapers, or the fact that the levels of a given mineral or nutrient recorded in a river will vary from day to day, so the measurements taken there over the course of a set period (for example, after an oil spill) will not necessarily reflect the levels taken at a later date. Elsewhere, and again within a digital environment, hyperlinks frequently become broken or dead (a process commonly referred to as linkrot), and the 404 error message has become a near-universally recognized indication of a communications failure and an inability to locate the desired resource. Thus, while one might assume that data that come from a specific source are stable or consistent, source data in fact can be variable or even become non-existent. This ties in with the wider issue of the ephemerality and decay of online documents and their URLs (Jones et al., 2016; Klein et al., 2014; Koehler, 2004; Mussell, 2012). One result is that the collected data can quickly become "outdated data" because they no longer match or reflect the data at the original site or source of data acquisition. For this reason researchers will often preserve their data source by storing duplicates of the data elsewhere, in a local cache on their own computer. They may also indicate the date at which the data were obtained from a source, as in URL references that specify a given article was "Accessed on [ . . . ]" a specific date. They may also include a caveat that specifies the periodicity of their access to a given data source or the limits of their study: for example, by making clear that they had access to the data only for a set period (thereby acknowledging that it may have since changed), or that the data were collected over a specific period of time, by stating, for example, "We obtained this data until the period of . . .". Referring to sources in this manner acknowledges provenance, and facilitates traceability, which is important when it comes to producing trustworthy, reliable, and replicable research. Interestingly, however, there is no single or standardized response to this need, clearly a pressing one in an environment in which up to 75% of web URLs lead to changed content (Jones et al., 2016). Different communities will point to different potential solutions, with DOIs (Digital Object Identifiers) being a preferred technical fix, and style guides, such as APA or Chicago, suggesting their own, alternate ways of managing the fact that data often become detached from their context in the web of documents.
None of these solutions, however, truly tackles the root of the problem: that a researcher's ability to trust data is often based upon

tacit clues derived from the analogue environment, as well as precisely the kinds of provenance information that is either not captured or not maintained effectively (Tenopir et al., 2013, p. 31). The term "source data" therefore indicates where your data came from, even when you may not be confident of your access to the provenance that implies. So, whereas "raw data" comes from nowhere, "source data" always comes from somewhere (and both the "source" and the "data" may change over time), even though the terms may refer to one and the same object.
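One pragmatic response to this instability is sketched below in Python, using only the standard library; the URL, file names, and field names are hypothetical, and this is an illustration rather than a prescribed workflow. The idea is simply to store a local copy of web-based source data together with minimal provenance: where it came from, when it was accessed, and a checksum of what was retrieved.

```python
import hashlib
import json
import urllib.request
from datetime import datetime, timezone

def archive_source(url: str, out_path: str) -> dict:
    """Fetch a web resource, save a local copy, and record basic provenance."""
    with urllib.request.urlopen(url, timeout=30) as response:
        content = response.read()
    provenance = {
        "source_url": url,
        "accessed_on": datetime.now(timezone.utc).isoformat(),
        "sha256": hashlib.sha256(content).hexdigest(),
        "local_copy": out_path,
    }
    with open(out_path, "wb") as f:
        f.write(content)
    # Keep the provenance alongside the copy so the two travel together.
    with open(out_path + ".provenance.json", "w", encoding="utf-8") as f:
        json.dump(provenance, f, indent=2)
    return provenance

# Example call (hypothetical URL and file name):
# archive_source("https://example.org/article", "article.html")
```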

Source Data and Data Quality Researchers source their data from a wide variety of venues. Some source their data from sensors or instruments, others from open online sources, from colleagues, or archives, or their data may even have been purchased from companies. Each of these sources has varying levels of quality and provenance, and the data may have been exposed to gradations of processing that are often left undocumented, or implicit. This is why acknowledging the source of data is so important. Making sources available to others indicates confidence in the research and findings because it allows others to query and test those findings. As the previous sections have made clear, in many cases source data have already been exposed to extensive processing and curation. The list of wildflowers that makes up a botanist's raw data may be alphabetized, for example; or in the case of the researcher investigating pollination, certain self-pollinating flowers may have been removed from the dataset. Some data sources, such as text for example, can appear relatively straightforward and monovalent in terms of content. Other data can, though, be multivalent and contain input from a variety of sources, such as from the web, newsfeeds, legal documents, literature, etc. Medical data, for example, can contain a myriad of different data forms or modes that together make up the health profile of a patient, or, for the purposes of a medical trial or study, a section of the population. These data are multivalent and may contain both signal and "noise" just as, for example, a radio signal may contain both music and static; or indeed, from a user's perspective, any given page on an online newsfeed contains text articles, images, and advertisements, not all of which will be of equal interest or relevance to the user. One of the characteristics of data as they are found in their source environment is that they often contain noise. Therefore the data targeted at the outset of the study, the data sought out to conduct the study, may exist in a noisy environment and have to be disassembled or extracted or even estimated from the surrounding "noisy data," the material that is not required for the purposes of that particular study. As can be seen here, then, the term noise can have both a technical meaning (in the sense of "signal-to-noise ratio") and a looser one, in the sense of material that is superfluous, distracting, or even misleading with respect to the task at hand. Data quality can vary depending on the source. Indeed, the quality of source data is often directly linked to where and how it was sourced, collected, or scraped. This again influences the approaches one may take to clean, process, or analyze it. For example,

Facebook, Twitter, or YouTube content may have language errors that need cleaning, while content from a major newspaper (such as the Wall Street Journal) rarely includes those kinds of problems.

Summary It should be clear by now that to equate source data with pre-data, native data, or even raw data is ill-advised. And while arguments that distinguish between native and raw data run the risk of becoming tautological and tedious, transparency in relation to the data source, the native context, and the degree of processing that data have been exposed to, is clearly necessary. This is particularly so in relation to complex data (data that can have more than one meaning, more than one signal) and concordantly in research environments where complex data gives rise to situations where one person's signal (data) may be another person's noise (as in dirty or messy data). For example, one person may choose not to record instances of non-linguistic elements in a transcript of a conversation. A later user will be unable to use this transcript to study the social function of "backchannel" responses in conversation, as those elements of the exchange are no longer a part of the available data.
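The transcript example can be made concrete in a short, hypothetical Python sketch (the transcript text and the cleaning rules are invented): a cleaning step designed for one purpose strips out exactly the material a later study of backchannel responses would need.

```python
import re

raw_transcript = (
    "A: so we moved there in, uh, 2003 [laughs]\n"
    "B: mm-hm\n"
    "A: and the house was [pause] well, tiny\n"
    "B: right, right"
)

def clean_for_content_analysis(text: str) -> str:
    # Remove bracketed annotations such as [laughs] and [pause].
    text = re.sub(r"\[[^\]]*\]", "", text)
    # Remove filler and backchannel tokens.
    text = re.sub(r"\b(uh|um|mm-hm|right)\b[,]?\s*", "", text, flags=re.IGNORECASE)
    return "\n".join(line.strip().rstrip(",") for line in text.splitlines())

print(clean_for_content_analysis(raw_transcript))
# One researcher's noise was another's signal: once the transcript circulates
# only in its cleaned form, the backchannel study is no longer possible.
```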

Data Definitions: Theory and Practice Having covered the origins of data and the various types of pre-data available to the researcher, this next section will examine the various definitions of data that have gained traction among the scholarly community. The challenges that accompany any attempt at defining data will then be introduced. Following this, we will examine how data processing and data cleaning impact upon the interpretation of data, the terms we use to speak about data as they transition through these various phases, and the important issue of polysemy in relation to the term "data."

An Introduction to the Challenges of Defining Data Rosenberg outlines the history of “data” as a concept originating before the 20th century, exploring how it acquired its “pre-analytical, pre-factual status” (Rosenberg, 2013, p. 18). If “data” is “pre-analytical, pre-factual,” then what of “pre-data,” or “proto-data”? Rosenthal (2017) presents data as an entity that “resists analysis,” an “object that cannot be questioned.” Borgman further elaborates on this, stating that “Data are neither truth nor reality. They may be facts, sources of evidence, or principles of an argument that

are used to assert truth or reality" (Borgman, 2017, p. 17). Here already we have a not insignificant contradiction, however, in that two of the foremost theorists on data, Rosenberg and Borgman, alternately describe data as "pre-factual" and "facts," respectively. How can an entity be both pre-factual and factual? How can data be "facts" if they are pre-factual? Rosenberg further fuels this field of contradictions with the observation that "When a fact is proven false, it ceases to be a fact. False data is data nonetheless" (Rosenberg, 2013, p. 18). Rita Raley picks up on this contradiction when she outlines the arguments for and against data surveillance, noting that such back and forth arguments "imply that data is somehow neutral and that it is only the uses of data that are either repressive or emancipatory" (Raley, 2013, p. 130). Data cannot be neutral if it is a fact, and it cannot be a fact if it is false, yet individually it can be false, a fact, pre-factual, and also neutral. This uncertainty around the characteristics of "data" feeds into a larger discussion on distinctions between concepts such as data, information, and knowledge (Rice, McCredie, & Chang, 2001). Data has been used to great emancipatory effect in projects such as American Slavery As It Is, a groundbreaking project that, as Garvey notes, "helped to create the modern concept of information, by isolating and recontextualizing data found in print" (Garvey, 2013, p. 91). However, Garvey's assertion that "data will out" (Garvey, 2013, p. 90) implies that there is some overarching truth-function to data, that it can somehow speak for itself, as opposed to needing context and modification so as to take on the values expected or sought out by the researcher. Data only, as Garvey described it, "outs"—that is, invites interpretation—by means of a complex set of procedures that may be discipline-specific, case-specific, or even researcher-specific. For example, a researcher with a background in palaeography is well suited to decipher the contents of a manuscript, but not if that manuscript contains text in a language they do not read fluently. They also may not be skilled in the encoding procedure necessary to transcribe the contents into XML (extensible markup language) for the purpose of disseminating the material in a digital environment, or capable of building the system that could run and facilitate this kind of digital manuscript analysis. Conversely, the software engineer capable of building a system that facilitates the digital analysis of medieval manuscripts may have little or no background in medieval scholarship or manuscript studies (see Chandna et al., 2015).

Data Processing and Data Interpretation The procedures that facilitate the interpretation of data are informed by pre-existing knowledge bases. A researcher who speaks English but not French or Latin, for example, will naturally focus on the English contents of a multi-lingual manuscript, to the neglect of the other material. Furthermore, the purposes for engaging with or seeking out particular data in the first place often determine the types of data obtained, and the interpretations these data are subjected to. If, for example, material is being sought out

for a collection of early English poetry, then material in Latin or French is not required, and while it may be present within the "source data" or the data when it is in its native context, following scanning by OCR software this material will be removed in order to make the "raw data" necessary for and relevant to the project. Many of these approaches to data processing adapt and evolve as the research progresses. However, data processing choices and provenance records are often left undocumented and unaccounted for, with the result that data acquires a factual sheen that belies its curated and processed status. Even a Google search presents only a certain perspective on the data available to its search queries, limited by the nature of keyword search, the way in which the algorithm ranks pages and, further upstream, by how data has been captured and presented, with or without the context it needs to be exposed in a relevant query (Horsley & Priddy, 2018). For example, in the case of Garvey's essay on abolitionists' use of facts, "the ads were abstracted, their information pried loose and accumulated, aggregated en masse" (Garvey, 2013, p. 91), a very early, pre-computational approach to aggregating data, akin in spirit to a Google search. This data did not simply "out"; it was pried out. It is fitting, then, that Rosenberg famously cites data as "rhetorical" (Rosenberg, 2013, p. 18), and Raley argues data are "performative" (Raley, 2013, p. 128). Perhaps these rhetorical and performative facilities explain data's mutative nature, for it seems data can be anything, can mean anything, can be made to mean anything, or that anything can be data once it is entered into the system as data. Data has largely become a devalued term, a loose equivalent of "stuff" or "things," almost a synonym for input. Data are culturally specific, idiosyncratic, and variable, and this point becomes obfuscated by a discourse in which data can mean almost everything and, in doing so, mean next to nothing. If we subscribe to the argument that anything has the potential to be data, then any definition of data (or the architecture that makes it available in an analogue or digital environment) needs to maintain an awareness of the interpretative potential of the material contained within its datasets. If data are of speculative value, then facets discarded or obfuscated during the cleaning process could become important or valuable at some point in the future. But this later contextualization is only possible if the facets are still available, and ideally available with their native context. Raley argues that "Data cannot 'spoil' because it is now speculatively, rather than statistically, calculated" (Raley, 2013, p. 124), suggesting both that data are inherently multiply interpretable (which is often true), but also that "data" itself exists in some original pure form and can therefore never degrade (which is misleading). Nevertheless, we have to remain cognizant of "what could be data to someone, for some purpose, at some point in time" (Borgman, 2017, p. 19). Data can also become hidden, or be rendered latent within an archive, as a result of the information architecture employed to organize it within a digital environment. In their study of historical data and its sources, Horsley and Priddy (2018) draw attention to the tension that exists between what they describe as the "quick win" of keyword searches vs.
the context and complexities of the material being sought out:

The feeling of getting to know material was therefore endangered by keyword searches' bypassing of context, which also undermined the process through which archivists deepened their relationships with collections. In this instance, a sense of context was vital for developing an understanding of connections that might be missing or yet to be made.

Material can also be missed as a result of "digital information seeking behaviour" (Rowlands et al., 2008, pp. 294–95). For example, the average user will assume that if the information they are looking for does not appear in the first few online search results, it somehow does not exist (Rowlands et al., 2008). This activity has been described in the digital age as "horizontal information seeking": "A form of skimming activity, where people view just one or two pages from an academic site and then 'bounce' out, perhaps never to return. The figures are instructive: around 60 percent of e-journal users view no more than three pages and a majority (up to 65 percent) never return" (Rowlands et al., 2008, pp. 294–95). Inclusion within a formal collection of digitized or digital records, often referred to as a digital archive, necessitates that data be organized according to some kind of framework. The major and minor elements in this framework can be referred to as its architecture (just as a house will have a certain collection of rooms and hallways into which furniture—data—can be placed). Problems can emerge when that data are delimited, constrained, formatted, or structured by the architecture surrounding it within a digital environment. This architecture itself might use the term "data" as one of its fundamental descriptors, and therefore might contain, shape, and affect the management of the data within, while also using the term "data" to describe more external, tangential structures that are still necessary for the archive to function. "Data" as a term is thus used and re-used in the context of entities that serve different functions within the digital environment: metadata, the dataset, the database, or data that have been exposed to various levels of processing. All of these types of data are different, yet the term that signifies them remains the same. So how do we maintain an awareness of which data are what data? And how are we to accommodate fluidity, a capacity for change, and account for the incorporation of material not identified as data at the time of its entry into the system that can subsequently become data?
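A toy illustration in Python (the records and field names are invented) suggests how architecture can render material latent: a keyword index built only over the fields the system designates as searchable "data" will never surface context that is recorded elsewhere in the archive.

```python
records = [
    {
        "id": "MS-001",
        "content": "Letter concerning household accounts, 1905.",
        "archivist_note": "Donated by a previous owner; includes passages in French.",
    },
    {
        "id": "MS-002",
        "content": "Diary fragment, undated, in French.",
        "archivist_note": "",
    },
]

def build_index(records, indexed_fields=("content",)):
    # Build a simple inverted index over the designated fields only.
    index = {}
    for record in records:
        for field in indexed_fields:
            tokens = record[field].lower().replace(",", " ").replace(".", " ").split()
            for token in tokens:
                index.setdefault(token, set()).add(record["id"])
    return index

index = build_index(records)
print(index.get("french", set()))
# {'MS-002'} -- MS-001 also contains French material, but that fact lives only
# in the unindexed archivist's note, so keyword search never surfaces it.
```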

Data Definitions and Polysemy We know, or have known from at least as early as 1951 in Briet’s Qu’est-ce que la documentation? that, “Data are not pure or natural objects with an essence of their own . . . [data] exist in a context, taking on meaning from that context and from the perspective of the beholder” (Borgman,  2017, p. 18). Nevertheless, it is difficult to ascertain a concrete understanding of what data actually are that satisfies all the uses to which the term is regularly put. Some of these statements or definitions of data outlined in the previous

section grant the term and its mobile referent a sort of authority it cannot possibly obtain. While this is perhaps to be expected among the more philosophical or conceptual approaches to defining data, inconsistencies and tacit disagreements over what data are also exist within the computer science community. In a series of interviews conducted with computer scientists regarding their conceptions of data, Edmond and Nugent-Folan noted how many "elaborated on how hard it was to define data, noting on the one hand that while 'everything that is not very well defined can cause confusion,' it is nonetheless very difficult to define data" (Edmond & Nugent-Folan, 2018, p. 114). However, this variability remains largely undiscussed in the scientific literature: indeed, it seems to be an acceptable part of the discourse. Within the confines of a single research paper (Tabard et al., 2011), data can be both simple and complex, pre-epistemic and preprocessed, directly comprehensible to humans and purely machine readable. The word "data" can be used to refer to something newly drawn out of the environment ("captured," "collected") or as something already available for access, analysis, navigation ("research data" or indeed just "data"). It can refer to complex hybrid objects ("experimental data," "digital data"), or to comparatively simple strings of regular meter readings ("sensor data"; Tabard et al., 2011). The scale of this polysemy is quite striking and leads to an astonishing reliance on the single term "data" within certain streams of scientific discourse. A random selection of five articles from the Journal of Big Data in 2016 yielded over 650 occurrences of the term "data," including 325 of the composite phrase "big data" and 124 in the context of a "dataset" or "database." Within this corpus we observe such inconsistencies as idiosyncrasies in spelling key terms within the one paper (dataset/data set); the same term used to refer to specific data/datasets or to more general data/datasets; the same term used to refer to different phases of investigation; to different data (data stream/data cluster/original data/evolved data/evolving data); and to data as a synecdochal term covering both the whole and the part. The following quotations give a sense of what one finds: Data pretreatment module is outside from online component and it is done to preprocess stream data from the original data which is produced by the previous component in the form of data stream. In addition, we calculate the standard deviation for the entire data in the stream to check whether all the data are of the same value or not. Due to visiting data once during the processing data in stream, the performance of processing data is crucial.  (Khalilian et al., 2016)

It is difficult, if not impossible, to keep track of which data are which in each of these cases, and it also appears that writerly skill and the ability to construct persuasive arguments are not of particular value in this epistemic culture, where the term "data" takes precedence above all else. Indeed, to take things further, a 2015 article by Najafabadi et al. (2015) used the term data an incredible 507 times over the course of 21 pages, which equates to 24.1 mentions per page. Of these, 124 iterations pertained to "big data," which left 383 uses of data "proper."
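A rough Python sketch of the kind of counting behind such figures and behind Table 20.1 is given below; the corpus directory and file layout are hypothetical, and tokenization choices (hyphenation, "dataset" vs. "data set") will shift the totals, which is itself part of the point being made here.

```python
import re
from pathlib import Path

def count_terms(folder: str) -> dict:
    """Count words, sentences, and occurrences of 'data' and 'big data'
    across a folder of plain-text article files."""
    totals = {"words": 0, "sentences": 0, "data": 0, "big data": 0}
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8").lower()
        words = re.findall(r"[a-z]+", text)
        totals["words"] += len(words)
        totals["sentences"] += len(re.findall(r"[.!?]+", text))
        totals["data"] += words.count("data")
        totals["big data"] += len(re.findall(r"\bbig\s+data\b", text))
    return totals

# Example call (hypothetical corpus directory):
# t = count_terms("jbd_sample_2014_2017/")
# print(t, round(t["words"] / max(t["sentences"], 1), 1), "words per sentence")
```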

Table 20.1  Analysis of Journal of Big Data, 2014–2017

Year | Total words in corpus | Average number of words per sentence | Total occurrences of the word "data" | Total occurrences of the word "big data"
Merged 2014–2017 | 148,826 | 25.9 | 2975 (2%) | 782 (0.5%); all but seven of the iterations of the word "big" related to "big data"
2017 | 50,021 | 24.8 | 726 (1.45%) | 214 (0.42%); all iterations of the word "big" related to "big data"
2016 | 40,630 | 26.6 | 930 (2.3%) | 329 (0.8%); all iterations of the word "big" related to "big data"
2015 | 46,603 | 25.8 | 1185 (2.5%) | 225 (0.5%); all but seven of the iterations of the word "big" related to "big data"
2014 | 11,572 | 28.9 | 134 (1%) | —

Taking these investigations further, we compiled a test corpus that consisted of five randomly selected research papers per year from the Journal of Big Data from 2014 (the year of its inception) to 2017 (Volumes 1–4), with the exception of 2014, when, of the six articles in the issue, only two were research papers, the others being a short report, two case studies, and a survey. As Table 20.1 shows, across the entire corpus of articles examined as part of this text mining exercise, the most frequent recurring word, overwhelmingly, was "data," occupying a staggering 2% of the total sampled corpus. The second most frequent word in the corpus was "big" and, out of 782 appearances of the adjective, all but seven related to "big data" (99.1%). So of all the words in these articles from the Journal of Big Data, 2% are "data," and one quarter of those occurrences form part of "big data." Our conversations on big data are directed and defined primarily by (apparently) "small" data. If we consider that "big data" is increasingly held to different standards than "data," then this feature of "big data" research, namely its reliance on the term "data," begins to acquire potentially worrying consequences. As mentioned previously, Rosenberg cites data as "rhetorical" while also reminding us that "False data is data nonetheless" (Rosenberg, 2013, p. 18) and that "Data has no truth. Even today, when we speak about data, we make no assumptions about veracity" (Rosenberg, 2013, p. 37). That last statement is important because, while we perhaps make no assumptions about truth when we speak of data, we do when we speak about big data, as the five characteristics of big data noted earlier are "volume, velocity, variety, veracity and value" (BDIOT, 2018). Somewhere in the transition from data to big data, veracity becomes important, even if

the datasets that compose the big data are themselves, when taken on a small scale, representative of data that has "no truth," that is "pre-factual," or that can be described via any of the definitions of data discussed previously. These conflicting conceptual definitions pose a major challenge: How can (small) data bring no assumptions of veracity, but "big data" be veracious? If "big data" is to have different characteristics than "small data," then the terminological frame of reference for discussions on "big data" should differ from that of regular data, but as the results just outlined make clear, this is not the case. Researchers are speaking about entities with overwhelmingly different properties but using the same terminology, terminology whose usage indicates or imports properties or connotations that are characteristic of data but may be foreign to the characteristics of big data, especially if that big data are prescribed as being by definition "veracious." The phrase "big data" is dictated by understandings of the term "data" that do not accord with or accommodate—on a semantic level—veracity, one of the principal characteristics of "big data," with the term itself appearing highly malleable and manipulable. Rhetoric such as this distracts from the fact that problems identifiable at the level of mere "data" or singular "data points" are magnified when we take them to the scale of big data. "Big data" in particular appears to have been infused with a corporate glow of being the cure for just about any ill. One particularly striking example of this is the 2015 ad campaign by Winton Global Investments, which presents "analysing big data" as "the secret to living happily ever after" (Chandler, 2015). Continuing on the theme of fairy tales, Couldry (2013) refers to big data as a "myth" because it "is oriented to the social world in a particular way. It does not have as its domain a national population, or even the particular collectivities that might gather online. It builds its population, data-bit by data-bit, through a series of operations that bypass earlier ideas of social interrelations." People define data differently in the small scale, people interpret data differently in the small scale, and the term "data" is heavily overdetermined in the small scale. The inconsistencies of definitions and the variability of what data can be, how they can be spoken of, and what can or cannot be done with them, are striking. While they would indicate that Rosenberg's concept of data as "rhetorical" to some extent holds true, the sheer scale and variance has significantly more impact than can be conveyed with the single term "rhetoric." For example, it is possible to have ambiguous data, anonymous data, bad data, bilingual data, chaotic data, contradictory data, computational data, dark data, hidden data, implicit data, in-domain data, inaccurate data, inconclusive data, "information" (rather confusingly, several interviewees referred to data as "information"), log data, loose data, machine processable data, missing data, noisy data, outdated data, personal data, primary data, real data, repurposed data, rich data, standardized data, semi-structured data, sensitive data, stored data, true data, and uncertain data. How exactly all of these varieties of data can be combined together under the umbrella term "big data" without compromising their small-scale specificities and complexities remains to be seen.
To make things even more complex, any definition of data (or the architecture that shapes it and makes it available in an analogue or digital environment) needs to maintain an awareness of the speculative potential of the information contained within its datasets. Remarking on definitions that work by example or by assigning attributes to

examples of data—Borgman observes that "any such list is at best a starting point for what could be data to someone, for some purpose, at some point in time" (Borgman, 2017, p. 19). Not only can data be anything extant, then, it is also notional, and even speculative, being dependent on its contents suggesting a definition, contents that may change from discipline to discipline, or over time within a discipline. This again adds credence to Raley's notion of data as performative. As a performative entity it has the potential to be other; to assume, perform, and acquire other values, according to the needs of the user. An additional problem is that what data are, what can be done with data, and what can be said about data—and conversely our understandings of what data itself can do to us, to its interlocutors, environments and to its contexts—all vary drastically from discipline to discipline, and even within disciplines. In addition, this overabundance of rhetorical strategies regarding what data are and how we are to speak about it also serves to create a certain distance from data. Because we cannot say with certainty exactly what it is, it acquires a mystique that carries with it a sense of authority, granting data a certain objectivity that belies its curated, malleable, reactive, and performative nature.

Data Cleaning

Also known as data scrubbing, data cleaning is a process by which elements are removed from a dataset or data stream, generally because they foul the desired processing, or one wherein elements or formats are standardized so as to make processing easier. This process is viewed by some communities as a central part of good research practice; others, however, view data scrubbing as a form of data manipulation that erodes the credibility of research based upon it. If we are to accept Borgman's observation regarding the speculative potential for any item to be identified as, and function as, data "to someone, for some purpose, at some point in time" (Borgman, 2017, p. 19), then data cleaning is in and of itself an interpretative act: in cleaning, one distinguishes between signal and noise, removing or muting the noise so as to highlight the signal. This means that the material scrubbed from the data when it was in its native environment is no longer accessible, having been deemed external to the remit of data selected for inclusion. This expunges items of potential future (speculative) value or merit as data. However, that which is not identified as data of a certain kind or value at one time can be identified as data at another time; non-data removed through cleaning thus arguably always has the potential to be or become data. From this perspective all data streams are, to a certain degree, incomplete, and necessarily so: while noise may have the potential to be signal, signal can only become apparent after having been stripped of noise, such as through data cleaning, even if that material has the potential to be valuable. In addition to a lack of transparency regarding the fact that different appearances of the term data potentially signify different data, and to the lack of transparency in relation to the transformations that are applied to data, the technique of cleaning data so as to produce more refined data is not systematized. Rather, it is presented as an almost

esoteric process, one that brings with it implicit assumptions regarding the methods used to clean or process; assumptions that are not only discipline-specific, but often topic- and researcher-specific. The fact that salient aspects of an original record might have been lost along the way as data passes through the hands and servers of different actors is not only not recognized as potentially problematic, it is obliquely recognized as a part of the scientific process. In her analysis of a collection of papers belonging to Roger Casement, Edmond (2016) illuminates what she refers to as the "potential complexity of historical sources" and the problems they pose when it comes to the provision of workable "big data" environments for historians or other researchers who work with cultural data:

No less than three previous owners of the papers are referenced (one of which is only known for his or her status as a member of the aristocracy). Their place in Casement's life (and indeed his own place in Irish history) is explained, chronologically and in terms of his thematic interests. The material status of the collection is given, including the fact that it consists of "mainly" (but not exclusively?) letters. A surprising anecdote is relayed regarding how the archive came to realize they held such a significant collection, which illustrates how the largely tacit knowledge of the archivist enabled their discovery and initial interpretation. This example is not an exceptional one. How is this level of uncertainty, irregularity and richness to be captured and integrated, without hiding it "like with like" alongside archival runs with much less convoluted narratives of discovery? Who is to say what in this account is "signal" and what "noise"? Who can judge what critical pieces of information are still missing?  (Edmond, 2016)

The NASA protocol for describing the processing levels of research data (see the next section, Data Variants and Data Treatment) carefully grades data from level 0 (raw or native data) through four additional levels of processing. However, as Borgman observes, these distinctions only pertain to the onset of the research, the point where data are gathered; thereafter, when the data are taken on by a new researcher or for a new project, the number resets and the data are once again considered raw, until they are subjected to further processing (Borgman, 2017, p. 26). Transparency is needed regarding the transformations the data have undergone, and acts of pre-processing, processing, and cleaning should be flagged for the attention of other potential users of this data. It seems obvious that this should be a requirement of scientific analysis and publication, as should be accountability regarding both the provenance of a given dataset and the researchers involved in its collection and curation.
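One way to imagine such flagging is as a small provenance log that travels with the dataset. As a purely illustrative sketch (the class names, fields, and cleaning step below are our own invention, not drawn from any cited standard or project), each scrubbing or processing decision, and the judgement behind it, could be registered for later users:

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Callable, List

    @dataclass
    class ProvenanceEvent:
        # One transformation applied to the dataset, e.g. a cleaning step.
        actor: str        # who performed the step
        action: str       # what was done
        rationale: str    # why it was done: the interpretative judgement
        timestamp: str = field(
            default_factory=lambda: datetime.now(timezone.utc).isoformat()
        )

    @dataclass
    class DatasetRecord:
        # A dataset together with the running log of what has been done to it.
        source: str                  # the "native" environment it was taken from
        rows: List[dict]
        history: List[ProvenanceEvent] = field(default_factory=list)

        def clean(self, keep: Callable[[dict], bool], actor: str, rationale: str) -> None:
            # Remove rows judged to be "noise", but keep a trace of the decision.
            kept = [row for row in self.rows if keep(row)]
            removed = len(self.rows) - len(kept)
            self.history.append(ProvenanceEvent(
                actor=actor,
                action=f"removed {removed} of {len(self.rows)} rows",
                rationale=rationale,
            ))
            self.rows = kept

    # Hypothetical example: undated entries are scrubbed from a digitized register,
    # and the decision itself becomes part of the dataset's record.
    record = DatasetRecord(
        source="digitized parish register (hypothetical)",
        rows=[{"year": 1843, "entry": "baptism"}, {"year": None, "entry": "marginal note"}],
    )
    record.clean(
        keep=lambda row: row["year"] is not None,
        actor="project research assistant",
        rationale="undated entries excluded from the chronological analysis",
    )
    for event in record.history:
        print(event.timestamp, event.actor, event.action, "--", event.rationale)

A log of this kind does not restore what cleaning removes, but it at least makes the act of removal, and the researcher responsible for it, visible to the next user of the data.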

Data Variants and Data Treatment

Rather than attempt something akin to a "definition of culture" that is machine readable, or a taxonomy of complex cultural data that are digitally compatible, one that would allow us to somehow create a universal definition for data, we would perhaps be best

served by creating a typology of data variants. This typology would be based upon usage or how the material is treated, and ultimately how many levels the data are "away from" their origin material—material that itself may be complex or simple, human created or machine readable—or indeed by adopting one of the many examples of these protocols already in use. This sees data defined by what researchers do to the data to make it data. One of the most useful examples of this comes in the form of NASA's Earth Observing System Data Information System (EOS DIS) (NASA EOS DIS Data Processing Levels, n.d.) wherein, as Borgman notes, "Data with common origin are distinguished by how they are treated" (Borgman, 2017, p. 21). The EOS DIS is perhaps one of the most functional definitions of data available because it not only acknowledges the levels of processing material undergoes to become data, but tiers this scrubbing or cleaning process, therein acknowledging that some material undergoes more extensive modification than others, and maintaining traceability to the source context or environ from which the "native data" was extracted. Table 20.2 reproduces the EOS DIS, ranging from "Level 0" data comprising "Reconstructed, unprocessed instrument and payload data at full resolution, with any and all communications artifacts [ . . . ] removed," to "Level 4" data comprising "Model output or results from analyses of lower-level data (e.g., variables derived from multiple measurements)" (NASA EOS DIS Data Processing Levels, n.d.).

Table 20.2  NASA's Earth Observing System Data Information System (EOS DIS) data processing levels

Level 0: Reconstructed, unprocessed instrument and payload data at full resolution, with any and all communications artifacts (e.g., synchronization frames, communications headers, duplicate data) removed. (In most cases, the EOS Data and Operations System (EDOS) provides these data to the data centers as production data sets for processing by the Science Data Processing Segment (SDPS) or by a SIPS to produce higher-level products.)

Level 1A: Reconstructed, unprocessed instrument data at full resolution, time-referenced, and annotated with ancillary information, including radiometric and geometric calibration coefficients and georeferencing parameters (e.g., platform ephemeris) computed and appended but not applied to Level 0 data.

Level 1B: Level 1A data that have been processed to sensor units (not all instruments have Level 1B source data).

Level 2: Derived geophysical variables at the same resolution and location as Level 1 source data.

Level 3: Variables mapped on uniform space-time grid scales, usually with some completeness and consistency.

Level 4: Model output or results from analyses of lower-level data (e.g., variables derived from multiple measurements).

The lowest level of data on the scale is referred to as "unprocessed," yet it has already been processed to a degree, as the table states that "any and all communications artifacts [have been] removed." This approach, then, is clearly not without its problems or idiosyncrasies. Furthermore, this system of data categorization is incomplete because of the absence of acknowledgement of a level that precedes "level 0," referred to in passing by Borgman as "native data," a phrase that has been discussed earlier in this chapter. Second, while the distinctions between levels are relatively explicit here, as noted previously they only pertain to the onset of the research, the point where data are gathered:

Although NASA makes explicit distinctions between raw and processed data for operational purposes, raw is a relative term, as others have noted [ . . . ] What is 'raw' depends on where the inquiry begins. To scientists combining level 4 data products from multiple NASA missions, those may be the raw data with which they start. At the other extreme is tracing the origins of data backward from the state when an instrument first detected a signal.  (Borgman, 2017, p. 26)
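Borgman's point that "raw" is always relative to where an inquiry begins can be made concrete in a small sketch (ours, and purely illustrative; the level descriptions are paraphrased from the NASA scheme reproduced in Table 20.2): a dataset can carry an explicit processing level and still enter the next project simply as that project's "raw" starting material.

    from enum import Enum

    class ProcessingLevel(Enum):
        # Paraphrased EOS DIS-style levels (see Table 20.2).
        LEVEL_0 = "reconstructed, unprocessed instrument data"
        LEVEL_1A = "time-referenced and annotated, calibration appended"
        LEVEL_1B = "processed to sensor units"
        LEVEL_2 = "derived geophysical variables"
        LEVEL_3 = "variables mapped onto uniform space-time grids"
        LEVEL_4 = "model output or analyses of lower-level data"

    class Dataset:
        def __init__(self, name: str, level: ProcessingLevel):
            self.name = name
            self.level = level

    class Inquiry:
        # Whatever a project starts from is, for that project, its "raw" data.
        def __init__(self, inputs):
            self.raw_data = inputs   # even a Level 4 product arrives here as "raw"

    # A heavily processed product from one mission becomes the raw input of the next study.
    climatology = Dataset("gridded temperature product", ProcessingLevel.LEVEL_4)
    new_study = Inquiry(inputs=[climatology])
    for dataset in new_study.raw_data:
        print(dataset.name, "treated as raw, despite being", dataset.level.name)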

Interestingly, not one of the categories employed has an analogous one in the humanities (aside from the rather loose concept of primary, secondary, and tertiary sources). That is not to say that a clear, lucid gradation of data that distinguishes how the material has been treated, or at least flags the fact that the data has been subjected to transformations, would not be beneficial for humanities researchers. A "data provenance record" that flags the processing, transformations, and contextualization the material has been exposed to, or expunged from, would be useful here, but maintaining such a record can be challenging (Edmond, 2016). Again, the fact that the EOS DIS distinctions only apply up to the point where analysis begins (at which point the data, irrespective of processing level, becomes "raw data" within the context of that inquiry) is itself problematic. Even when it comes to an ordered definition as seemingly precise as the EOS DIS, however, we still have to acknowledge the arbitrariness of what we're dealing with, as Borgman explains:

No matter how sharp these distinctions between categories may appear, all are arbitrary to some degree. Every category, and name of category, is the result of decisions about criteria and naming. Even the most concrete metrics, such as temperature, height, and geo-spatial location, are human inventions. Similarly, the measurement systems of feet and inches, meters and grams, and centigrade and Fahrenheit reflect centuries of negotiation.  (Borgman, 2017, p. 26)

While this argument is ontological and has the potential to take a philosophical tangent that is anathema to the pragmatic aims of this chapter, it is nevertheless worth keeping in mind that categories or taxonomies that appear neutral or objective now may not have been so in the past and may not appear so in the future. Furthermore, different approaches to the measuring of natural phenomena such as those outlined earlier tend to converge over time, whereas human societies tend to change out from under them. In other words, established methodologies may not always fit the changing societies that use them. The changing vocabularies around issues of gender classification, with the recognition of transgender and intersex identities, are examples of how evolving social norms drive data practices to evolve as well, as are the evolving societal understandings of the makeup of a "family" to incorporate same-sex parents or single-parent families.

The evolution of census forms and social surveys to accommodate these changes, such as the removal in 1860 of questions about household slaves in the US Census questionnaire, illustrates how the data domain must at times change to accommodate changing societal values, as established ones may no longer be flexible enough for the contemporary era (United States Census Bureau, n.d.). Acknowledging that research is the project of, and reflective of, specific cultural periods and research practices is important, particularly when it comes to capturing and reflecting the ambiguities of complex cultural data in a digital age where, as noted, decisions to digitize or not and how objects are classified can have an impact on the material available for the development of cultural identities. In other words, different people interpret and interact with data differently. As Edmond and Nugent-Folan note, in the case of "a simple seashell":

Any given seashell can be said to have certain core properties about which there is likely to be broad consensus: it is hard, it is hollow, it has a certain color. But these core properties have the potential to take on—or be attributed with—very different meanings depending on who or what interacts with them, and what interpretation they lay over these core properties. A child, for example, will most likely appropriate it for use in a manner that is very different from that of a hermit crab. A fashion designer may take the item and, by inscribing designs on the shell, fundamentally alter the makeup of the item in and of itself, a factor that would influence how a museum cataloguer, working decades later, would catalogue the item for display within a GLAM [cultural heritage] institution. Put another way, each of the agents encountering the shell creates a narrative, capturing the meaning of the shell for them and for the moment in which they appropriate it.  (Edmond & Nugent-Folan, 2017)

If even such a seemingly simple object can have such a multitude of facets and interpretative avenues available to any agent that interacts with it, then cataloguers, archivists, and database managers clearly play a pivotal and difficult role in ensuring these interpretative avenues are available, while facilitating an operable digital research environment. This is largely done through the data these figures provide or generate on the data they are cataloguing or archiving, referred to as “metadata.”

Metadata

With this in mind, we will now turn to a discussion of the issue of metadata's contextual and formative influence on data. Metadata is, quite literally, data about data. Metadata describes data: what it is, what it is made up of, where it has come from, who has catalogued it, and so on. It plays a central role in navigating a database, digital catalogue, or digital research environment, and is considered necessary for functionality, for practicality (planning research trips, navigating the archive, etc.), and for documenting the contents of the archive, to the point where its role as a mediator between scholar/user

and archive has become standard; metadata is now not only an obligatory intermediary between a scholar and an archive, but a structural facet of the information architecture that mediates the space between user interface and the database. The PREMIS Data Dictionary for Preservation Metadata is considered "the international standard for metadata to support the preservation of digital objects and ensure their long-term usability" (PREMIS, n.d.). Traditional examples of metadata include the kinds of information available beside a painting hanging in a gallery: the artist's name (if it is known), the artist's dates of birth and death (if they are known), the title of the painting (if it is known), the materials used to realize it (such as oil on canvas, or watercolor, etc.), the dimensions of the painting, the current owners (or, if it belongs to the museum, who donated it and when, or when the museum acquired it), and perhaps even a short description of facets of the piece that may be considered particularly interesting. In a new media environment where this image may be reproduced digitally, this same information can also be provided in both human and machine readable formats, so you may also be able to quickly search out and compare it with other paintings by the same artist that may be held in another collection, or you may be able to view the painting from alternative angles, or see further images of the painting pre- and post-restoration. So metadata provides basic, often standardized information about the data one is interacting with, accessing, or using, and therefore it is an integral and unavoidable tool for researchers as they seek to use and understand their data. But questions remain as to how much of this context is being redacted, modified, or hidden by the surrounding information architecture.

In the transition from analogue to digital, data loses facets of its native context: the smell, touch, or feel of the material document. In the case of a paper manuscript, these may include ink or coffee stains, or pen tests that may indicate the onset of a new writing session. It also acquires new contexts in the form of how metadata situates it in the digital environment, such as a keyword assignment that has the potential to connect seemingly unrelated documents. "Raw" or "native" data can thus be shaped by the contexts imported onto it in the form of metadata. In addition to the impact of these new contexts, the "native" or "source" contexts that accompanied the data within its "original," "pre-data," or "native" environment may not always be transferred to and catered for in the database. Classification systems and documentation standards are based upon high-level, nongranular categories such as author/creator, title, location, or date. Such categories are an advantage because they allow for the classification of diverse and abundant quantities of data, but a disadvantage because they flatten out the data and make the granular more difficult to identify and access.

In the previous section we introduced Raley's concept of data as "performative" (Raley, 2013, p. 128). Others refer to how data can become "a sort of actor" "reshaping" (Ribes & Jackson, 2013, p. 148) the social world, or how data can carry the imprint of "the original interpretative framework through which they were constructed, collected or collated" (Drucker, 2014, p. 128).
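Before turning to the performative capacities of metadata itself, the gallery-label example above can be made concrete. As a purely illustrative sketch (the element choices and values below are our own, loosely modeled on Dublin Core-style fields rather than any prescribed application profile), the same label information might be carried in a simple machine-readable record:

    # A sketch of a gallery-label record using Dublin Core-style element names.
    # The values are invented; a real record would follow the holding institution's
    # cataloguing rules and an agreed metadata profile.
    painting_record = {
        "dc:title": "Untitled landscape",
        "dc:creator": "Artist unknown",
        "dc:date": "circa 1870",
        "dc:format": "oil on canvas, 45 cm x 60 cm",
        "dcterms:provenance": "Donated to the museum in 1952; earlier owners unrecorded.",
        "dc:description": "Surface shows traces of an earlier restoration.",
        "dc:relation": "digital surrogate: pre- and post-restoration photographs",
    }

    # Lean, standardized fields make cross-collection queries possible...
    if painting_record["dc:creator"] == "Artist unknown":
        print("Grouped with other unattributed works for comparison.")
    # ...but anything not captured in a field (smell, touch, the anecdote of
    # acquisition) is simply absent from the machine-readable record.

Such fields are what make the painting findable alongside works in other collections, but they are also the point at which the flattening described above begins.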
Now if data are performative or rhetorical, and possess the capacity to shape and be shaped by their contexts, both native and acquired, the same can be said for the structures that facilitate data access, such as metadata. Metadata standards such as Dublin Core, MARC 21, or EAD (see, for example,

http://www.loc.gov/ead/) can influence how we approach and conceptualize data, to the extent that metadata dictates how we read and interact with data; indeed, many would say that that is their very purpose. As the name implies, metadata standards organize diverse objects and information in order to increase their findability and reuse value. But like bias, this influence can be good or bad. At its most basic level metadata is data about data—it points the researcher in the direction of data, and gives them more data on the data they are looking for. In other words, metadata delimits or demarcates data—representing rich context as or through lean data—and can therefore be similarly argued to have a performative or rhetorical nature in that it persuades you to consider the material it relates or refers to as data proper. There is a balance to be struck, then, between metadata and data structuring as delimiters of data, but also as a necessary part of navigating data-rich environments where categories, structuring, and metadata represent attempts to provide simpler, shorter, and processable indicators of the underlying data, which otherwise would be impossible to manage. It is worth considering, then, what makes up your data, and what terms we have to refer to the data we have on our data, that is, our metadata. Both of these aspects can have a significant impact on what you can know about data and how you can access and use it. Furthermore, as Horsley and Priddy note, in the case of complex cultural data, metadata plays such a central role that this "description of items [metadata]" was often "acknowledged [by archivists working in cultural heritage institutions] to be loaded with artefacts of its journey to the user, which become inscribed in this metadata and are inseparable from the data itself" (Horsley & Priddy, 2018).

The performative facet of metadata is magnified when we are working on the scale of big data, and is particularly problematic when we are dealing with "uncertain," "ambiguous," or "complex" data. This performative effect—intended, deliberate, or unforeseen—is a phenomenon well illustrated by Presner's analysis of the relationship between research subject, research data, and system metadata in the Shoah Visual History Archive (Shoah Foundation, n.d.). Presner argues that these relationships require a holistic approach to the conglomeration of complex cultural data in order to achieve " 'ethical' modes of computation" (Presner, 2016, p. 179). It goes without saying that great sensitivity is required of any project dealing with the personal testimonies of victims of the Holocaust within a digital environment where the need for "responsible and ethical representations of the Holocaust" is "dependent on algorithmic calculations, information processing, and discrete representations of data in digitized formats (such as numbers, letters, icons, and pixels)" (Presner, p. 179). It is far too easy either to lose sight of the many victims who died anonymously (by focusing on the more famous stories, like those of Elie Wiesel or Anne Frank), or to lose sight of the human suffering at the core of the Holocaust experience by focusing on the scale of the tragedy (as many have said of the Digital Monument to the Jewish Community in the Netherlands; Joods Monument, n.d.). An algorithmic approach to the records of the Holocaust should, in theory, be able to reconcile these poles of human frailty in the face of enormous data corpora.
However, as Presner observes, the Shoah VHA was reliant on human cataloguers for the creation of metadata and catalogues or other finding aids, leading to an approach

to uncertainty within the digital archive that saw certain material categorized as " 'indeterminate data' such as 'non-indexable content' " (p. 193). But that which is considered indeterminate or ambiguous differs dramatically, ranging from repetition, pauses, emotion, noise, and silence to, most troublingly, material the indexer "doesn't want to draw attention to (such as racist sentiments against Hispanics, for example, in one testimony)" (p. 192). What remains, as Presner notes, is "a kind of 'normative story' (purged of certain contingencies and unwanted elements)" (p. 192). What about researchers who are interested in what happens to the "non-normative" sections of the dialogues or archival material, the material that is not assigned a keyword or other metadata indicator? Who, after all, gets to designate the normative and delimit the non-normative? This is a central question underlying thesaurus-based indexing or individual cataloguer indexing (see Horsley & Priddy, 2018, p. 16). Privileging one facet of the datafied document neglects and undercuts the importance of the other, and vice versa. Each section assigned a keyword has a relationship with the material on either side of it within the text, but this un-indexed material is left "silent," "hidden" (in certain cases, especially cases such as the one mentioned earlier in which facets of the testimonies are "purged" (Presner, p. 192) from the keyword thesaurus of the archive), or alternatively left latent and unstructured in the archive, without a keyword. It is for this reason that traditionally trained archivists and librarians struggle with metadata standards that incorporate provenance, such as that developed by the World Wide Web Consortium (W3C) (https://www.w3.org/Consortium/). The richness of an archival record, which may have passed through numerous hands and families, or which may contain such a variety of material that its eventual relevance is hard to judge, transcends the kinds of clear categories and linear relationships such standards propose. We should also remember in this respect that metadata is not the only technique that hides from us as much as it exposes. Whether or not something is digitized, for example, and how and where it is made publicly available, can have the same effect. Unintentional processes have a different impact, however, on findability than does the standardization of the descriptors of complex objects by means of metadata.

Conclusion

This chapter has outlined some of the costs incurred as a result of the imprecision in the term "data" in an age when our data may be big, but our ability to express its heterogeneity is limited. Data are teleologically flexible, epistemologically variable, arbitrary, and categorically indistinct; data sources are also arbitrary and indistinct, particularly in the humanities. Data also contain errors and unintelligible material. Data can be considered to have near limitless interpretability, and if curated correctly, any facet of the native

material input can be identified as data at any point. Furthermore, the integration of data and computer science in humanities research represents a crossover of methodological and ideological approaches, with humanists adopting technological approaches they may be unfamiliar with, and computer scientists or engineers designing technology for complex cultural material they may be unfamiliar with; each community working with and valuing parts of that ecosystem differently, and valuing the potential knowledge generated differently. Naturally, this can influence how we conceive of, manage, interpret, and use our data.

Irrespective of how one elects to theorize why "data" is or has become so problematic, the gradations of data one can encounter in a digital environment can lead to extensive confusion. We have data at the level of input, we have data that have been cleaned, we have data that is present but not encoded, we have data that only becomes data when it is excised from its original context and situated in another context. But for all of these different phases or evolutions of data, we have only one term; and thus we have minimal opportunities (aside from context within an article, discipline-specific intuition, explanatory footnotes, etc.) to deduce what is being referred to when the term "data" is being used, or what has been done to "this data" to make it different from "that data." There needs to be consensus in terms of what we speak about when we speak about data, one that acknowledges how difficult it is to "define" data when it comes to the diverse cultural resources that fuel humanities research. Much of this, as this chapter has made clear, is due to the poverty of our vocabulary for speaking about data—a term whose usage spans a huge variety of practices—in a way that can accommodate the richness, diversity, and complexity that is often a feature of cultural data (Edmond, Nugent-Folan, & Doran, 2019).

We need systems that expose their uncertainty regarding what data are, or the limits of their data, in comprehensible ways for the average user, so that we can see, for example, where machine translation strips away cultural complexity because it is unable to transmit the subtle differences in word or phrase the original speaker may have intended, the web of intertextual references, or indeed the note of sarcasm or irony a human translator would work to convey. Perhaps most importantly, we need the vocabulary and the knowledge to, as Duportail (2017) puts it, "feel" data, to ascertain the tangible facets of its materiality, of its provenance, of its makeup that may not be fully accessible in a digital environment. Human knowledge creation is multi-modal and embodied. As it is currently conceived, digital information and data struggle to harness this affordance. To achieve this worthy goal, we will inevitably have to face the fact that data are not, as the hackneyed phrase has come to suggest, the "new oil" (Humby, 2006; Marr, 2018). The status of digital information objects as "data" is not a result of fixed and stable processes, but rather of an individual, corporate, or social recognition of certain information as having a certain value and place in various stakeholders' knowledge creation processes. Only by stripping "big data" of its tacit claims to authority by virtue of qualities it may not even possess can we actually expose the true value of data, one that can build bridges of knowledge, rather than silos.


Acknowledgments

This work was developed in the context of the project Knowledge Complexity (KPLEX). The KPLEX project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No 732340. It bears an intellectual debt to Dr. Michelle Doran of the Trinity Centre of Digital Humanities, Trinity College Dublin.

References

BDIOT 2nd international conference on big data and the internet of things. (2018). [Call for papers]. Retrieved from http://www.bdiot.org/cfp.html
Borgman, C. (2017). Big data, little data, no data. Cambridge, MA: MIT Press.
Chandler, David. (2015, April 26). #BigData—"the secret to living happily ever after"—Winton Global Investment ad, St. Pancras International, London. [Twitter post]. Retrieved from https://twitter.com/davidch27992090/status/592415056052236289
Chandna, S., Tonne, D., Jejkal, T., Stotzka, R., Krause, C., Vanscheidt, P., . . . & Prabhune, A. (2015, February). Software workflow for the automatic tagging of medieval manuscript images (SWATI). In Document recognition and retrieval XXII (Vol. 9402, p. 940206). International Society for Optics and Photonics.
Couldry, N. (2013, December 13). A necessary disenchantment: The myth of big data. Retrieved July 16, 2018, from Culture Digitally, http://culturedigitally.org/2013/12/a-necessarydisenchantment-the-myth-of-big-data/
Drucker, J. (2014). Graphesis: Visual forms of knowledge. Cambridge, MA: Harvard University Press.
Duportail, J. (2017, September 26). I asked Tinder for my data. It sent me 800 pages of my deepest, darkest secrets. Retrieved July 9, 2018, from The Guardian, https://www.theguardian.com/technology/2017/sep/26/tinder-personal-data-dating-app-messages-hacked-sold
Edmond, J. (2016). Will historians ever have big data? Theoretical and infrastructural perspectives. In B. Bozic, G. Mendel-Gleason, C. Debruyne, & D. O'Sullivan (Eds.), Computational history and data-driven humanities (CHDDH 2016). IFIP Advances in Information and Communication Technology, 482, 91–105. Switzerland: Springer Nature Switzerland AG.
Edmond, J., & Nugent-Folan, G. (2017). Data, metadata, narrative. Barriers to the reuse of cultural sources. In E. Garoufallou, S. Virkus, R. Siatri, & D. Koutsomiha (Eds.), Metadata and semantic research. MTSR 2017. Communications in Computer and Information Science, 755. Springer, Cham.
Edmond, J., & Nugent-Folan, G. (2018). Redefining what data is and the terms we use to speak of it. Retrieved from https://kplex-project.eu/deliverables/
Edmond, J., Nugent-Folan, G., & Doran, M. T. (2019). Reconciling the cultural complexity of research data: Can we make data interdisciplinary without hiding disciplinary knowledge? Data Science (in press). Retrieved from http://hdl.handle.net/2262/83156
Garvey, E. G. (2013). "facts and FACTS": Abolitionists' database innovations. In L. Gitelman (Ed.), "Raw Data" is an oxymoron (pp. 89–102). Cambridge, MA: MIT Press.
Horsley, N., & Priddy, M. (2018). Report on historical data as sources. Retrieved from https://kplex-project.eu/deliverables/
Humby, C. (2006). Talk delivered at ANA Senior marketer's summit, Kellogg School. Retrieved from http://ana.blogs.com/maestros/2006/11/data_is_the_new.html

Jones, S. M., Van de Sompel, H., Shankar, H., Klein, M., Tobin, R., & Grover, C. (2016). Scholarly context adrift: Three out of four URL references lead to changed content. PLoS One, 11(12), e0167475.
Joods Monument. (n.d.). https://www.joodsmonument.nl/en/
Khalilian, M., Mustapha, N., & Sulaiman, N. (2016). Data stream clustering by divide and conquer approach based on vector model. Journal of Big Data, 3, 1. doi:10.1186/s40537-015-0036-x
Klein, M., Van de Sompel, H., Sanderson, R., Shankar, H., Balakireva, L., Zhou, K., & Tobin, R. (2014). Scholarly context not found: One in five articles suffers from reference rot. PLoS One, 9(12), e115253.
Koehler, W. (2004). A longitudinal study of Web pages continued: A consideration of document persistence. Information Research, 9(2). Retrieved from http://www.babapour01.20m.com/Opac_1.htm
Marr, B. (2018, March 5). Here's why data is not the new oil. Forbes. Retrieved from https://www.forbes.com/sites/bernardmarr/2018/03/05/heres-why-data-is-not-the-newoil/#1beb06343aa9
Marwick, A., & boyd, d. (2010). I tweet honestly, I tweet passionately: Twitter users, context collapse, and the imagined audience. New Media and Society, 13(1), 114–133. doi:10.1177/1461444810365313
Mussell, J. (2012). The passing of print: Digitising ephemera and the ephemerality of the digital. Media History, 18(1), 77–92.
Najafabadi et al. (2015). Deep learning applications and challenges in big data analytics. Journal of Big Data, 2(1), 1.
NASA EOS DIS Data Processing Levels. (n.d.). Retrieved from https://science.nasa.gov/earth-science/earth-science-data/data-processing-levels-for-eosdis-data-products
PREMIS Data Dictionary for Preservation Metadata. (n.d.). Retrieved from http://www.loc.gov/standards/premis/
Presner, T. (2016). The ethics of the algorithm: Close and distant listening to the Shoah Foundation visual history archive. In C. Fogu, W. Kansteiner, & T. Presner (Eds.), Probing the ethics of holocaust culture (pp. 175–202). Boston, MA: Harvard University Press.
Raley, R. (2013). Dataveillance and countervailance. In L. Gitelman (Ed.), "Raw Data" is an oxymoron (pp. 121–146). Cambridge, MA: MIT Press.
Ribes, D., & Jackson, S. J. (2013). Data bite man: The work of sustaining a long-term study. In L. Gitelman (Ed.), "Raw Data" is an oxymoron (pp. 147–166). Cambridge, MA: MIT Press.
Rice, R. E., McCreadie, M., & Chang, S.-J. (2001). Accessing and browsing information and communication. Cambridge, MA: MIT Press.
Rosenberg, D. (2013). Data before the fact. In L. Gitelman (Ed.), "Raw Data" is an oxymoron (pp. 15–40). Cambridge, MA: MIT Press.
Rosenthal, J. (2017, April 1). Introduction: "Narrative against data." Genre, 50(1), 1–18. doi:10.1215/00166928-3761312
Rowlands, I., Nicholas, D., Williams, P., Huntington, P., Fieldhouse, M., Gunter, B., & Tenopir, C. (2008). The Google generation: The information behaviour of the researcher of the future. Aslib Proceedings, 60(4), 290–310. https://doi.org/10.1108/00012530810887953
Shoah Foundation. Visual History Archive®. (n.d.). Retrieved from https://www.sfi.usc.edu/vha
Stanley, M. (2013). Where is that moon, anyway? The problem of interpreting historical solar eclipse observations. In L. Gitelman (Ed.), "Raw Data" is an oxymoron (pp. 77–88). Cambridge, MA: MIT Press.

Tabard, A., Hincapié-Ramos, J. D., Esbensen, M., & Bardram, J. (2011). The eLabBench: An interactive tabletop system for the biology laboratory. In Proceedings of the ACM international conference on interactive tabletops and surfaces (pp. 202–211). ACM.
Tenopir, C., Allard, S., Levine, K., Volentine, R., Christian, L., Boehm, R., Nichols, F., Christensen, R., Nicholas, D., Watkinson, A., Hamid, R., Eti, J., & Thornley, C. (2013). Trust and authority in scholarly communications in the light of the digital transition. Nashville: University of Tennessee & CIBER Research Ltd.
United States Census Bureau. (n.d.). Retrieved from https://www.census.gov/history/www/through_the_decades/index_of_questions/1860_1.html

Chapter 21

Motivations for Online Knowledge Sharing

Kristin Page Hocevar, Audrey N. Abeyta, and Ronald E. Rice

Introduction

Online knowledge sharing systems (KSS) greatly reduce the cost and effort of organizing, finding, and sharing knowledge and social connections, particularly when the content and users are geographically and temporally dispersed. Through these systems, individuals can make their privately held knowledge publicly available, and large groups can more easily share knowledge to support collective action (Bimber, Flanagin, & Stohl, 2005). Despite these advantages, there is often little obvious reward given to those who contribute knowledge; thus, individuals must be motivated by other factors. Extant research on motivations for online knowledge sharing is widely dispersed and can be found in a variety of disciplines. Moreover, this body of research has generated interesting—but at times contradictory or problematic—findings. Accordingly, this review synthesizes and organizes relevant theory and research on motivations for online knowledge sharing. The first section of this review provides its framework. The following two sections categorize motivations for online knowledge sharing by their source or stimulus, moving from more self-oriented motivations to more other-oriented motivations. The subsequent section summarizes three factors (self-efficacy, trust, and venue) that are not direct motivators but may provide preconditions or a moderating context. Finally, the review concludes with suggestions for future research.


Framework

Distinctions

Five distinctions bound this review: the first three explain what is included and excluded, and the final two define fundamental concepts. First, we focus on research and theory on more general Internet (and intranet) sites whereby individuals can both collect and contribute content. As a result, we do not consider reference databases or data repositories from which users search and retrieve formally structured information. Second, we do not consider technical aspects of knowledge sharing, such as system features or affordances or user search strategies, except as they are significant influences on the motivations for knowledge sharing.

Third, we distinguish among the components of the knowledge-seeking and utilization process (Case & Given, 2016; Rice, McCreadie, & Chang, 2001), which includes (a) seeking (the process of browsing, accessing, or searching), (b) sharing (including direct, generalized, or indirect reciprocity, as exchanging tends to imply more direct reciprocity or even a dyadic transaction), (c) collecting (the process of receiving or actively obtaining knowledge, which may include posting requests or questions), (d) contributing (the process of providing knowledge, either through generalized posting or providing responses to specific requests or questions), (e) evaluating (for accuracy, credibility, or utility), and (f) utilizing (or applying) that knowledge. Here, we focus on sharing as the general process, and on collecting and contributing when they are specifically referenced. Fourth, although access to information can refer to knowledge, technology, communication, and data or facts, we use the general term knowledge to emphasize the potential value of shared information.

Fifth, we group individuals' motivations to contribute to online KSS into two broad categories: self-focused and other-focused motivations. Based on the source or stimulus of the motivation, this distinction acknowledges the inherent and crucial social dilemma in KSS: the tension between individual costs and benefits and collective costs and benefits (Heinz & Rice, 2009). When people are driven to contribute by self-focused motivations, they conceive of themselves as unique individuals, acting primarily in their self-interest, who obtain a personal benefit as a result of contributing. In contrast, individuals driven by other-focused motivations conceive of themselves as members of a social context, whose actions are likely to affect or influence others in some way. Moreover, their contributions may be judged or influenced by a group identity or compared to others' contributions or group norms. This distinction is somewhat similar to De Dreu and Nauta's (2009) conceptualization of self-interest and other-orientation, but our use refers to the individual or social source or stimulus of the motivation. Self- and other-focused motivations may seem redundant with the more common distinction between intrinsic and extrinsic motivations (discussed later in the chapter), but they are conceptually separate. Intrinsic and extrinsic motivations hinge on the locus of control: intrinsically motivated individuals feel in control of their actions, whereas extrinsically motivated individuals respond to

an external force. In contrast, self- and other-focused motivations are differentiated by individuals' perceptions of themselves in relation to others. Having explained the five distinctions guiding this review, the following section provides a brief overview of public goods theory, which is foundational to an understanding of online knowledge sharing, and briefly introduces several common forms of KSS.

Public Goods and Knowledge Sharing Systems

Using KSS, individuals can collaboratively produce a public good, or a good that is both non-rival (i.e., one individual's use of the system or its knowledge does not affect another's use of it) and non-excludable (i.e., all members of the collective audience can benefit from the resource, regardless of their contributions to it). Because of these defining characteristics, contribution to online repositories is discretionary: individuals are not required to contribute their knowledge to the pool, and they can benefit from the pool without contributing to it (Connolly & Thorn, 1990). Consequently, the rational individual will act as a free rider or a social loafer; that is, he or she will benefit from others' contributions to the pool but withhold his or her private knowledge (Fulk, Flanagin, Kalman, Monge, & Ryan, 1996). However, if each individual acted in his or her own interest, online repositories would be degraded and eventually destroyed (Hardin, 1968). Thus, the decision to contribute to online repositories represents a social dilemma, or a situation in which the interest of the individual is in opposition to the interest of the collective (Dawes, 1980). Nevertheless, some individuals do share their private knowledge; in doing so, they are engaging in collective action, an action undertaken by two or more people that results in the production of a public good (Marwell & Oliver, 1993).

Individuals can share knowledge through a diverse array of online KSS, and research in this area refers to a dizzying number of systems. In the hopes of providing clarity, we briefly define three frequently referenced KSS—online repositories, communities or networks of practice, and social network sites—and characterize them by their level of connectivity and communality (Fulk, Flanagin, Kalman, Monge, & Ryan, 1996). Connectivity refers to the degree to which individuals are directly linked to one another in the network, whereas communality describes the extent to which individuals jointly hold, maintain, and benefit from the knowledge provided in the repository.

Online repositories are information resources developed through individuals' contributions, transmitted over a digital network, and made accessible for participants' use (Cheshire & Antin, 2008). They emphasize the contributing and collecting of knowledge through the filtering and sharing of content, thus featuring communality instead of connectivity. Communities of practice involve the exchange of knowledge about a shared practice, but their members typically engage in face-to-face communication and have strong relational ties with one another (Brown & Duguid, 2000; Wenger, 1998). Both researchers and practitioners have noted that communities of practice can arise in virtual communities and have identified several ways that social media can help facilitate

communities of practice (Barab, 2003; Hoadley, 2012; Johnson, 2001). In the online context, communities of practice are sometimes also referred to as problem-solving virtual communities (Yu, Jiang, & Chan, 2007) or electronic networks of practice (Deng & Poole, 2011; Wasko & Teigland, 2004). Linked by weak ties, these individuals may never meet face-to-face but are able to exchange knowledge about their shared, specific topic through the network. Like online repositories, both communities and networks of practice are characterized by high levels of communality. However, they are also marked by individuals' mutual engagement with one another (Wasko & Teigland, 2004), and therefore exhibit higher levels of connectivity than online repositories generally do. For instance, wikis not only support the sharing, revising, and pooling of both explicit and tacit knowledge and the development of knowledge bases, but they also support conversations about that knowledge, including revealing revision histories as a kind of threaded conversation (Arazy & Gellatly, 2012).

Web 2.0 and social media are becoming pervasive contexts for diverse kinds of information sharing for a wide range of purposes (Ellison, Gibbs, & Weber, 2015; Osatuyi, 2013). Social network sites (SNS) are Web-based services through which individuals create a public or semi-public profile and maintain connections with other users of the system (Ellison & boyd, 2014). Because of their emphasis on social interaction, SNS can promote very high levels of connectivity. Additionally, SNS notifications and postings support ambient awareness among users, increasing knowledge sharing (Leonardi & Meyer, 2015). Some systems (such as Facebook) enable group and network members to easily collect and contribute knowledge, while others (such as Twitter) allow for rapid contribution to large numbers of unknown followers (Osatuyi, 2013). However, users of some SNS can mask their personal information and interactions through privacy settings, resulting in lower levels of communality than online repositories.

As the preceding paragraphs demonstrate, a wide variety of KSS exist, and these variations are described by a multitude of specific terms. Khansa, Ma, Liginlal, and Kim (2015) provide an excellent, succinct overview of virtual knowledge collaboration sites, which consist of organizational knowledge management systems, online professional communities, Wikipedia and other open-source sharing communities, customer-based firm-hosted online communities, and online question and answer communities. Similarly, Phang, Kankanhalli, and Huang (2014) present a theoretical model of participation in online policy deliberation forums, which includes knowledge sharing, emotional support, interest- or hobby-oriented, product consumption, review, and citizen science project sites.

Self-Oriented Motivations

This section explores the more self-oriented motivations for online knowledge sharing, including enjoyment and entertainment, altruism, expertise, feedback, reputation,

incentives, and expected benefits and costs. To allow for a better understanding of these motivations, we first provide an overview of three relevant theoretical perspectives—motivational theories, transactive memory theory, and social exchange theory.

Many self-oriented motivations can be categorized as intrinsic or extrinsic motivations. As a result, many studies (e.g., Bock, Zmud, Kim, & Lee, 2005; Cho, Chen, & Chung, 2010; Yang & Lai, 2010) are framed by motivational theories of intrinsic and extrinsic motivation, such as self-determination theory (SDT; Deci & Ryan, 2013). According to this perspective, intrinsically motivated individuals engage in an activity for the inherent satisfaction it provides rather than for an external reward (Ryan & Deci, 2000); in other words, the activity "is its own reward" (Deci & Ryan, 2013, p. 88). Consequently, intrinsically motivated behavior produces positive feelings, such as enjoyment, happiness, enthusiasm, and entertainment (Deci & Ryan, 1985). Individuals engaging in intrinsically motivated behavior also experience feelings of competence (i.e., the ability to deal effectively with one's surroundings) and self-determination (i.e., autonomy or control of one's actions), as well as feelings of creativity, flexibility, and spontaneity (Deci & Ryan, 1985). This theoretical perspective is relevant to many self-oriented motivations; for example, those who engage in online knowledge sharing because it is enjoyable and entertaining are intrinsically motivated. Similarly, individuals who altruistically share knowledge are driven by intrinsic motivation.

Extrinsically motivated individuals, on the other hand, engage in an activity to satisfy external pressures or to earn a tangible reward (Ryan & Deci, 2000). For these individuals, the desired goal or reward is distinct from the activity itself, and their behavior is often controlled by external contingencies, such as rewards sought or punishments avoided (Deci & Ryan, 2013). As a result, individuals engaging in extrinsically motivated behavior feel pressure and tension, and they may also experience feelings of anxiety and lower self-esteem (Deci & Ryan, 1985). Several self-oriented motivations are extrinsically motivated; for instance, individuals who engage in online knowledge sharing to enhance their reputation or to earn rewards are extrinsically motivated.

Although many self-oriented motivations are explained by motivational theories, others are better explained by transactive memory theory, which provides a cognitive and social basis for understanding why members contribute to and benefit from KSS, especially across interdependent groups (Huang, Barbour, Su, & Contractor, 2013). This perspective argues that people working together—typically in organizations and teams—need to be able to recognize members' expertise and retrieve information from and provide information to the relevant expert. In other words, group members do not need to personally remember others' expert knowledge, but they must be aware of others' expertise and able to collect and contribute knowledge. In doing so, members develop a transactive memory system. Transactive memory theory explains several self-oriented motivations, such as the demonstration of one's expertise or the enhancement of one's reputation through online knowledge sharing, but it is also relevant to other-oriented motivations, which are discussed in a subsequent section.
Lastly, social exchange theory (Blau, 1964) can provide a better understanding of self-oriented motivations. This perspective portrays individuals as rational beings and

argues that their behavior is based on cost-benefit analyses. Blau (1964) asserts that individuals are driven to maximize their personal benefits while minimizing their personal costs. Accordingly, they will only engage in an activity if the benefits of doing so outweigh any associated costs. Following this logic, individuals' decision to contribute to online repositories is based on the expected costs and benefits of contributing. Having outlined three theoretical perspectives that undergird self-oriented motivations in general, we now transition to a discussion of specific motivations found in the literature on online knowledge sharing.

Enjoyment and Entertainment

For some individuals, online knowledge sharing is enjoyable and entertaining and thus considered an intrinsically motivated behavior. For example, users of social network sites identified enjoyment (Lin & Lu, 2011), as well as fun and entertainment (Quan-Haase & Young, 2010), as primary motivations to share knowledge. Participants in online discussion forums (Wasko & Faraj, 2000; Yang, Li, Tan, & Teo, 2007) and online travel communities (Wang & Fesenmaier, 2003) also cited feelings of enjoyment as their motivation to contribute. Similarly, Wikipedia contributors are primarily motivated by the fun (Nov, 2007) and enjoyment (Hoisl, Aigner, & Miksch, 2007; Moore & Serva, 2007; Schroer & Hertel, 2009) they derive from contributing. Although many studies have found feelings of enjoyment and entertainment to motivate online knowledge contribution, others have not. For example, a study of Wikipedia contributors (Yang & Lai, 2010) found that enjoyment was not a significant predictor of self-reported knowledge sharing frequency. Similarly, members of an online photo sharing community with a longer tenure were not motivated by enjoyment to post or tag photos, though newer members were (Nov, Naaman, & Ye, 2010).

Altruism

Because individuals do not usually receive immediate benefits or rewards in exchange for contributing to an online repository, some have argued that altruism—"absolute lack of self-concern in the motivation for an act" (Kankanhalli, Tan, & Wei, 2005, p. 122)—plays a motivational role. Although this definition suggests that altruism could be considered an other-oriented motivation, most studies consider it to be an individual, internal motivation; thus, we also consider it to be a self-oriented motivation.

(Oh, 2012) and participants in online technical forums (Wasko & Faraj, 2000) were most motivated by feelings of altruism. Kankanhalli et al. (2005) noted that altruism was positively associated with individuals' contribution to electronic organizational repositories, and Jadin, Gnambs, and Batinic (2013) reported that individuals with a prosocial orientation were significantly more likely to be authors—as opposed to just readers—of Wikipedia articles. However, altruism is not always associated with knowledge sharing. For example, Hung, Durcikova, Lai, and Lin (2011) found that individuals high in altruism did not contribute significantly more ideas during a computer-mediated group brainstorming task than individuals low in altruism. It is possible that these inconsistent results stem from the various ways that altruism is conceptually and operationally defined. Whereas many researchers define altruism as feelings of enjoyment derived from helping others (e.g., Chang & Chuang, 2011; Cho et al., 2010; Hung et al., 2011; Oreg & Nov, 2008), Hew and Hara (2007) refer to altruism as feelings of empathy and compassion for others, and Fugelstad et al. (2012) measured altruism by assessing the frequency at which individuals performed helpful behaviors (e.g., giving up a seat on the bus). Theoretically, the existence of "true" or "selfless" altruism is a matter of some contention. Some scholars argue that altruism is an evolutionary instinct to reward others for cooperation or adherence to norms (Fehr & Fischbacher, 2003), but others argue that an altruistic motivation to help others is actually due to more egoistic reasons (see Andreoni, 1990; Batson, 2014). This suggests that relative altruism, in which an act is motivated only by minor self-concern (Smith, 1981), may also foster knowledge sharing. In sum, whether altruism, specifically, is the correct term (a conceptual issue) and a strong motivating factor (an empirical issue) is still unclear.

Expertise

Expertise of the collector or the contributor may motivate online knowledge sharing. Although Surowiecki (2004) and others argue for the wisdom of the crowd (i.e., aggregated knowledge contributed by laypeople can generate a more accurate answer than one generated by a single expert), other scholars suggest that aggregated knowledge resources must include some number of experts or knowledgeable contributors in order to achieve an accurate and valuable outcome (Sunstein, 2006); this is especially relevant for some knowledge domains, such as health (Rice, 2004). Not only is expertise needed to contribute information, but users also need relevant expertise to determine which knowledge to collect and to subsequently evaluate it (Rice, McCreadie, & Chang, 2001).

will lose their competitive advantage, damage their reputations by sharing inaccurate or biased knowledge, be critiqued by other experts, or lose face if they do not provide the best answer (Heinz & Rice, 2009; Kang, Kim, Gloor, & Bock, 2011).

Feedback Feedback—both positive and negative—can also motivate individuals to engage in online knowledge sharing. Although feedback emanates from others—suggesting that it could be an other-oriented motivation—it is individuals’ need for and interpretation of it that motivates contribution. When people receive positive feedback, they feel satisfied and competent, and they may also feel that they are responsible for their good performance; accordingly, positive feedback is generally correlated with increased motivation to contribute knowledge (Cheshire & Antin, 2008; Zhu, Zhang, He, Kraut, & Kittur, 2013). For example, receiving comments on a photo posted online tends to predict increased future contribution (Burke, Marlow, & Lento, 2009). Surprisingly, negative feedback can also motivate contribution, as it increases the recipient’s desire to improve others’ perceptions of his or her performance. However, negative feedback can also decrease motivation (Deci & Ryan, 2013), as it can be perceived as a challenge to the contributor’s knowledge or expertise or as a signal that the contribution is counter-normative (Zhu et al. 2013).

Reputation When individuals’ contributions to an online repository can enhance their reputation— or others’ perceptions of their character or expertise—they may be more motivated to contribute (Kollock, 1999). Prior research has found a positive relationship between reputation enhancement and contribution quantity (Nov et al., 2010; Wasko & Faraj, 2005). Moreover, reputation enhancement is a significant predictor of the quality of individuals’ contributions to virtual communities (Chang & Chuang, 2011) and electronic networks of practice (Wasko & Faraj, 2005). Even users with an already high reputation in an online knowledge sharing community may contribute more—as well as higher quality—content (Javanmardi, Ganjisaffar, Lopes, & Baldi, 2009). However, reputation enhancement does not always motivate information contribution. For example, Oreg and Nov (2008) found that Wikipedia editors are more motivated to contribute by enjoyment than perceived reputation building, and other studies failed to find a significant correlation between users’ desire to enhance their reputation and their contributions to the online repositories (Lampe, Wash, Velasquez, & Ozkaya, 2010; Yang & Lai, 2010). Reputation enhancement might be particularly motivating for individuals with professional identities or those participating in professional sites, as reputation has significant implications for organizational and professional outcomes. In a survey of participants in a voluntary, organizational community of practice, Jeon, Kim, and Koh (2011) determined that participants’ anticipated recognition from superiors and co-workers

significantly predicted their contribution. Knowledge contribution to such an online community or network of practice could affect the contributor’s organizational reputation or others’ perceptions of his or her professionalism or even of the shared profession itself (Heinz & Rice, 2009; Wasko & Faraj, 2005). The importance of reputation in professional communities is underscored by Oreg and Nov’s (2008) study of contributors to open source projects, which revealed that reputation was a stronger motivator of knowledge sharing for open source software developers than for open source content creators (e.g., Wikipedia editors). Whereas open source content creation is accessible to all interested individuals and rarely requires peer review, open source software developers must possess a degree of expertise, and their work is vetted by a review process; as a result, developers’ contributions can build professional reputations that benefit their careers (Oreg & Nov, 2008).

Incentives Individuals are sometimes offered virtual or tangible incentives in exchange for contributing knowledge to online repositories. For example, some repositories (e.g., TripAdvisor, Yelp, Foursquare, and Wikipedia) recognize contributors with virtual badges, whereas other repositories may provide contributors with tangible incentives (such as free products or monetary compensation). Rewards have been found to be a significant, positive predictor of knowledge contribution quantity (Lou, Fang, Lim, & Peng, 2013), and personally meaningful incentives are especially motivating (Karau & Williams, 1995). For example, Farzan and Brusilovsky (2011) offered students access to a career planning tool in exchange for their contributions to a course recommendation system. The incentive was an effective motivator: students who had access to the career planning tool contributed significantly more knowledge to the system than those who did not. However, the effect of incentives on individuals’ contribution behavior varies; paradoxically, incentives may even discourage individuals from engaging in voluntary knowledge sharing. A meta-analysis of the effect of external rewards on intrinsic motivation (Cameron & Pierce, 1994) found that expected, tangible rewards consistently have a negative effect on individuals’ intrinsic motivation to engage in a task, as do rewards that are contingent on an individual’s performance on a task. Two theoretical explanations have been proposed for this apparent paradox. According to the overjustification hypothesis (Lepper, Greene, & Nisbett, 1973), when intrinsically motivated individuals are provided with an external reward for their behavior, they attribute their behavior to the external reward and discount the role of intrinsic motivation in their behavior. Cognitive evaluation theory (Deci & Ryan, 1985) posits that individuals have a need for competence and self-determination and intentionally engage in behaviors that fulfill these needs. However, feelings of competence and self-determination can be diminished by controlling events, or events that are perceived as an attempt to determine one’s behavior, such as expected rewards, which results in decreased intrinsic motivation.
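As a concrete illustration of the badge mechanisms mentioned above, the short Python sketch below awards tiered virtual badges once a member’s contribution count passes preset thresholds. The tier names and thresholds are hypothetical and are not drawn from TripAdvisor, Yelp, Foursquare, Wikipedia, or any study cited here; a real repository would typically also weight quality and moderation signals rather than raw counts.

# Hypothetical badge rule: tier names and thresholds are illustrative only.
BADGE_THRESHOLDS = [
    ("bronze", 10),   # e.g., 10 accepted contributions
    ("silver", 50),
    ("gold", 200),
]

def badges_earned(contribution_count):
    """Return every badge tier whose threshold the member has reached."""
    return [name for name, minimum in BADGE_THRESHOLDS if contribution_count >= minimum]

# Example: a member with 64 contributions holds the bronze and silver badges.
print(badges_earned(64))  # ['bronze', 'silver']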


Expected Individual Benefits and Costs Individuals’ motivation to contribute to online goods is dependent on the costs and benefits they expect to experience as a result of sharing their knowledge (Fulk et al., 1996; Heinz & Rice, 2009). Like social exchange theory, information foraging theory (Pirolli & Card, 1995) suggests that online information seekers make an a priori estimation of the costs and benefits associated with information from certain sources as part of the online information search (and subsequent sharing) process (Taraborelli, 2008). In addition to the motivations already discussed, the online knowledge sharing literature reveals a number of motivating individual benefits in these systems (Ardichvili, 2008), such as improved work performance or social activities, emotional benefits, intellectual benefits, and material gain. Additionally, KSS allow individuals to save time, compare information from different sources and experts, search at their own pace during problem-solving, engage in asynchronous use, and access information without having to know the specific individual provider (Yuan, Rickard, Xia, & Scherer, 2011). However, users also incur individual costs from using a KSS. Learning to use the KSS, as well as browsing, collecting, evaluating, and managing information requires time, energy, and expertise. Contributing knowledge requires similar resources, and users must learn how to codify their knowledge for retrieval and use by others. Additionally, individuals must consider the potential decrease in competitive advantage when sharing exclusive or valuable knowledge (Allen, 1990; Fulk et al., 1996), and they must also consider the potential damage to themselves or others from sharing low-quality or incorrect knowledge (such as in scientific KSS; Kim & Adler, 2015). Although users differ in their perception of these costs and benefits, most users value information accessibility over quality (Simon, 1972) or sufficiency (Chen, Duckworth, & Chaiken 1999). However, individuals with a high need for information tend to weigh information quality more heavily than accessibility (Lu & Yuan, 2011). Thus, users’ motivation to commit their time and effort to sharing accurate information will depend on the purpose of the information (i.e., benefits) and the consequence (i.e., costs) of collecting inaccurate information (Metzger, 2007). For example, individuals may be more motivated to share and collect accurate health or financial information than information of less consequential types, such as entertainment-related.
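To make the kind of a priori cost-benefit weighing described by social exchange and information foraging accounts more concrete, the following toy Python sketch combines expected benefits, expected costs, and a quality-versus-accessibility trade-off whose weight shifts with information need (cf. Lu & Yuan, 2011). The linear form, the factor names, and the example numbers are illustrative assumptions, not a formula proposed by any of the cited studies.

def expected_net_benefit(benefits, costs, quality, accessibility, need):
    """Toy estimate of the net value of sharing or collecting a knowledge item.

    benefits and costs are lists of numeric values (e.g., time saved, effort spent);
    quality, accessibility, and need are perceived levels scaled to [0, 1], with
    higher need shifting weight from accessibility toward quality.
    """
    attribute_value = need * quality + (1 - need) * accessibility
    return attribute_value + sum(benefits) - sum(costs)

# A high-need user values an accurate but hard-to-reach source more than a low-need user.
print(expected_net_benefit([0.5], [0.3], quality=0.9, accessibility=0.2, need=0.9))
print(expected_net_benefit([0.5], [0.3], quality=0.9, accessibility=0.2, need=0.1))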

Other-Oriented Motivations As discussed, self-oriented motivations do not sufficiently explain all online knowledge sharing. Indeed, some studies (e.g., Chiu, Hsu, & Wang, 2006; Rashid et al., 2006) suggest that self-oriented benefits may not be the strongest motivations for online knowledge sharing. Thus, this section explores these other-oriented motivations, which include reciprocity, social comparison, social loafing, group belonging and sociality, and social/group identity. Before these motivating factors are addressed, we provide an

overview of the social identity approach, which allows for a better understanding of other-oriented motivations. According to social identity theory (SIT; Tajfel & Turner, 1986; Turner, 1975, 1991), individuals possess both personal (i.e., those based on individual characteristics) and social (i.e., those based on group membership) identities; together, these identities constitute individuals’ self-concept. Individuals are motivated to attain and maintain a positive self-concept, and they frequently engage in social comparisons to assess their standing relative to others; depending on the context, these comparisons are based on either their personal or social identities. When social identity is salient, self-categorization theory (SCT; Turner, 1991; Turner, Hogg, Oakes, Reicher, & Wetherell, 1987) argues that individuals see themselves as representatives of their social group rather than unique individuals; consequently, their behavior is guided by the values and norms of their group. Moreover, individuals—driven by their desire for a positive self-concept—engage in behavior that will increase their social group’s status. The social identity approach provides a theoretical explanation for many other-oriented motivations, such as social comparison and group identification. Although SIT and SCT are relevant to many other-oriented motivations, social exchange theory (summarized in the previous section, Self-Oriented Motivations) can also explain several other-oriented motivations, such as reciprocity.

Reciprocity Often, individuals are motivated to engage in a behavior because of expected reciprocity by others. According to social exchange theory and related theories of social capital and group behavior (e.g., Blau, 1964; Ekeh, 1974; Putnam, 2000; Thibaut & Kelly, 1965), people tend to expect direct reciprocity from those who benefit from their time, effort, or other resources (e.g., knowledge). For example, users of social media sites (Oh & Syn, 2015) and newsgroups (Wasko & Faraj, 2000) were motivated to share knowledge by feelings of reciprocity. Similarly, individuals’ expectations of reciprocal relationships significantly predicted contribution to a voluntary, organizational community of practice (Jeon et al., 2011), and participants’ perceptions of reciprocity in virtual communities significantly and positively predicted both the quantity and quality of their contributions. Moreover, Cheung, Lee, and Lee (2013) found that contributors to an online professional community were more satisfied with their knowledge sharing experience and more likely to engage in future contributions when their expectations of reciprocity were fulfilled. Nonetheless, reciprocity is not uniformly associated with increased knowledge sharing. In fact, Wasko and Faraj (2005) found a negative relationship between expectations of reciprocity and knowledge sharing quantity, and others found no significant association between expectations of direct reciprocity and the quantity or quality of knowledge sharing in virtual communities (Chiu et al., 2006; Hung et al., 2011; Lin, Hung, & Chen, 2009). Changes in the online environment have expanded this one-to-one relationship to a one-to-many relationship. Thus, reciprocity can also be indirect or generalized; that is, a

contribution to a community can be reciprocated in the future by other community members or through access to a community resource (e.g., Ekeh, 1974; Kollock, 1999; Putnam, 2000; van Doorn & Taborsky, 2012; Wasko, Teigland, & Faraj, 2009). In fact, theories of social capital indicate that people volunteer or become civically active because they anticipate a “long term and conjectural” (Putnam, 2000, p. 135) return—not because they expect direct, immediate reciprocity. Scholars have used indirect or generalized reciprocity to help explain motivations for online knowledge sharing with varying success. In their larger knowledge sharing model, Cho et al. (2010) reported that a sense of belonging positively influenced generalized reciprocity, which positively influenced intention to share knowledge in Wikipedia. Similarly, teachers participating in an online community of practice were frequently motivated by the expectation of generalized reciprocity (Hew & Hara, 2007). Thus, in many online communities, a sense of generalized reciprocity is likely more motivating than expected direct reciprocity.

Social Comparison According to social comparison theory, individuals evaluate their opinions and abilities by comparing themselves to groups of similar or valued others, especially when objective, non-social criteria are few or non-existent (Festinger, 1954). If these comparisons reveal a discrepancy between an individual’s opinions or abilities and those of their group, the individual will attempt to reduce the discrepancy by improving his or her deficient opinion or ability. To do so, the individual may try to influence other group members, or—in the case of extreme discrepancies—they may try to redefine, disidentify with, or leave the group. Because most online sharing sites do not enforce or require membership, the easiest response to highly unfavorable social comparisons concerning contributions is to continue social loafing or to leave the group (Rice, 1987). Prior research (Samak & Sheremeta, 2013) has shown that social comparison influences online knowledge contribution. For example, Farzan et al. (2008) implemented a ranking system in an enterprise social networking site that highlighted the most productive contributors, making their reputation visible to other users of the site; following the implementation of this system, users contributed significantly more content to the site. Similarly, Hung et al. (2011) motivated participants in a computer-mediated group brainstorming task to contribute by providing them with information about their unique contributions and their ranking relative to other members of the group. After receiving this information, participants contributed significantly more ideas (quantity), and their ideas were significantly more useful and more creative (quality). Feedback that includes specific social comparison information (i.e., positive or negative comparisons of an individual’s contributions to those of other users) is also associated with future knowledge sharing (Cheshire & Antin, 2008; Harper, Li, Chen, & Konstan, 2007). In a field experiment, Harper et al. (2007) provided some users of a user-generated movie rating and recommendation site with information about their contribution amount compared to other users of comparable tenure in the community. Users who

received this information rated significantly more movies in the following week than users who were not given this information; they were also more likely to express a desire to increase their standing, relative to other users, and they exceeded their lifetime average weekly contribution amount. Interestingly, there was a decline in the number of reviews provided by “above average” contributors in the week following receipt of the information, indicating users’ tendency to alter behavior to correspond to the social comparison norm. Cheng and Vassileva (2005) implemented a hierarchical membership system in a peer-to-peer file and bookmarking system, placing users into one of three membership categories based on their contribution amount relative to that of others. Each membership category was associated with additional functions within the system, and users who belonged to more prestigious membership categories—high contributors—had access to more additional functions than less frequent contributors. Following the introduction of the membership system and the additional functions, users increased their log-ins, time spent online in the network, and the number of resources shared. Approximately half of the users checked their membership data weekly, and virtually all of these users reported increased effort to upgrade their membership status. Surprisingly, many of the additional functions were not widely used by those who became eligible to do so, indicating that the desire to improve their ranking relative to other users was an effective motivator by itself. Ultimately, the implementation of this system resulted in increased quantity of contributions but decreased quality of contributions. One context for social comparison is social facilitation, which refers to the increase in effort that individuals exhibit when they are working coactively and know they can be compared to others (Huang & Fu, 2013). In social facilitation research, individuals work collectively with others, but their individual efforts are identifiable and, therefore, potentially evaluable. As a result, individuals experience increased motivation because they believe they are in competition with others (Cottrell, 1972) or because the possibility of evaluation leads to increased self-awareness (Carver & Scheier, 1981) and concerns about self-presentation (Bond, 1982). Extended to online repositories, this approach suggests that new contributors who experience an ambiguous situation may be influenced to contribute more knowledge when they are provided with statistics about users’ past contribution amounts (i.e., social comparison), especially if these statistics are based on high contributors; however, prior contributors would need different motivations.
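The hierarchical membership scheme described above can be illustrated with a minimal Python sketch that ranks a member against peers and assigns one of three tiers, each unlocking additional system functions. The percentile cut-offs and function names are invented for illustration and do not reproduce Cheng and Vassileva’s (2005) actual rules.

# Hypothetical tier rules; cut-offs and unlocked functions are illustrative.
TIERS = [
    ("gold", 0.90, ["search", "share", "message", "recommend", "moderate"]),
    ("silver", 0.50, ["search", "share", "message"]),
    ("bronze", 0.0, ["search", "share"]),
]

def assign_tier(user_contributions, all_contributions):
    """Place a user into a tier by the share of members whose contribution
    count the user matches or exceeds."""
    peers = len(all_contributions)
    matched_or_exceeded = sum(1 for c in all_contributions if user_contributions >= c)
    percentile = matched_or_exceeded / peers if peers else 0.0
    for name, cutoff, functions in TIERS:
        if percentile >= cutoff:
            return name, functions
    return TIERS[-1][0], TIERS[-1][2]

# Example: a member who matches or exceeds 80% of peers lands in the silver tier.
print(assign_tier(42, [10, 25, 42, 42, 80, 15, 33, 90, 5, 12]))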

Social Loafing The non-excludable and discretionary nature of online repositories, in combination with the costs of contributing, encourages individuals to engage in social loafing, or the tendency to be less motivated and exert less effort when working collectively than alone (Karau & Williams, 1995). The tendency to engage in social loafing is exacerbated when

individuals’ outputs are pooled, with this sum representing the group’s performance (Harkins, 1987). Individuals’ tendency to engage in social loafing has been attributed to the reduced ability to monitor the efforts of any one individual (Karau & Williams, 1995). Harkins (1987) found that individuals participating in collective tasks were more productive and more accurate when their efforts would be evaluated than when they would be pooled. Thus, reduced visibility may be the primary cause of social loafing. Williams, Harkins, and Latané (1981) concluded that identifiability mediates the relationship between collective tasks and social loafing. Based on this line of research, the collective effort model (Karau & Williams, 1995) posits that social loafing will be reduced when individuals believe others can evaluate their efforts, which requires visibility and identifiability, and when they feel an affinity for or identify with the group with which they are working. Thus, expectations of being able to engage in social loafing can serve as a motivation to not share knowledge. The possibility of social loafing is heightened in the online context, due to factors such as anonymity (i.e., low identifiability or visibility), ephemeral group membership, few expectations of future interaction, ease of access, decreased costs of storage and transmission, and the negligible or absent costs of non-participation, or “lurking” (Rice, 1987; Shirky, 2008). Several studies of online knowledge sharing support this line of reasoning. For instance, Samak and Sheremeta (2013) conducted a laboratory experiment in which individuals participated in a computer-mediated public goods game; participants’ contributions were either private (not visible) or public (visible). When all participants’ contributions were visible, participants contributed significantly more than when participants’ contributions were not. Erickson and Kellogg (2000) argued that, in face-to-face interactions, individuals’ behavior is based on social cues that are often absent in mediated contexts. To reduce this problem, they designed a socially translucent knowledge-sharing system that allowed for visibility, awareness, and accountability. As a result, users were more aware of other individuals’ presence and felt more accountable for their actions. Zhang, de Pablos, and Zhou (2013) examined the effect of visibility on sharing behavior in an organizational knowledge management system, taking into account organizational reward and exchange ideology, or the belief that an individual’s work effort is dependent on the organization’s treatment of the individual. Organizational reward was not significantly related to knowledge sharing behavior when visibility was low; however, when visibility was high, the relationship between organizational reward and knowledge sharing behavior was stronger for individuals with high exchange ideology.
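The public goods paradigm invoked in these experiments can be stated precisely. The Python sketch below implements the standard linear public goods payoff, in which each participant keeps whatever they do not contribute and receives an equal return from the multiplied group pot; the endowment and marginal per capita return (mpcr) are illustrative defaults rather than the parameters used by Samak and Sheremeta (2013).

def public_goods_payoffs(contributions, endowment=20, mpcr=0.4):
    """Standard linear public goods payoffs: each player keeps what they do not
    contribute and receives mpcr times the total group contribution. With
    0 < mpcr < 1 < n * mpcr, contributing nothing maximizes individual payoff
    while full contribution maximizes group payoff, which is the social dilemma
    underlying voluntary knowledge sharing."""
    pot_return = mpcr * sum(contributions)
    return [endowment - c + pot_return for c in contributions]

# A free rider (0) earns more than the full contributors (20) in the same group.
print(public_goods_payoffs([0, 20, 20, 20]))  # [44.0, 24.0, 24.0, 24.0]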

Group Belonging and Sociality People may be more likely to share and seek knowledge online when they perceive similarity between themselves and other community members, perceive a sense of belonging with other online community members, or identify with the community or group, even when they do not know the others’ true identity (Chiu et al., 2006; Cho et al., 2010;

Flanagin, Hocevar, & Samahito, 2013; Lampe et al., 2010; Ling et al., 2005; Rashid et al., 2006). Users of a movie recommendation site were significantly more likely to rate a movie when they believed their rating would benefit movie viewers who were similar to them (Rashid et al., 2006). People are likely to seek out, evaluate, and select knowledge from online health information sharing communities using heuristic cues that are based in perceived similarity with other users, even if they do not know them personally (Sillence, Briggs, Harris, & Fishwick, 2007). A desire to maintain interpersonal connectivity has also been positively linked to perceived contribution to an online community (Lampe et al., 2010). Similarly, affiliative tendency, or positive expectations in social relationships, positively corresponds with knowledge sharing amount even when the knowledge was shared with others unknown to the contributor (Lee & Jang, 2010). This supports research that has indicated a potential “social” nature of online knowledge sharers, even when the sites or systems used are mostly or wholly anonymous (Burke et al., 2009; Correa, Hinsley, & Gil de Zúñiga, 2010; Hocevar, Flanagin, & Metzger, 2014; Hughes, Rowe, Batey, & Lee, 2012; Mikal, Rice, Kent, & Uchino, 2015). Belonging influences knowledge sharing via a variety of mediators, including generalized reciprocity, altruism, subjective norms, and self-efficacy (Cho et al., 2010). For example, Lampe et al. (2010) found that Everything users with a strong sense of belonging—or affective commitment toward a community—were more likely to contribute to the online encyclopedia in the future. Similarly, bloggers’ group involvement, or attachment and sense of belonging to a group, was positively related to their contributions (Kim, Zheng, & Gupta, 2011), and Hew and Hara (2007) found that commitment to the group was a strong motivation for knowledge sharing.

Group Identity Research has shown a direct relationship between shared group identity and online content contribution. Individuals’ identification with a particular virtual community can be a significant, positive predictor of both the quantity and quality of knowledge contribution (Chang & Chuang, 2011). Nov, Anderson, and Arazy (2010) showed that individuals who were affiliated with a project team contributed more to SETI@home, a volunteer science computing project, than those who were not. Group identity may also have indirect effects on online knowledge sharing. In Nov et al.’s (2010) study, team membership moderated the negative relationship between project tenure and contribution; in other words, the general decline in contribution over time was not as steep for individuals affiliated with a team. However, in order to be effective, this group identity must be salient or meaningful to the individuals. Flanagin et al. (2013) uncovered an indirect relationship between group identity, motivation, and contribution to an online course rating site. When individuals believed other users of the site were similar to them (i.e., high group identification), they experienced higher motivation to contribute and submitted more knowledge to the site than those who believed

the other users of the site were dissimilar to them (i.e., low group identification). Similarly, Dholakia, Bagozzi, and Pearo (2004) surveyed participants in several types of online communities (e.g., email lists, bulletin boards, newsgroups, chat rooms, multiplayer games) and also discovered an indirect relationship between social identity and contribution. Social identity was a significant predictor of individuals’ desire to interact with their community, which significantly predicted their commitment to participate in a joint action.

Expected Collective Costs and Benefits Expected collective benefits and costs are other-oriented motivations. Collective benefits include establishing ties with others, building a stronger, more sustainable online community, enhancing the relevant profession or community, broadly distributing relevant knowledge throughout a profession or community, and reinforcing normative considerations such as shared values, reciprocity, and conformity (Bagozzi & Tsai, 2014; Chen & Hung, 2010; Heinz & Rice, 2009; Yates, Wagner, & Majchrzak, 2010). Collective costs may occur through difficulty in filtering, collecting, and interpreting relevant knowledge in massive online systems. Additionally, the community’s credibility or reputation may be damaged by the spread of incorrect or harmful knowledge, breach of confidential or proprietary knowledge, and fluctuations in membership (Kraut & Resnick, 2011).

Contextual Factors The preceding sections reviewed self- and other-oriented motivations that drive online knowledge sharing. This section briefly reviews several contextual factors. These factors do not motivate contribution, but they serve as preconditions or moderators of relationships between motivations and online knowledge sharing.

Self-Efficacy Self-efficacy refers to a person’s judgment of his or her ability to execute a behavior and generally correlates with better performance (Bandura, 1997). Some scholars have suggested that self-efficacy can directly influence knowledge sharing; for example, self-efficacy positively relates to the amount of knowledge exchanged via an organizational knowledge management system (Cabrera, Collins, & Salgado, 2006). Studies have also suggested that self-efficacy indirectly affects knowledge sharing, either by mediating the relationship between sense of belonging and intent to contribute knowledge (Cho et al., 2010), or by predicting personal outcome expectations (i.e., positive personal outcomes associated with knowledge sharing), which then predict knowledge sharing

intent (Hsu, Ju, Yen, & Chang, 2007). Self-efficacy can also reduce the perceived difficulty of an online knowledge seeking activity (David, Song, Hayes, & Fredin, 2007). Although self-efficacy can enhance performance, it may or may not be thoughtfully processed or perceived by an individual as a motivation to engage in an activity. Thus, self-efficacy may be more of a precondition of online knowledge sharing than a motivating factor by itself.

Trust Trust is positively associated with the quality of knowledge shared in online virtual communities (Chang & Chuang, 2011; Chiu et al., 2006), and trust based on identification with a virtual community of practice positively relates to knowledge sharing (Usoro, Sharratt, Tsui, & Shekhar, 2007). Indeed, trust and community- or group-related motivations likely go hand in hand, as people are more likely to trust those with whom they share group or community membership (Turner, 1991). However, a certain level of trust may be a necessary precondition of knowledge sharing rather than a motivation itself, as the potential knowledge sharer must have some level of additional motivation (Ardichvili, 2008). Huang et al.’s (2013) study of 17 work groups across the US and Western Europe explained how trust is required for developing both connective and communal knowledge and for knowing what knowledge is valuable and usable for collecting and contributing.

Venue The venue—or the contextual, design, or environmental factors of specific online KSS—may influence the relationship between motivations and sharing. Research on organizational technology-mediated KSS indicates a host of such factors. These include organizational support or incentives, an organizational culture that is supportive of knowledge sharing, job autonomy, level of activity load, team characteristics, and KSS affordances (Cabrera et al., 2006; Ellison, Gibbs, & Weber, 2015; Heinz & Rice, 2009; Rice et al., 2017; Wang & Noe, 2010; Yu, Lu, & Liu, 2010). In contexts such as organizational intranets, where knowledge sharing carries potential professional risk or the sharers may know each other personally, factors such as power differences between seeker and sharer, desire for impression management, evaluation apprehension, and organizational culture may influence sharing (Cabrera et al., 2006; Kim & Lee, 2006; see reviews in Hall, 2001; Heinz & Rice, 2009; Wang & Noe, 2010). Unfortunately, few studies compare motivations for knowledge sharing across venues or sites/systems. One problem with the knowledge sharing literature, particularly in the online context, is that many scholars tend to implicitly presume that all users of all kinds of online knowledge sharing venues have similar motivations for various sharing behaviors. An exception is Moore and Serva’s (2007) review of the literature on motivations for

contribution to virtual communities, which classified 14 motivations identified by prior studies by their relevance in different types of virtual communities at that time. Prior studies had identified motivations to contribute to Internet forums and wikis to include altruism, belonging, recognition, and reputation, while motivations for blog contribution included recognition and reputation, but not altruism or belonging. In Fuglestad et al.’s (2012) study of MovieLens.com users, different reasons for joining the online community were similarly associated with different patterns of knowledge sharing.

Directions for Future Research Studies of intrinsic and extrinsic motivation for online knowledge sharing have struggled with effectively or consistently conceptualizing the two types of motivation, calling into question some of the results in this area of research. As a result, it may be prudent to rely on the operationalization (e.g., enjoyment) rather than the larger concept it is purported to represent (i.e., intrinsic or extrinsic motivation) when interpreting its relationship with online knowledge sharing. Future research should also examine the influence of intrinsic motivation on other motivational factors. For example, higher initial levels of intrinsic motivation can decrease the effects of other potential motivations on knowledge sharing (David et al., 2007), and extrinsic motivations can decrease intrinsic motivations (Cameron & Pierce,  1994). This suggests that intrinsic motivation should be measured even in research that is focused on other motivating factors. Though the fundamental challenge to online knowledge contribution is the social dilemma of individual self-interest versus collective action and public goods, more conceptual work needs to distinguish which motivations are clearly self-oriented versus other-oriented, and which aspects of those motivations involve one or the other, or both. For example, reputation typically refers to an individual motivation, but reputation is created through and embedded in social norms, comparisons, and consequences. Further, both intrinsic and extrinsic motivations can be more or less self- versus other-oriented. It would be useful to better understand when generalized or indirect reciprocity is more or less motivational to users. Possible avenues for future research include community motivations (cultural values and norms such as collectivism or individualism; Kang, Kim, Gloor, & Bock, 2011), and the explicitness and time frame of possible generalized reciprocity. Unfortunately, the literature examining online knowledge sharing only rarely distinguishes between knowledge collecting and contributing (typically referring to “sharing” or “exchange”). For example, Chai, Das, and Rao’s (2011) integrated model of motivations for knowledge-sharing behaviors does not distinguish between collecting and contributing. Nor does much prior research analyze relations between motivations and the quality (accuracy, helpfulness, usefulness, depth, etc.) and quantity (number or length of

contributions) of the content (for two exceptions, see Chang & Chuang, 2011; Phang, Kankanhalli, & Huang, 2014). Understanding what motivates users to share high quality knowledge is critical to the larger understanding of factors that influence the overall quality of Internet information (especially, for example, health information). One of the attributes of information that can influence its value is its accuracy (Laxminarayan & Macauley, 2012). In traditional media effects and persuasion literature, message characteristics such as information accuracy can influence perceptions and behavior (e.g., Millar & Millar, 2000; Petty & Cacioppo, 1986). In the online context, people who are knowledgeable about and involved in a topic may identify inaccuracies about that topic and thus be motivated to contribute content of their own (e.g., comments or edits) to correct it. This is especially valued in open source software communities and the Wikipedia community (Kittur & Kraut, 2008; Nov, 2007; Schroer & Hertel, 2009). Few studies have examined motivations for online sharing of high-quality knowledge, though Chang and Chuang (2011) and Chiu et al. (2006) reported that trust, community-related outcome expectations, shared vision, social interaction, and reputation significantly predict shared knowledge quality. Some research in virtual communities suggests that group identification positively relates to the quantity but not quality of shared knowledge (Chiu et al., 2006). Interestingly, out of many factors that predicted knowledge sharing quantity (social interaction ties, norms of reciprocity, identification, shared language, and shared vision), only trust, community-related outcome expectations, and shared vision (i.e., whether members share the same goals and values about knowledge sharing) also significantly predicted knowledge sharing quality. While group identification was not a significant predictor of knowledge quality, other group- or community-based variables (i.e., shared values/vision, community outcome expectations) were. Few studies consider both individual and collective levels of both costs and benefits as motivations for online sharing, as reviewed by Heinz and Rice (2009); an exception is Brake’s (2014) consideration of the benefits of online community knowledge sharing at both levels. Finally, while much research considers a wide variety of aspects of KSS venues, few relate those to motivations for sharing. Extending Moore and Serva’s (2007) approach, it would be useful to conceptually or operationally define multiple motivations and venues, and to provide a theoretical rationale for the relationships found between each motivation and venue.

Conclusion Digital KSS (such as online repositories, networks of practice, and social network sites) have fostered the development of diverse online public goods, such as connectivity among known and unknown users worldwide as well as communality of shared content. This sharing has raised the question of why individuals would voluntarily share knowledge, when rational actor theory and self-interest would argue against engaging in

directly unreciprocated activity. Identifying and explaining the motivations for such online knowledge sharing has therefore generated considerable research, though the findings are often contradictory and confounded. By identifying two categories of motivations, based on self-oriented and other-oriented factors, along with contextual factors such as self-efficacy, trust, and the social and technological venue, this chapter has aimed to clarify distinctions and summarize considerable research on this fundamental question. Figure 21.1 portrays the organizing framework of motivations for online knowledge sharing. Yet a variety of research questions remain. As digital technology continues to evolve and diffuse, answering these questions will help system designers, practitioners, researchers, and users better understand how to motivate the important process of sharing knowledge online. However, ongoing developments in digital technology will likely reveal other motivations, possibilities, obstacles, and contextual factors.

Figure 21.1  Summary of review of motivations for online knowledge sharing. The figure depicts:
• Contextual factors: self-efficacy; trust; venue (social, KSS)
• Tensions: social dilemma; collective action; public goods
• Self-oriented motivations for knowledge sharing: enjoyment/entertainment; altruism; expertise; feedback; reputation; incentives; expected individual costs/benefits; intrinsic/extrinsic
• Other-oriented motivations for knowledge sharing: reciprocity; social comparison; social loafing; group belonging and sociality; group identity; expected collective costs/benefits; intrinsic/extrinsic
• Knowledge sharing: collecting; contributing; quality; quantity
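For readers who want to code studies or system features against this framework, the categories in Figure 21.1 can be restated as a simple data structure. The Python sketch below merely mirrors the figure; the key names are labels added here and imply no ordering or weighting beyond what the figure shows.

# The organizing framework of Figure 21.1 restated as a dictionary.
FRAMEWORK = {
    "contextual_factors": ["self-efficacy", "trust", "venue (social, KSS)"],
    "tensions": ["social dilemma", "collective action", "public goods"],
    "self_oriented_motivations": [
        "enjoyment/entertainment", "altruism", "expertise", "feedback",
        "reputation", "incentives", "expected individual costs/benefits",
        "intrinsic/extrinsic",
    ],
    "other_oriented_motivations": [
        "reciprocity", "social comparison", "social loafing",
        "group belonging and sociality", "group identity",
        "expected collective costs/benefits", "intrinsic/extrinsic",
    ],
    "knowledge_sharing": ["collecting", "contributing", "quality", "quantity"],
}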


References Allen, B. (1990). Information as an economic commodity. American Economic Review, 80, 268–273. Andreoni, J. (1990). Impure altruism and donations to public goods: A theory of warm-glow giving. Economic Journal, 401, 464–477. https://doi.org/10.2307/2234133 Arazy, O., & Gellatly, I.  R. (2012). Corporate wikis: The effects of owners’ motivation and behavior on group members’ engagement. Journal of Management Information Systems, 29(3), 87–116. doi:10.2753/MIS0742-1222290303 Ardichvili, A. (2008). Learning and knowledge sharing in virtual communities of practice: Motivators, barriers, and enablers. Advances in Developing Human Resources, 10(4), 541–554. doi:10.1177/1523422308319536 Bagozzi, R. P. & Tsai, H-T. (2014). Contribution behavior in virtual communities: Cognitive, emotional, and social influences. MIS Quarterly, 38(1), 143–164. https://doi.org/10.25300/ misq/2014/38.1.07 Bandura, A. (1997). Self-efficacy: The exercise of control. New York, NY: W. H. Freeman and Company. Barab, S. A. (2003). An introduction to the special issue: Designing for virtual communities in the service of learning. Information Society, 19, 197–201. https://doi.org/10.1080/01972240309467 Batson, C. D. (2014). The altruism question: Toward a social-psychological answer. New York, NY: Psychology Press. Bimber, B., Flanagin, A. J., & Stohl, C. (2005). Reconceptualizing collective action in the contemporary media environment. Communication Theory, 15, 365–388. doi:10.1111/j.1468–2885 .2005.tb00340.x Blau, P. M. (1964). Exchange and power in social life. New York, NY: Wiley. Bock, G. W., Zmud, R. W., Kim, Y. G., & Lee, J. N. (2005). Behavioral intention formation in knowledge sharing: Examining the roles of extrinsic motivators, social-psychological forces, and organizational climate. MIS Quarterly, 29, 87–111. https://doi.org/10.2307/25148669 Bond, C. (1982). Social facilitation: A self-presentational view. Journal of Personality and Social Psychology, 42, 1042–1050. Brake, D.  R. (2014). Are we all online content creators now? Web 2.0 and digital divides. Journal of Computer-Mediated Communication, 19(3), 591–609. doi:10.1111/jcc4.12042 Brown, J. S., & Duguid, P. (2000). The social life of information. Boston, MA: Harvard Business Press. Burke, M., Marlow, C., & Lento, T. (2009). Feed me: Motivating newcomer contribution in social network sites. In Proceedings of CHI 2009 (pp. 945–954). New York, NY: ACM Press. https://doi.org/10.1145/1518701.1518847 Cabrera, Á., Collins, W. C., & Salgado, J. F. (2006). Determinants of individual engagement in knowledge sharing. The International Journal of Human Resource Management, 17(2), 245–264. https://doi.org/10.1080/09585190500404614 Cameron, J., & Pierce, W. D. (1994). Reinforcement, reward, and intrinsic motivation: A metaanalysis. Review of Educational Research, 64(3), 363–423. https://doi.org/10.2307/1170677 Carver, C., & Scheier, M. (1981). The self-attention-induced feedback loop and social facilitation. Journal of Experimental Social Psychology, 17, 545–568. https://doi.org/10.1016/ 0022–1031(81)90039–1 Case, D. O. & Given, L. M. (2016). Looking for information: A survey of research on information seeking, needs, and behavior (4th ed.). Bingley, UK: Emerald Group Publishing Ltd.

594   Kristin Page Hocevar ET AL. Chai, S., Das, S., & Rao, H. R. (2011). Factors affecting bloggers’ knowledge sharing: An investigation across gender. Journal of Management Information Systems, 28(3), 309–342. doi:10.2753/MIS0742-1222280309 Chang, H. H., & Chuang, S. S. (2011). Social capital and individual motivations on knowledge sharing: Participant involvement as a moderator. Information & Management, 48(1), 9–18. https://doi.org/10.1016/j.im.2010.11.001 Chen, C. J. & Hung, S. W. (2010). To give or to receive? Factors influencing members’ knowledge sharing and community promotion in professional virtual communities. Information & Management, 47(4), 226–236. doi:10/1016/j.im.2010.03.001 Chen, S., Duckworth, K., & Chaiken, S. (1999). Motivated heuristic and systematic processing. Psychological Inquiry, 10(1), 44–49. https://doi.org/10.1207/s15327965pli1001_6 Cheng, R., & Vassileva, J. (2005, January). User motivation and persuasion strategy for peerto-peer communities. Proceedings of the 38th annual Hawaii international conference on system sciences, user motivation and persuasion strategy for peer-to-peer communities (10 pp.). Hawaii: Institute of Electrical and Electronics Engineers (IEEE). https://doi. org/10.1109/hicss.2005.653 Cheshire, C., & Antin, J. (2008). The social psychological effects of feedback on the production of Internet information pools. Journal of Computer-Mediated Communication, 13, 705–727. doi:10.1111/j.1083–6101.2008.00416.x Cheung, C. M., Lee, M. K., & Lee, Z. W. (2013). Understanding the continuance intention of knowledge sharing in online communities of practice through the post-knowledge-sharing evaluation processes. Journal of the American Society for Information Science and Technology, 64, 1357–1374. doi:10.1002/asi.22854 Chiu, C. M., Hsu, M. H., & Wang, E. T. G. (2006). Understanding knowledge sharing in virtual communities: An integration of social capital and social cognitive theories. Decision Support Systems, 42(3), 1872–1888. doi:10.1016/j.dss.2006.04.001 Cho, H., Chen, M., & Chung, S. (2010). Testing an integrative theoretical model of knowledgesharing behavior in the context of Wikipedia. Journal of the American Society for Information Science and Technology, 61(6), 1198–1212. doi:10.1002/asi.21316 Clary, E.  G., Snyder, M., Ridge, R.  D., Copeland, J., Stukas, A.  A., Haugen, J., & Miene, P. (1998). Understanding and assessing the motivations of volunteers: A functional approach. Journal of Personality and Social Psychology, 74(6), 1516–1530. https://doi.org/10.1037 //0022–3514.74.6.1516 Connolly, T, & Thorn, B. K. (1990). Discretionary databases: Theory, data, and implications. In J.  Fulk, & C.W.  Steinfield (Eds.), Organizations and communication technology (pp. 219–233). Newbury Park, CA: Sage. Correa, T., Hinsley, A. W., & Gil de Zúñiga, H. (2010). Who interacts on the Web? The intersection of users’ personality and social media use. Computers in Human Behavior, 26(2), 247–253. doi:10.1016/j.chb.2009.09.003 Cottrell, N. (1972). Social facilitation. In C. McClintock (Ed.), Experimental social psychology (pp. 185–236). New York: Holt, Rinehart & Winston. Cova, B., Pace, S., & Skålén, P. (2015). Brand volunteering: Value co-creation with unpaid consumers. Marketing Theory, 15(4), 465–485. doi:10.1177/1470593115568919 David, P., Song, M., Hayes, A., & Fredin, E. S. (2007). A cyclic model of information seeking in hyperlinked environments: The role of goals, self-efficacy, and intrinsic motivation. 
International Journal of Human-Computer Studies, 65(2), 170–182. doi:10.1016/j. ijhcs.2006.09.004

Motivations for Online Knowledge Sharing   595 Dawes, R. M. (1980). Social dilemmas. Annual Review of Psychology, 31, 169–193. https://doi. org/10.1146/annurev.ps.31.020180.001125 De Dreu, C. K. W., & Nauta, A. (2009). Self-interest and other-orientation in organizational behavior: Implications for job performance, prosocial behavior, and personal initiative. Journal of Applied Psychology, 94(4), 913–926. doi:10.1037/a0014494 Deci, E. L., & Ryan, R. M. (1985). Intrinsic motivation and self-determination in human behavior. New York: Plenum. Deci, E. L., & Ryan, R. M. (2013). Motivation, personality, and development within embedded social contexts: An overview of self-determination theory. In R. M. Ryan (Ed.), The Oxford handbook of human motivation (pp. 85–107). Oxford, UK: Oxford University Press. Deng, L. & Poole, M. S. (2011). Knowledge utilization in electronic networks of practice. In H.  E.  Canary & R.  D.  McPhee (Eds.), Communication and organizational knowledge: Contemporary issues for theory and practice (pp. 209–220). New York: Routledge. Dholakia, U. M., Bagozzi, R. P., & Pearo, L. K. (2004). A social influence model of consumer participation in network-and small-group-based virtual communities. International Journal of Research in Marketing, 21, 241–263. https://doi.org/10.1016/j.ijresmar.2003.12.004 Ekeh, P. (1974). Social exchange theory: The two traditions. Boston: Harvard University Press. Ellison, N. B. & boyd, D. M. (2014). Sociality through social network sites. In W. H. Dutton (Ed.), The Oxford handbook of internet studies (pp. 153–172). Oxford, UK: Oxford University press. Ellison, N. B., Gibbs, J. L., & Weber, M. S. (2015). The use of enterprise social network sites for knowledge sharing in distributed organizations: The role of organizational affordances. American Behavioral Scientist, 59(1), 103–123. doi:10.1177/000276421450510 Erickson, T, & Kellogg, W A. (2000). Social translucence: An approach to designing systems that support social processes. ACM Transactions on Computer-Human Interaction, 7, 59–83. https://doi.org/10.1145/344949.345004 Farzan, R., & Brusilovsky, P. (2011). Encouraging user participation in a course recommender system: An impact on user behavior. Computers in Human Behavior, 27, 276–284. doi:10/1016/j.chb.2010.08.005 Farzan, R., DiMicco, J. M., Millen, D. R., Dugan, C., Geyer, W., & Brownholtz, E. A. (2008, April). Results from deploying a participation incentive mechanism within the enterprise. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 563–572). ACM. https://doi.org/10.1145/1357054.1357145 Fehr, E., & Fischbacher, U. (2003). The nature of human altruism. Nature, 425(6960), 785–791. https://doi.org/10.1038/nature02043 Festinger, L. (1954). A theory of social comparison processes. Human Relations, 7(2), 117–140. https://doi.org/10.1177/001872675400700202 Flanagin, A. J., & Metzger, M. J. (2013). Trusting expert- versus user-generated ratings online: The role of information volume, valence, and consumer characteristics. Computers in Human Behavior, 29(4), 1626–1634. doi:10.1016/j.chb.2013.02.001 Flanagin, A.  J., Hocevar, K.  P., Samahito, S.  N. (2013). Connecting with the user-generated web: How group identification impacts online information sharing and evaluation. Information, Communication, & Society, 17(6), 683–694. doi:10.1080/1369118x.2013.808361 Fuglestad, P. T., Dwyer, P. C., Moses, J. F., Kim, J. S., Mannino, C. A., Terveen, L., & Snyder, M. (2012). What makes users rate (share, tag, edit & )? 
Predicting patterns of participation in online communities. In Proceedings of CSCW ‘12 (pp. 969–978). New York, NY: ACM Press. https://doi.org/10.1145/2145204.2145349

596   Kristin Page Hocevar ET AL. Fulk, J., Flanagin, A.  J., Kalman, M.  E., Monge, P.  R., & Ryan, T. (1996). Connective and ­communal public goods in interactive communication systems. Communication Theory 6(1), 60–87. https://doi.org/10.1111/j.1468–2885.1996.tb00120.x Fulk, J., Heino, R., Flanagin, A. J., Monge, P. R., & Bar, F. (2004). A test of the individual action model for organizational information commons. Organization Science, 15(5), 569–585. https://doi.org/10.1287/orsc.1040.0081 Hall, H. (2001). Input-friendliness: Motivating knowledge sharing across intranets. Journal of Information Science, 27(3), 139–146. doi:10.1177/016555150102700303 Hardin, G. (1968). The tragedy of the commons. Science, 162, 1243–1248. Harkins, S.  G. (1987). Social loafing and social facilitation. Journal of Experimental Social Psychology, 23, 1–18. https://doi.org/10.1016/0022-1031(87)90022-9 Harper, F. M., Li, S. X., Chen, Y., & Konstan, J. A. (2007). Social comparisons to motivate contributions to an online community. In Y. de Cort, W. Ijsselsteijn, C. Midden, B. Eggen, & B. J. Fogg (Eds.), Persuasive technology: Second international conference on persuasive technology, PERSUASIVE 2007, Palo Alto, CA, USA, April 26–27, 2007, Revised Selected Papers (pp. 148–159). Berlin, Germany: Springer Berlin Heidelberg. https://doi. org/10.1007/978-3-540-77,006-0_20 Heinz, M. & Rice, R. E. (2009). An integrated model of knowledge sharing in contemporary communication environments. In C.  S.  Beck (Ed.), Communication yearbook 33 (pp. 134–175). New York, NY: Routledge. Hew, K. F., & Hara, N. (2007). Empirical study of motivators and barriers of teacher online knowledge sharing. Educational Technology Research and Development, 55(6), 573–595. doi:10.1007/s11423-007-9049-2 Hoadley, C. (2012). What is a community of practice and how can we support it? In D. Jonassen & S. Land (Eds.), Theoretical foundations of learning environments (pp. 286–300). New York, NY: Routledge. Hocevar, K.  P., Flanagin, A.  J., & Metzger, M.  J. (2014). Social media self-efficacy and ­information evaluation online. Computers in Human Behavior, 39, 254–262. doi: 10.1016/j. chb.2014.07.020 Hoisl, B., Aigner, W., & Miksch, S. (2007). Social rewarding in wiki systems—Motivating the community. In D. Schuler (Ed.), Online communities and social computing (pp. 362–371). Berlin, Germany: Springer Berlin Heidelberg. Hollingshead, A. B., Fulk, J., & Monge, P. (2002). Fostering intranet knowledge sharing: An integration of transactive memory and public goods approaches. In P. Hinds, & S. Kiesler (Eds.), Distributed work (pp. 335–355). Cambridge, MA: MIT Press. Hsu, M. H., Ju, T. L., Yen, C. H., & Chang, C. M. (2007). Knowledge sharing behavior in virtual communities: The relationship between trust, self-efficacy, and outcome expectations. International Journal of Human-Computer Studies, 65(2), 153–169. doi:10.1016/j.ijhcs.2006.09.003 Huang, M., Barbour, J., Su, C., & Contractor, N. (2013). Why do group members provide information to digital knowledge repositories? A multilevel application of transactive memory theory. Journal of the Association for Information Science and Technology, 64(3), 540–557. doi:10.1002/asi.22805 Huang, S. W., & Fu, W. T. (2013, April). Don’t hide in the crowd!: Increasing social transparency between peer workers improves crowdsourcing outcomes. Proceedings of the SIGCHI conference on human factors in computing systems (pp. 621–630). Paris, France: Association for Computing Machinery. 

SECTION 8

GOVERNANCE AND ACCOUNTABILITY

chapter 22

ESRC Review: Governance and Security
Simeon J. Yates, Gerwyn Jones, William H. Dutton, and Elinor Carmi

Introduction

This chapter provides an overview of the analyses of the literature, the Delphi process, and the relevant workshops for the Governance and Security domain. The chapter first explores the results of the various digital humanities analyses of the literature and the review of methods and theory. In the case of this domain, the Delphi process had not produced as clear a set of ideas as it did for other domains by the close of the second round. Not only did the responses move away from the initial scoping questions, there was also a great deal of overlap with the Citizenship and Politics domain (chapter 16). We have therefore focused on those topics that are distinct from the other domains. The recommendations for areas of future study are presented in the last section. As a reminder, the initial ESRC scoping questions for this area of work were:
• What are the challenges of ethics, trust and consent in the digital age?
• How do we define responsibility and accountability in the digital age?

Initial Comments

The naming of this domain as “Governance and Security” broadened the potential responses to a wider range of topics than these initial scoping questions, as became clear in the Delphi review, including elements from other domains. Separating out the distinctive elements, we identified an emphasis on issues of privacy, law, and governance. However, issues of trust, accountability, and responsibility, and their governance, remain important.

A key issue was brought up by team members and by stakeholders in Digital Leader salon events—namely, the relative success of some, but more often the well-documented failure of other, government projects to successfully deploy digital technologies for governance. There is little literature from this review that empirically documents the failure, over the last quarter century, of successive governments to exploit the particular communicative and networking affordances of digital technology in the interests of more equitable, inclusive, and cost-effective government. As will be discussed later, the absence from the Delphi review and the literature of detailed work on success and failure factors may indicate a key area for future work.

Literature Analysis

Topics

The literature analysis was designed to produce two sets of data: first, the key topics within the existing literature, which also allow comparison with the areas of importance identified by the Delphi review; and second, a content analysis of the literature exploring the predominance of specific theories, methods, and approaches. Table 22.1 lists the most common concepts (those occurring in at least 2% of the cases) identified in the Round 1 literature. Table 22.2 lists the concept pairs. All the literature collected from both rounds was analysed using WordStat, which identified 15 topics, shown in Table 22.3.

Table 22.1  Analysis Concepts Ranked

Topic          Percent
Child             11.1
Datum              7.3
Privacy            6.8
Law                5.0
Internet           4.7
Information        4.4
Parent             4.0
Governance         3.9
Protection         3.0
Innovation         2.9
Health             2.8
Government         2.2
Inspectorate       2.1
Code               2.0

Note: Concepts occurring in at least 2% of the cases.

Table 22.2  Concept Pairings—Main and Secondary Concepts

child 19.15: childhood 1.47; harm 3.10; literacy 1.87; parent 5.42; pornography 3.55; robot 3.75
code 3.50: regulation 2.16; zip 1.34
datum 12.67: directive 2.57; inspectorate 1.02; protection 6.40; regulator 1.34; request 1.34
governance 6.81: internet 5.75; security 1.06
government 3.83: probability .98; regulation 1.67
health 4.85: item 1.47; locus 1.14; score 1.10; topic 1.14
law 8.60: principle 1.92; protection 2.89; rule 2.81; weber .98
innovation 4.97: logic 1.10; meaning 1.87; police 2.00
inspectorate 3.67: parent 1.79; school 1.87
parent 6.97: quality 1.92; restriction .73; school 3.63; visit .69
internet 8.03: legitimacy 1.83; para .65; protocol 2.53; religion .90; religiosity 1.06; self-regulation 1.06
privacy 11.74: protection 6.60; regulator 1.55; rev .61; security 2.97
protection 5.22: regulation 1.55; right 3.67

Note: The first term in each row is the main concept, with its overall percentage; the terms that follow are the related sub-concepts, each with its percentage.
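The concept and concept-pair figures above were produced with WordStat; “datum” appears to be that software’s lemmatised form of “data.” Purely as an illustration of the kind of counts these tables report, the following minimal Python sketch computes the percentage of documents mentioning each concept and, for one main concept, the co-occurring secondary concepts. The corpus and concept list are hypothetical stand-ins, not the project’s data or coding dictionary.

# Minimal sketch of concept-frequency and concept-pair counts (illustrative only).

corpus = [
    "Parents and children negotiate internet use and privacy at home.",
    "Data protection law and the role of privacy regulators.",
    "Internet governance, security, and the regulation of platforms.",
]

concepts = ["child", "privacy", "law", "internet", "parent",
            "governance", "protection", "security"]

def mentions(text, concept):
    # Crude presence test: substring match on the lowercased text
    # (a real analysis would use lemmatisation, as WordStat does).
    return concept in text.lower()

# Table 22.1-style figure: percentage of cases (documents) mentioning each concept.
n_docs = len(corpus)
case_pct = {c: 100.0 * sum(mentions(d, c) for d in corpus) / n_docs for c in concepts}

# Table 22.2-style figure: for one main concept, counts of secondary concepts
# appearing in the same documents.
def co_occurrence(main, secondaries):
    with_main = [d for d in corpus if mentions(d, main)]
    return {s: sum(mentions(d, s) for d in with_main) for s in secondaries}

if __name__ == "__main__":
    for concept, pct in sorted(case_pct.items(), key=lambda kv: -kv[1]):
        print(f"{concept:<12} {pct:5.1f}%")
    print(co_occurrence("privacy", ["protection", "security", "regulator"]))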

There is a much stronger correlation between the topic and concept mapping for this domain (see Table 22.4). There appear to be five major topics in this literature:
• State use of digital media—especially with regard to surveillance of social movements and protest
• Internet regulation and governance—both national and international
• Children’s use of digital media—both protection and regulation
• Regulation and governance of automated systems
• Deception in digital media
Looking at the variation in key concepts from 2000 to the present, there is a shift from general issues around the regulation of technology use to a much stronger contemporary focus on privacy and data protection. One consistent conceptual focus is on the use of digital technologies by children.

Table 22.3  WordStat Analysis of Topics

Social movements and protest communication (SOCIAL; COMMUN; SOCIETI; POLIT; MEDIA; ORGAN; PROTEST; MOVEMENT; THEORI): eigenvalue 1.80; freq 40,072; cases 592 (97.9%)
Internet governance (GOVERN; SECTOR; PRIVAT; SERVIC; PUBLIC; POLICI; INTERNET; REGUL; BUSI): eigenvalue 1.53; freq 35,868; cases 578 (95.5%)
Measurement (VARIABL; WA; MEASUR; TEST; RATE; PARTICIP; AVERAG; EFFECT): eigenvalue 3.31; freq 23,554; cases 572 (94.6%)
Automation (HUMAN; AUTONOM; AGENT; ROBOT; COMPUT; SYSTEM): eigenvalue 1.58; freq 16,711; cases 565 (93.4%)
EU commission and privacy (EUROPEAN; PRIVACI; COMMISS; PROTECT; EU; DATA; IMPACT; ASSESS): eigenvalue 1.72; freq 22,750; cases 562 (92.9%)
Urban migration mobile (CHINA; MIGRANT; CHINES; URBAN; CITI; PHONE; MOBIL; CLASS; LABOR; ICT): eigenvalue 2.95; freq 9095; cases 522 (86.3%)
Social media (FACEBOOK; SITE; TWITTER; USER; GOOGL): eigenvalue 2.13; freq 8508; cases 504 (83.3%)
Law enforcement (LAW; LEGAL; COURT; ENFORC; REGUL; RULE; PROTECT; CRIMIN): eigenvalue 9.67; freq 15,647; cases 497 (82.2%)
Marxist analysis (CAPIT; CAPITALIST; MARX; LABOUR; ECONOMI; FUCH; PRODUCT): eigenvalue 2.35; freq 7147; cases 493 (81.5%)
Education (TEACHER; LEARNER; STUDENT; CLASSROOM; LEARN; EDUC): eigenvalue 1.83; freq 6712; cases 484 (80.0%)
Children’s internet use (CHILDREN; PARENT; CHILD; LIVINGSTON; RISK): eigenvalue 2.49; freq 7451; cases 436 (72.1%)
Voting (ELECT; VOTE; PARTI; DEMOCRAT): eigenvalue 1.92; freq 3525; cases 383 (63.3%)
Employment (EMPLOY; EMPLOYE; WORKER): eigenvalue 1.98; freq 2481; cases 262 (43.3%)
Deception (DECEPT; DECEIV; TRUTH; DETECT; BURGOON): eigenvalue 1.68; freq 2871; cases 256 (42.3%)
Surveillance (VEILLANC; SUR): eigenvalue 1.93; freq 1218; cases 90 (14.9%)
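WordStat’s topic modelling itself (which yields the eigenvalues above) is not reproduced here. As a minimal illustration of how the freq, cases, and cases (%) columns can be read, the sketch below counts stemmed-keyword hits for one topic over a small hypothetical set of documents; the document strings are assumptions, and the keyword stems are taken from the Internet governance row above.

import re

# Hypothetical documents standing in for the reviewed literature (illustrative only).
documents = [
    "The governance of the internet involves regulation of private sector services.",
    "Children and parents negotiate online risk at home.",
    "Public policy and internet regulation in the EU business environment.",
]

# Keyword stems for one WordStat topic, as listed in Table 22.3 (lowercased).
internet_governance_stems = ["govern", "sector", "privat", "servic", "public",
                             "polici", "internet", "regul", "busi"]

def stem_hits(doc, stems):
    # Count tokens that begin with any of the topic's keyword stems
    # (a crude prefix match; WordStat applies proper stemming).
    tokens = re.findall(r"[a-z]+", doc.lower())
    return sum(any(tok.startswith(stem) for stem in stems) for tok in tokens)

hits_per_doc = [stem_hits(d, internet_governance_stems) for d in documents]
freq = sum(hits_per_doc)                    # total keyword occurrences ("Freq")
cases = sum(h > 0 for h in hits_per_doc)    # documents containing the topic ("Cases")
cases_pct = 100.0 * cases / len(documents)  # share of documents ("Cases (%)")

print(freq, cases, f"{cases_pct:.1f}%")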

In terms of the most frequently occurring topic pairs, early on (2000–2004; Figure 22.1) the primary concerns were about Internet use or access, information and technology, cable companies and cable policies concerning access and broadband, technical policy issues such as information and privacy or government and statistics or domain names, computers and education (schools, teachers, students), technology and women, and children and media.1

Table 22.4  Comparison between Concepts and WordStat Topics

Concepts marked against each WordStat topic:
Internet governance: Internet; Code
Measurement: Health
EU commission and privacy: Datum; Internet; Protection
Law enforcement: Law; Innovation
Education: Inspectorate
Children’s internet use: Child; Parent; Inspectorate
Voting: Government
Deception: Privacy
Surveillance: Privacy
No concepts marked: Social movements and protest communication; Automation; Urban migration mobile; Social media; Marxist analysis; Employment

Figure 22.1  Governance and security 2000–2004: Most frequent concept pairs.

By 2012–2016 (Figure 22.2) emphases had shifted to greater coverage of parents and school or education, data and information privacy and protection, and concerns with children and risk or various media (Internet, medium, social networking sites), as well as more explicit treatment of relationships among data, information, education, and school.

Figure 22.2  Governance and security 2012–2016: Most frequent concept pairs.

State use of digital systems. In contrast to the Citizenship and Politics domain (chapter 16), where engagement with politics and political action via digital media is a focus, here the question is about the extent to which governments monitor, control, or utilize digital media. Within this are debates around the rights of citizens, the use and sharing of citizens’ data, and broader concerns about “surveillance societies.” For example, Elahi (2009) sets out five dilemmas about the way data is used, how it is stored, for what purposes, and who has access to it. The first dilemma is about tensions and shifts among cultures, values, and identities, and attempts to find a shared practice and set of regulatory practices for them all. The second is about individual rights versus society’s wellbeing. The third concerns conflicting attitudes about data ownership. The fourth is about tensions around scale in terms of temporal, geographical, and political environments. And the fifth is about trust and control. As Elahi notes:

The institutions tasked with regulation of the physical world are not equipped with the necessary speed and flexibility to undertake a similar role in the virtual world; the scale and scope of the task are materially different. The challenge for the future of privacy and consent in the cyberspace domain is one of adaptation. (Elahi, 2009, p. 114)

One of the key tensions in the literature, and potentially between the work presented in this domain and that of the Citizenship and Politics domain, is that between individual and state needs, and thus rights and the balance of power in political systems. Again, as Elahi notes: While digital technology has increased the power of the individual, it has also challenged the human rights of the individual with regard to privacy and consent. The same technologies that have forced governments and institutions to become more transparent and driven down costs for the consumer have also enabled

governments and businesses to collect personal information on the public to an unprecedented degree.  (Elahi, 2009, p. 115)

The use of digital and media technologies to monitor citizens has been widely debated. Lyon has written on this at length, noting how digital surveillance has become an inescapable part of everyday life in the global north. Lyon suggests thinking about surveillance as “social sorting” in order to move the debate beyond a singular focus on privacy. Lyon argues that we have seen “the rise of the safety state,” which enables social sorting and other discriminatory judgments through the processing of personal data for the purposes of care or control. These automated surveillance systems enable the state to gain power over citizens through influencing or managing people’s behaviors. He shows how digital surveillance has become everyday practice in a wide variety of settings, such as schools, day care for children, or the protection of homes. Importantly, Lyon notes that this is not necessarily in all cases an attempt to create a centralized surveillance state. Surveillance is often fragmented among organizations and between state and private organizations. As Lyon notes: In many cases, however, surveillance is the by-product, accompaniment, or even unintended consequence of other processes and practices. It is sometimes not until some system is installed for another purpose that its surveillance potential becomes apparent . . . Retailers may install ceiling mounted cameras in stores to combat shoplifting only to discover that this is also a really good way of monitoring employees as well.  (Lyon, 2010, p. 111)

As has become very apparent with the rise of “Big Data,” everyday interactions become points of surveillance: . . . the mundane activities of shopping using credit and loyalty cards may also contribute to profoundly significant processes of automated social sorting into newer spatially based social class categories that modify older formations of class and status.  (Lyon, 2010, p. 112)

One of the major social concerns reflected in the research is the use of data by the state to directly monitor citizens. Brown and Korff (2009) argue that the disproportionate power of surveillance enabled by the Internet and conducted by police, intelligence agencies, and government is problematic for democracy and the rule of law. They raise concerns that the grounds for suspicion that agencies such as the police need in order to justify monitoring and surveillance of digital media have become increasingly vague—especially in the context of counter-terrorism. These debates have been played out in a range of policy measures and legislation in Europe over the last decade and have become more pressing with the rise of social media. The recent European “General Data Protection Regulation” (GDPR, 2016) has placed some very restrictive requirements on uses of citizens’ data, though with a range of caveats for policing and security work. However, the GDPR’s principles fit with Brown and Korff’s argument that data should not be held

“just in case” (2009, p. 129). The political, social, and research debate around the justified use of citizens’ data and the surveillance of citizens is on-going. Those such as Brown and Korff who are concerned about the regulation of digital surveillance argue the following: . . . the collection of data on ‘contacts and associates’ (i.e., on persons not suspected of involvement in a specific crime or of posing a threat), the collection of information through intrusive, secret means (telephone tapping and email interception) and the use of ‘profiling’ techniques, and indeed ‘preventive’ policing generally, must be subject to a particularly strict ‘necessity’ and ‘proportionality’ test, and surrounded with particularly strong safeguards.  (Brown & Korff, 2009, p. 130)

Such concerns overlap with the next topic, namely the regulation—by both government and commercial platform providers—of digital media and technology use. Internet regulation and governance. An obvious area of research work is the governance of digital media themselves. This covers a range of issues, including attempts by national governments and international agencies to regulate digital media, their use, formats, and access. It also covers the governance and regulation of technical standards and data protection, through to the legal underpinnings of our use of specific platforms. A key area is the manner in which commercial providers of digital technologies, platforms, and media regulate their use, and the use they make of citizens’ data. This ranges from copyright issues through to “acceptable” use as well as commercial use of collected data. For example, Pollach (2005) examines commercial websites’ privacy policies and checks whether they adequately communicate data handling practices so that people can make informed decisions when using the web. Her analyses show that most of the policies are written in complicated and vague ways, justify placing cookies on people’s computers, and set the default to sending unsolicited marketing communication. Only some of the websites openly and clearly admit to sharing or selling people’s data to third-party companies. Pollach argues that these companies use several communicative strategies, such as mitigation and enhancement, obfuscation, relationship building, and persuasive appeals, noting: Overall, the analysis of communicative strategies in privacy policies has revealed that they contain vague statements, which prevents informed consent on the part of Internet users and may lead to ethical problems if they misinterpret the claims made in these documents.  (Pollach, 2005, p. 232)

Such behavior has policy and ethical implications: Web merchants do not seem to abide by these common ethical principles when they communicate their privacy standards. It seems that they still need to learn how to use the power the Internet has bestowed on them in an ethical and respectful way, for example by posting clearly and unequivocally worded privacy policies. (Pollach, 2005, p. 232)

At a national and transnational government level there has been a sustained policy drive to support the development of digital technologies, digital services, and, more broadly, a “digital society.” Mansell (2014) examines the history of policy initiatives designed to push the “Digital Agenda” from the mid-1980s to the present. The goal of the work is to provide a critical assessment of the values and priorities these policy agendas portray. Mansell focuses on the flagship European initiative “A Digital Agenda for Europe,” arguing that economic growth is one of the top priorities endorsed by the European Commission and that the program privileges the commercial interests of industry stakeholders: There is a continuous emphasis on stimulating investment in the infrastructure, in the early period, in reference to promoting integrated broadband communication and, later, to underpinning the mobile Internet and new delivery channels and platforms.  (Mansell, 2014, p. 207)

She further argues that the social and citizen (demand side) aspects of digital technology policies have been underplayed in these policies: Social and demand-side issues made appearances but the principal goals remained consistent with supply-oriented strategies, commercial market priorities and competitiveness. These tensions reflect the contradictory values embedded in the scaffolding of information society policies. They are particularly visible in three areas: (i) trust and security; (ii) open information and copyright; and (iii) public service media.  (Mansell, 2014, p. 207)

These two areas come together in the actual use of systems by citizens and in their understandings of how they and their data are regulated. For example, Fogel and Nehmad (2009) provide an early example of the examination of risk taking, trust, and privacy concerns in relation to social media, focusing on the use of Facebook and Myspace by college students. Young people who had a social media profile were significantly more risk-taking in the data they shared than those who did not, and men tended to take more risks and to have fewer privacy concerns than women: As the Internet is not a private club, clearly those who are posting information about themselves on their social networking profiles are more comfortable with the possible risks of their information being seen by others. (Fogel & Nehmad, 2009, p. 159)

Such work points out that many aspects of regulation are relative to the social and cultural expectations of citizens. Data that contemporary users may be happy to share may have been considered too private or too risky to share by a prior generation. It may also be the case that users weigh the potential risks of data sharing against the benefits and services offered by platforms. Alternatively, repeated data breaches and their implications may make people more wary of what data they share. Even in this early work on social media, Fogel and Nehmad note:

As our data were collected in 2007 . . . it is possible that more people are aware today of the possible privacy concerns of disclosing a home address on a social networking profile.  (Fogel & Nehmad, 2009, p. 159)

When it comes to the sharing and privacy risks of data, most concerns arise around children’s use of digital technologies, as the next section discusses. Children’s use of digital media. Regulating children’s use of any new medium (from radio to TV to computer games) has always generated a strong research program. This is very much the case for the full variety of digital media and technologies. Such work often focuses on aspects of media regulation, but more in-depth work explores how government, schools, and parents look to manage and regulate digital media use by children. One of the key researchers in this area, Sonia Livingstone, early on examined the emerging research agenda in relation to children’s Internet use (2003). Among the problems in this initial research were treating children as a homogeneous category while rendering them marginal to “general” Internet use. As Livingstone argues, not enough consideration is given to what children and youth actually do and want from the Internet, which leads to a lack of proper understanding of how they use it. Primarily, she argues that children’s main interest in the Internet is the new opportunities it offers to communicate with their social networks; hence most of their contacts are local. In terms of education, there is little evidence on whether or how the Internet benefits children’s education. Livingstone notes that differences in digital media use between adults and children can also lead to challenges in regard to digital media use in the home:

She also points out that much of the early debate around digital technology use by children focuses on its educational value: Parents themselves seem conflicted about the value of computers, and in some ways they are also in conflict with their children. This conflict centers on the potential educational value of computers, on the use of so-called ‘educational’ applications, and on the grey area of ‘edutainment.’  (Livingstone, 2003, p. 153)

Livingstone and Bober (2006) explore the negotiations that parents conduct with their children in regard to the opportunities and risks that Internet use entails, as they seek meaningful engagement with it in their daily lives. Parents’ strategies are often concerned with practicalities (such as affordability), use (through discussion, sharing media time with children, and control), and cultural and cognitive issues relating to

the Internet. Children’s tactics, however, are often about resisting these strategies, especially since they sometimes have more expertise with new media than their parents. The authors take a child-centered approach, which includes in the research design an examination of adult-child divergence and taking into account the child’s concerns: At least two difficulties undermine parents’ attempts to regulate their children’s internet use. The first is that while parents are responsible for their children’s safety, they must also manage their children’s growing independence and rights to privacy, something that children themselves feel strongly about. The second is that, as parents and children agree, children are more often more expert on the internet than their parents.  (Livingstone & Bober, 2006, p. 103)

In the conclusion of their substantive work Kids Online, Livingstone and Haddon (2009) argue that research about children needs to be conducted with them, not on them. In addition, usage varies, influenced by socio-economic differences which correlate with education and regional and other sources of inequality. In terms of gender, they note that there has been a substantial increase in girls’ use of the Internet to the point of parity, although there are still some key differences in types of use linked to gender preferences and confidence in skills. As the Internet and digital media have developed, so has the assessment of the risks faced by children using the Internet. The authors identify more contemporary risks as being commercial exploitation, hate online, seeing violence and harmful content, seeing pornography, and self-harm. Yet the benefits from digital media use are also complex and not solely derived from educational uses: Two implications follow. First, the simple fact of using the internet may not mean that a child achieves their potential or gets the most from it, and further support and encouragement to progress or expand their activities may be required. Second, the fact that a child plays games online may not be, as worried adults are tempted to judge, a ‘waste of time,’ for this may represent a step towards further activities, one that is fun, gives confidence and develops skills. (Livingstone & Haddon, 2009, p. 5)

With the rise of social media and the linkage between social media use, popular media, and interpersonal interaction, concerns have shifted to issues of young adults and relationships. This work clearly overlaps with the Communication and Relationships domain (chapter 8). Within this, feminist scholars have pointed out the extent to which existing inequities and sexual double standards persist in, and may be reinforced by, the use of digital media. As an example, Ringrose et al. (2013) examine teens’ (aged 12–15) digital image exchange, commonly called “sexting,” focused mainly on the UK context. The researchers conducted interviews and focus groups and analyzed Facebook and BlackBerry posts. They argue that teen girls receive contradictory messages about the way to perform their gender; on the one hand they are supposed to perform and

produce forms of “sexy” self-display, and on the other hand they face legal, moral, and “slut shaming” consequences. The authors argue that we have to understand the underlying gendered discourses and power that enable a context where girls’ mediated body parts (e.g., images of breasts) are highly valued as commodities, where it is possible for such images to be traded like currency, which then constructs a situation where girls stand to ‘lose’ something (namely their sexual reputation) when images are shared, in ways similar to debates on female virginity.  (Ringrose et al., 2013, pp. 319–320)

Thus identities and social roles are shaped and changed, in part through their articulation via digital media and technologies. As Ringrose et al. note, since girls must negotiate moral discourses regarding their sexual reputation, and being attractive and wanted, when deciding whether to send images, this suggests a new norm of feminine desirability as mediated (though not determined) by the affordances of digital technology (Ringrose et al., 2013, p. 312). Regulation and governance of automated systems. A growing area of concern is the governance of automated systems—especially when understood as artificial intelligence (AI). An example from early in our data set is Quesenbery (2002), who provides an overview of electronic performance support systems (EPSS) such as intelligent agents, information visualization, search engines, and collaborative filtering. These agents have several properties: they are reactive, autonomous, goal-oriented, temporally continuous, communicative, adaptive, mobile, and flexible, and they demonstrate “character.” They help users by performing tasks autonomously, training or teaching users, helping users collaborate, or monitoring procedures. Importantly, Quesenbery points out that despite their benefits, agents have several drawbacks, particularly around trust, privacy, control, and personalization: Users must trust that the software will work properly, and help them do their jobs well and provide adequate support for those tasks, but they do not typically need to worry about whether it will work in ways that are contrary to their own interests. They may, however, not feel trusted by their employer if they are not sufficiently empowered to do their work by overly restrictive software or inadequate performance support.  (Quesenbery, 2002, p. 454)

This concern about trust runs through much of the literature on the social and organizational impacts of automation and was a major theme of our workshops that explored the impacts of automation and augmentation (see chapter 24). A key issue in the discussion of trust in AI and automated systems is the visibility of their underlying processes—the extent to which they are a “black box.” A critical factor in establishing trust is how transparent the agent’s action is—that is, how easily the user can see inside the transaction or independently validate the results. This transparency is just as important in establishing trust with a software agent as with a human vendor.  (Quesenbery, 2002, p. 455)

Ethics stands out as a key concern in this literature. Sharkey and Sharkey (2012) outline the development of robot applications for helping the elderly and their carers, monitoring their health, and providing them with companionship. As well as giving the elderly an increased sense of control and autonomy, robotic assistive technology could increase the social contact the elderly person experiences, by making it possible for them to get to and from social meeting places; again with likely improvements in their psychological welfare. (Sharkey and Sharkey, 2012, p. 31)

However, the authors raise six ethical concerns around this topic based on human rights and shared human values:

1. The reduction of human contact because of the robots, leading to neglect
2. Increased feelings of objectification and loss of control
3. Loss of privacy
4. Loss of personal liberty
5. Potential for deception and loss of dignity
6. The conditions under which elderly people should be able to control the robots

Importantly, Sharkey and Sharkey do not use these ethical concerns as an argument against the use of robots; rather, they emphasize that with a proper balance between benefits and ethical issues, robots could improve the quality of life of elderly people. They also point out that there is a need for governance and regulation, as the technologies are moving beyond current legislation. This issue of digital media and technologies moving beyond current governance provision, and the need to balance innovation and regulation, permeates many of the domains covered in this book: for example, social media platforms in the context of both interpersonal and political communication, data use by major platforms, or technologies in health care. In relation to health-care robotics, Sharkey and Sharkey note: Since the effect of robots on the lives of the elderly depends on the ways in which they are deployed, the development of guidelines about their use in care homes, and in their own homes, could help to guard against their misuse. At present, apart from fundamental human rights legislation, there is little protection for elderly people against the potential downsides of robot care.  (Sharkey & Sharkey, 2012, p. 37)

One area where the ethics, regulation, and trust of autonomous systems are most acute is their use for military and policing activities (related to the first topic of state use of digital systems). For example, Noorman and Johnson (2014) seek to unpack the “black box” discourse around the autonomy of military robots and to show how the question of responsibility changes once this framing is peeled back. Noorman and Johnson argue that more analysis of what is inside robots is needed in order to understand responsibility

issues surrounding military autonomous robots. They uncover the negotiations about the meaning of the autonomy of robots, and the desires and goals of the various social groups with an interest in the development of these technologies. Their argument is that humans still have influence, and hence responsibility, in autonomous robots:
1. Designers and developers outline the problem that these autonomous systems are supposed to solve and create their behavior.
2. Norms and rules still govern the behavior of autonomous systems.
3. These systems depend on people who develop verification and validation methods to ensure trust and confidence in such systems.
Technologies do not merely replace human beings; rather they complement and change human activity. They allow human actors to do things they could not do before, and as a result they shift roles and responsibilities and create new ones. So it is with robots. Tracing the distinctive ways in which robotic systems perform tasks is essential to understanding how tasks and responsibilities are created and distributed across the broader sociotechnical system.  (Noorman & Johnson, 2014, p. 55)

In this sense they are repeating the key argument of Actor Network Theory (see, for example, Latour & Woolgar, 1986), in which human agency is key at all stages, but the elements of systems that have agency are no longer just humans. Many aspects of the social system—be that a robot or a workplace device—become “black boxes” where the social and technical systems and infrastructures behind them are hidden from the user or citizen. Thus,

Theory, Method, and Approach As with the other review chapters, the following analysis builds on Borah (2017). It explores the extent to which theory and methods were employed in the reviewed papers. Most of the analysed papers (60%) were inductive, either describing findings or building theory, with 40% being deductive (testing existing theory). Only 24% of the papers undertook primary data collection, with 63% being discursive reviews of or reflective on existing research (Table 22.5). The main disciplines from which theory was used or for which theory was developed were sociology (52%), psychology (17%), communication and media (12%), politics (9%), economics (5%), and philosophy (3%). Only actual use of

620   Simeon J. Yates et al. Table 22.5  Empirical Approach   Theoretical (synthesis of current or prior work) Discursive/descriptive (no new data or theory) Primary empirical (data collected and analysed) Secondary empirical (analysis of existing data)

Percent 33.5 29.9 23.6 13.0

theory for the purposes of design or analysis were coded; general reference to prior work and theory were not coded. There was considerable variety in the specific theories applied from these disciplines. Theories of the information or networked society prevailed in the sociology discipline (12% of total) and theories of identity within psychology (3%). Of the theories explored, either empirically or discursively, it was those pertaining to the informational or network society that proved most popular. This was followed by those that examined privacy, the public/private sphere, or political economy. Surprisingly little attention appears to have been paid to exploring issues of trust between government and the governed, public participation in the government decision-making process or, indeed uses of technology to improve the governance of our communities. For those studies that undertook empirical research, the main research methods were literature reviews (38%), surveys (26%), and interviews (16%), with 61% qualitative and 39% quantitative (Tables  22.6 and  22.7), with none of the work using a “big data” approach. The majority of the empirical work focused on specific groups, with a limited number of general population studies (Table 22.8). Unfortunately, there is little account (as of yet) of how government, at either the national or local level, has managed and responded to the social media and big data revolutions. It is a surprising omission given the recent emphasis on the centrality of government, particularly local government, to implementing aspects of “digital society” from the “smart city agenda,” through “digital by default government services” to industrial strategies focused on AI and automation. This may be a feature of the selected literature, as we have focused primarily on social research, and much of the recent work on the technologies of digital government has been undertaken within the Information and Computer Science disciplines. That literature reports on designing and building technical systems, with some social science input rather than with much examination of their actual social impacts. There was limited discussion of how technology might be used to “foster a civic wellbeing.” This would fit with arguments made in the ESRC project’s Digital Leaders Salons and review events with stakeholders, where a “public value” orientation for the administration of public services in place of the current was put forward. This was contrasted with current digital government approaches that are formulated much more around systems efficiency and cost savings—referred to by some as the New Public Management paradigm, as in the United Kingdom. It was argued that a public value governance approach to service delivery and regulation is more congruent with the information and communication affordances of digital technology, particularly those associated with the

ESCR Review: Governance and Security   621 Table 22.6  Research Methods  

Percent

Literature Review (general or narrative) Survey Interview(s) Content analysis Focus groups Ethnography Textual (linguistic-discourse analysis) Experiment Other

38.0 26.7 16.0 6.7 4.0 4.0 2.0 1.3 1.3

Table 22.7  Analytic Approach  

Percent

Qualitative (textual-non-discourse) Statistical (numerical) Discourse (textual-linguistic-discourse)

60.49 39.0 .7

Table 22.8  Study Population  

Percent

Case study(ies) General population Specific group

16.4 16.4 67.3

advent of social media. As such it may be more likely to usher in a smart governance process that can integrate and take advantage of the local democratic and economic opportunities long associated with digital media, but which local government has hitherto failed to grasp. However, these emergent ideas do rest upon a number of assumptions, not least of which that there is a favorable local governance environment capable of sustaining this approach, that have themselves received little empirical investigation.

Delphi Review The following sections detail the results of the Delphi process for the Governance and Security domain, covering suggested scoping or research questions, key topics to address within these questions, and key challenges to researching these questions.

622   Simeon J. Yates et al.

Future Research and Scoping Questions The Delphi review identified a set of scoping questions for the domain; these were coded into the eight categories detailed in Table 22.9, while Table 22.10 presents their ranked importance from the confirmatory survey. As will be discussed later (chapter 25), there are a number of areas identified in the scoping question and challenges analysis that cross-cut all the domains, a key one being ethics. Table 22.9  Delphi Review Scoping Questions Question category

Example questions

Privacy and access to work of government and public bodies

How do we manage privacy in the age of WikiLeaks? Can any email or digital communication be considered private or should all Government officials, including University Professors, assume their email is open for the public to read?

Fake news

How do we separate fact from fiction? Once claim being made in the current US Electoral campaign is that WikiLeaks and other hackers are trying to influence the US election by not only revealing but also manipulating the information they leak. How does the public know that leaked information is accurate?

Accountability for digital systems and their impacts

In addition to regulatory oversight, how do we encourage organisations, especially companies, to recognise and accept responsibility and accountability for their actions?

Transnational governance of digital economy

How do we go about making rules in the digital economy? It may be worthwhile to explore how the TPP (let’s call it TPP2) might be negotiated using processes for the digital economy.

Algorithms and the law

What are the risks to modern norms and practices of law as more and more of our interactions and data are defined by algorithms we do not understand or have access to, as well as by monetization processes—as these and related phenomena undermine basic conceptions of transparency, agency, autonomy, respect for the human person, etc.?

Human factors in cyber security

On security, it’s been said that the weakest link in security is the human element. Yet, a lot of the work seems to be in the technical/technological area. What can be done to improve the human element in security?

Ethics

How will ethics—especially the virtue ethics question of what is the good life, the good life worth living, both individually and collectively—proceed as our technological future becomes ever less predictable as it simultaneously threatens all but unthinkable outcomes? (See Vallor, 2016).

Agency and autonomy in digital age

What will happen to our sense of human identity, agency, and capacities for intimate relationships, ranging from friendship through long-term relationships and parenting as AIs and social robots become increasingly human-like, thereby calling into question core notions of agency and autonomy, affection and love, etc. (Cf. the Foundation for Responsible Robotics, https://responsiblerobotics.org/, for a much more extensive list of questions.)

Table 22.10  Delphi Review Scoping Questions Ranked by Importance

  Question category                                               Percent
  Accountability for digital systems and their impacts              16.7
  Algorithms and the law                                            16.7
  Human factors in cyber security                                   16.7
  Ethics                                                            16.7
  Fake news                                                         11.1
  Agency and autonomy in digital age                                11.1
  Privacy and access to work of government and public bodies         5.6
  Transnational governance of digital economy                         5.6

The consultation workshop identified a set of scoping areas that added to those from the Delphi process, namely:

• Understanding government levels (international, national, regional, local)
• The role of key decision makers within government
• Cultural differences in digital governance
• Legislation behind users, uses, and technology developments
• Public services—especially how surveillance is now "normal"
• A need for a focus on policy instead of starting with the technology
• What is digital technologies' role in governance in very different socio-economic areas?

We note that these issues are far more focused on aspects of governance than on the more personal issues of trust and accountability in the original scoping questions. The topics identified in the Delphi review were coded into ten categories, shown in Table 22.11, with their ranked importance from the confirmatory survey in Table 22.12.

Table 22.11  Key Topics Ranked by Percent of Delphi Survey Responses

  Topic                              Percent
  Cyber security                        37
  Governance of digital economy         11
  Government digitization               11
  Privacy                               11
  Education                              5
  Ethics                                 5
  Legal issues                           5
  Methods                                5
  Political communication                5
  Transnational governance               5

Table 22.12  Key Topics Ranked by Importance from Delphi Survey (percent of responses)

  Topic                            Very important   Important   Neutral   Unimportant   Very unimportant
  Privacy                               83.3           16.7        0.0        0.0             0.0
  Cyber security                        66.7           33.3        0.0        0.0             0.0
  Governance of digital economy         33.3           50.0       16.7        0.0             0.0
  Government digitization               16.7           50.0       33.3        0.0             0.0

The consultation workshop highlighted the following additional topics or modifications to topics:

• Access to data (who owns it?) and how "re-combining data" needs to be included in privacy issues
• Legislation governing the digital economy
• Citizens' attitudes toward digital technology governance and links with actual behavior
• Cyber-security needing to be broadened out to be more relevant to people and society (what exactly are the dangers?)

Research Challenges

The challenges in undertaking research in this area identified by the Delphi panel were grouped into eight categories. Table 22.13 lists these, ranked by the number of coded items, with those deemed by the consultation workshop to be domain specific marked in bold. Table 22.14 shows their ranking by the confirmation survey. The consultation workshop highlighted a set of challenges not covered in the Delphi returns, including:

• Governance based on values, culture, beliefs, and evidence
• Future proofing governance for the digital age
• Big data
• Reconstituting labor contracts
• Being people centric, not technology driven
• Understanding how people benefit—governance that achieves the best trade-off between human need and economic need

Table 22.13  Challenges Ranked by Percent of Cases

  Challenge                       Percent
  Ethics                             31
  Big data and analytics             23
  Cross-cultural engagement           8
  Cybersecurity                       8
  Digital divide                      8
  Disruptive change                   8
  Governance                          8
  Transnational governance            8

Note: Those considered as specific to the Governance and Security domain in bold.

Table 22.14  Challenges Ranked by Importance from Delphi Survey (percent of responses)

  Challenge                                                           Very important   Important   Neutral   Unimportant   Very unimportant
  Big data and analytics - both methods and use by government              66.7           16.7       16.7        0.0             0.0
  Detecting cyber attacks                                                   50.0           33.3       16.7        0.0             0.0
  Ethics for digital research                                               16.7           66.7       16.7        0.0             0.0
  Transnational governance of digital economy                               16.7           66.7        0.0       16.7             0.0
  Understanding disruptive change                                           16.7           50.0       33.3        0.0             0.0
  Understanding digital divides                                              0.0           66.7       33.3        0.0             0.0
  Understanding cross-cultural engagement via digital technologies           0.0           33.3       66.7        0.0             0.0

Given the considerable breadth of ideas in the Delphi responses and the consultation workshop, we looked to combine this broad range of material with that from the literature section to provide a clearer picture of future research challenges. The next section undertakes this reflection.


Conclusion

As with the Organizations and Digital Technologies domain (chapter 11), the lower response rates from the Delphi phase limit some of the confidence in the results. Also, it is clear from the Delphi responses that the identified key literature presents a broader brief than that in the initial ESRC scoping questions. Finally, there are two areas identified by the research that are important, but which may already have substantive ongoing activity, namely cybersecurity and children's use of digital media. Both of these are clearly mature research areas with substantive empirical research behind them. We would argue that future work in these areas should target specific issues, potentially where they intersect with cross-cutting themes (see chapter 25), such as inequalities and divides in children's digital lives, or digital literacies and cyber security. The following four potentially overlapping areas need further work, especially as there appears to be less empirical work in these areas:

1. Impact of social media on governance
2. Success factors in digital governance at local, national, and international levels
3. Privacy, citizenship, the state, and surveillance in the digital age
4. Regulation and governance of automated systems

Having said that, these topics, and the majority of the questions and topics identified in the Delphi and workshop discussions, cross over with the other domains. We would note that they in particular cut across the Citizenship and Politics (chapter 16) and Data and Representation (chapter 18) domains. It is also the case that the challenges identified within this domain all fall within the cross-cutting issues to be discussed in chapter 25. Governance and security of ourselves as citizens, of our data, and of the systems we use is clearly not just a technical issue. It is core to how we govern, manage, and regulate our digital society. It therefore needs to be underpinned by our assessment of and debates around ethics and rights. Social research on the impacts of digital media and technologies needs to provide a strong empirical basis for these debates.

Note

1. As part of the review, The Digital Humanities Institute at the University of Sheffield applied concept modelling techniques to a curated corpus of 1,900 journal articles from the period 1968 to 2017. Concept modelling is a computational linguistic process that involves identifying the emergence of concepts, or key ideas, via lexical relationships. For the purposes of the review, lexical relationships were limited to high frequency co-occurrences of terms as pairs and trios. The process is entirely data driven and resulted in 2 million rows of data. The website https://www.dhi.ac.uk/waysofbeingdigital/ provides access to the top 50 most frequently occurring pairs and trios through a series of data visualizations. Click on View Data Visualisations at the top. Then check/submit which of the seven ESRC domains you are interested in (including all). Then choose the visualization. These show configurations across selected time frames. Choose bubble chart, tree map, zoomable pack layout, or network diagram, by individual subject or by all seven subjects combined, by document or concept frequency. You can similarly search the analyzed documents (all, by subject, author, concept, concept trio, and year) by clicking on Browse Articles at the top. Also see https://waysofbeingdigital.com/literature-analysis-interactive-results/ for interactive visualizations with mouse-overs of the main clusters of concepts within each domain, and the relative frequency of concepts associated with each cluster.
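The pair-and-trio co-occurrence counting that underlies this kind of concept modelling can be illustrated with a minimal sketch. The corpus, tokenisation, stopword list, and frequency threshold below are placeholder assumptions for illustration only; this is not the Digital Humanities Institute's actual pipeline.

```python
# Minimal sketch: count how often pairs and trios of terms co-occur in the
# same document, then keep only the "high frequency" ones. Corpus and
# thresholds are toy assumptions, not the DHI's real data or settings.
from collections import Counter
from itertools import combinations

corpus = [
    "digital governance and trust in smart city projects",
    "privacy and trust in digital governance",
    "smart city sensors and data privacy",
]
stopwords = {"and", "in", "the", "of"}  # toy stopword list

pair_counts: Counter = Counter()
trio_counts: Counter = Counter()

for doc in corpus:
    # Unique, sorted terms per document so each pair/trio is counted once.
    terms = sorted({t for t in doc.lower().split() if t not in stopwords})
    pair_counts.update(combinations(terms, 2))
    trio_counts.update(combinations(terms, 3))

# Keep only co-occurrences seen more than once (placeholder threshold).
frequent_pairs = {p: n for p, n in pair_counts.items() if n > 1}
print(frequent_pairs)  # e.g. {('digital', 'governance'): 2, ('city', 'smart'): 2, ...}
```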

References

Borah, P. (2017). Emerging communication technology research: Theoretical and methodological variables in the last 16 years and future directions. New Media & Society, 19(4), 616–636.
Brown, I., & Korff, D. (2009). Terrorism and the proportionality of internet surveillance. European Journal of Criminology, 6(2), 119–134.
Elahi, S. (2009). Privacy and consent in the digital era. Information Security Technical Report, 14(3), 113–118.
Fogel, J., & Nehmad, E. (2009). Internet social network communities: Risk taking, trust, and privacy concerns. Computers in Human Behavior, 25(1), 153–160.
Latour, B., & Woolgar, S. (1986). Laboratory life: The construction of scientific facts. Princeton, NJ: Princeton University Press.
Livingstone, S. (2003). Children's use of the internet: Reflections on the emerging research agenda. New Media & Society, 5(2), 147–166.
Livingstone, S., & Bober, M. (2006). Regulating the internet at home: Contrasting the perspectives of children and parents. In D. Buckingham & R. Willett (Eds.), Digital generations: Children, young people, and new media (pp. 93–113). Mahwah, NJ: Lawrence Erlbaum.
Livingstone, S., & Haddon, L. (Eds.). (2009). Kids online: Opportunities and risks for children. Bristol, UK: Policy Press.
Lyon, D. (2010). Surveillance, power and everyday life. In P. Kalantzis-Cope & K. Gherab-Martin (Eds.), Emerging digital spaces in contemporary society (pp. 107–120). London, UK: Palgrave Macmillan.
Mansell, R. (2014). Here comes the revolution—the European digital agenda. In K. Donders & C. Pauwels (Eds.), The Palgrave handbook of European media policy (pp. 202–217). London, UK: Palgrave Macmillan.
Noorman, M., & Johnson, D. G. (2014). Negotiating autonomy and responsibility in military robots. Ethics and Information Technology, 16(1), 51–62.
Pollach, I. (2005). A typology of communicative strategies in online privacy policies: Ethics, power and informed consent. Journal of Business Ethics, 62(3), 221.
Quesenbery, W. (2002). Who is in control? The logic underlying the intelligent technologies used in performance support. Technical Communication, 49(4), 449–457.
Ringrose, J., Harvey, L., Gill, R., & Livingstone, S. (2013). Teen girls, sexual double standards and "sexting": Gendered value in digital image exchange. Feminist Theory, 14(3), 305–323.
Sharkey, A., & Sharkey, N. (2012). Granny and the robots: Ethical issues in robot care for the elderly. Ethics and Information Technology, 14(1), 27–40.
Vallor, S. (2016). Technology and the virtues. Oxford, UK: Oxford University Press.

Chapter 23

Governance and Accountability in Internet of Things (IoT) Networks

Naomi Jacobs, Peter Edwards, Caitlin D. Cottrill, and Karen Salt

Introduction

As digital technology becomes smaller, and processing power greater, there are opportunities to integrate it more fully into our environment and society. We are coming closer to achieving Mark Weiser's (1991) concept of ubiquitous computing, where objects in our environments are able to collect, store, and transmit data, integrating to the point that they "disappear" and are no longer noticed as distinct from the general surroundings. While this brings potential societal benefits, it also introduces novel challenges in terms of governance. In the late 1990s, Kevin Ashton coined the term "Internet of Things" (IoT) to describe supply chain technology that could collect and share data without direct human intervention (Ashton, 2009). This term is now used more extensively, and covers a wide range of spatially distributed sensors and devices that collect and share data. This concept is now driving a major technological shift and is affecting numerous aspects of society. Projections suggest there may be 20 billion connected devices by 2020 (Gartner, 2017), and this level of deployment has profound economic implications. The Organization for Economic Co-operation and Development mentions "industrial and commercial processes, consumer and home services, energy, transport systems, health care, infotainment and public services" as key areas that are likely to be affected (OECD, 2015, p. 240), with estimates for added global economic value by 2020 ranging from US$1.9 to $14.4

Governance and Accountability in Internet of THINGS   629 trillion (Walport, 2014). Such broad-scale implementation and impact raises questions of governance, defined in the next section. As these connected technologies become critical parts of society and infrastructure, significant attention has been focused by policymakers at the international, national, and city level on how they should be regulated and managed. The Internet of Things introduces new complexities of governance which are distinct from general digital governance, given levels of information sharing involved, multiple actors and stakeholders, questions of accountability when issues arise, and distribution in physical space. Media attention has frequently focused on the IoT at the level of objects and technologies available commercially for private use, often in the household setting; for example, smart fridges that know when you need more milk and purchase it for you (Hammersley, 2013; Walker, 2018; Molloy, 2018). Existing regulations and policy often focus on new consumer products delivered by technology innovation (with products such as “wearables,” “smart homes” and other consumer applications acting as a driver for industry).1 However, such regulations also cover large public-sector infrastructure projects using widely deployed IoT sensor networks, which are becoming more common in civic settings. Projects such as these are often framed as part of the notion of the “smart city.” “Smart city” is used extensively when discussing the use of technology to strategically enhance civic infrastructure (such as traffic and energy management); however, many have expressed concern that it is not a useful term. Angelidou (2014) extends this criticism by noting that there remains no agreed definition of smart (or intelligent) cities. While not all so-called smart city solutions involve IoT technology, and not all IoT technologies at the city level are necessarily part of smart city initiatives, there is a significant crossover between public implementations of IoT and the smart city rhetoric (e.g., Future Cities Catapult, 2017; Hill et al., 2016). This may be shaping discussion of IoT at the national and international level as a significant amount of funding focusses narrowly on smart city programming rather than on the more general concept of the IoT and the potential it has for system-wide change (Gunashekar et al., 2016, p. 40). This lack of differentiation could be an issue if other areas are neglected, and critical reflection is sidelined in the rush to be included in those gaining from the widely lauded benefits of IoT. Governance for diverse utilizations will be increasingly challenging, as already seen in live projects such as the Chicago Array of Things (discussed later in this chapter). Gaps in governance will remain without a consideration of responsibility, accountability, and what technology can actually achieve. Questions of accountability and governance in relation to the IoT are central to the work of the “TrustLens” project: Trusted Things & Communities: Understanding & Enabling a Trusted IoT Ecosystem. This project, funded by the United Kingdom’s Engineering and Physical Sciences Research Council (EPSRC), asks questions such as: What are the appropriate governance arrangements covering IoT deployments? How do we deliver meaningful accountability? How can we develop an understanding of the interplay between individuals and devices, and the wider relationship to social/cultural norms? 
Work carried out as part of this project to understand the existing literature on governance in IoT deployments has informed this review, which is not a comprehensive

systematic review but instead a deep foray into governance in this context that we hope may spark additional research and further work. This is, ultimately, a societal issue—not just one for technologists and sociologists to debate. To this end, this chapter will explore the literature on and current examples of governance structures in IoT systems. We will focus on public deployments rather than individual consumer purchases, as this is a potentially more complex and understudied landscape. While private (household) and public (shared) deployments may entail different data storage and privacy implications, and thus require different or complementary governance models, this review will have applicability for all IoT systems. A focus on public deployments includes the potential impact for a wide range of citizens above and beyond those who choose to purchase IoT technologies for household or individual use. This is particularly the case because in these circumstances many public or community users have limited agency and choice over their interaction with these technologies, as we shall see. The review builds on existing literature and case studies to construct a framework of principles for IoT governance, highlighting emerging and remaining questions about governance. The case studies are not chosen to be representative examples, but, instead, represent diverse scenarios, deployments, and policies that can be aligned with the framework, drawing commonalities between theory and practice. Because the proliferation of devices outpaces the literature, by including these we aim to encourage scholars to include an analysis of active deployments in order to narrow this distance.

Principles of Internet of Things Governance

Considering Governance

Governance is a complex topic with a variety of definitions. UNESCO, for example, defines it as "the exercise of political, economic and administrative authority in the management of a country's affairs, including citizens' articulation of their interests and exercise of their legal rights and obligations" (Furness, 2012, p. 204). Weber, one of the relatively few to have written about governance specifically in the context of the IoT, describes it as "the design of institutions and the structure of authority to allocate resources and coordinate or control activities in the society" (Weber, 2013, p. 341). As discussed previously, realization of IoT projects has far-reaching economic and social implications. These can be complex and involve many actors; for example, individuals, communities, service providers, and others. In this multi-actor context, governance requires careful consideration. Different governance regimes may be applicable for different circumstances, and change accordingly, requiring flexible IoT governance. Maheswaran and Misra (2015) suggest the following four orientations to considering

flexible IoT governance challenges: space-, time-, device-, and data-oriented governance, which provide a framework for this variation. IoT technologies can be used to promote social good, foster learning, and promote justice, but may also be used to exert control and exacerbate inequality; therefore, we have a duty to not only consider "what is possible, but what is responsible" (Surman & Thorne, 2016, p. 3). This question of exerting control speaks to the balance or imbalance of power that technology can foster. In examining literature pertaining to governance of IoT, we have identified four emergent themes and principles that should inform any discussion of governance structures in these systems. These are as follows: Levels of Governance (describing multilevel operation of governance), Legitimacy and Representation (describing how citizens are represented in governance models), Accountability (describing obligations to justify actions), and Transparency (describing visibility of processes and procedures). We will review key literature which foregrounds these topics in relation to IoT governance and will explain their significance. While these themes emerged as a consequence of reviewing the literature, three out of the four share commonalities with the principles that Weber (2013) identifies as key for IoT governance, which are: legitimacy and representation, transparency, accountability, and infrastructure governance. There is also some overlap evident with the four principles listed by Almeida, Doneda, and Monteiro (2015) that they suggest should be adopted for data protection: notice and choice, data minimization, access to personal data, and accountability.

Levels of Governance There already exist many national and international bodies devoted to the governance of the internet, and there has been some discussion as to whether IoT governance at the highest level is encompassed by that, or needs separate regulatory and policy oversight (Smith, 2012). High level top-down governance, at an international level, would depend on the agreement of consensus rules and policies. Weber (2016) suggests that based on the failure of the International Telecommunications Union (ITU) to reach consensus on such rules for the internet, it is unlikely that such implementation of a top-down approach would be possible for the IoT. This is especially the case given the broad proliferation of devices and implementations. Top-down governance does not, however, have to be globally applied, and can exist at several different levels using different tools, as illustrated by Smith (2012), see Figure 23.1. Weber (2013) also emphasizes that tools for governance must encompass requirements at different levels including grass-roots, national, and city governance. Reviewing the scope of IoT projects, we also find it important to consider governance at different geographic and socio-cultural scales. There may be governance implications for IoT initiatives even below the city level. There may be small-scale local projects, tailored to particular communities. An example of this is the “damp buster” frog-shaped humidity sensors deployed by the REPLICATE project in Bristol (The Bristol Approach, 2018).

Figure 23.1  Governance tools and their application at different levels of IoT activity (Smith, 2012). [The figure maps governance tools (co-operation, policy co-ordination, standards, regulations, and law) against the scope of governance, from city, regional, and state through transnational to global, for a given domain or area of IoT activity.]

Projects may be initiated via bottom-up rather than top-down development, outside of the control or knowledge of city officials (Jiang et al., 2016). Citizen developed grassroots initiatives nevertheless have governance requirements. Earlier, we distinguished between individual IoT enabled objects that function in private spaces, and larger linked deployments which may be in public spaces. These latter IoT deployments may consist of individual citizen action, but gain their power from data collected from a large number of distributed devices such as air quality sensors (Chen et al., 2016). This collective data gathering is of common interest, thus models of equitable sharing and collaboration are necessary for wider impact. Grass-roots governance models can vary. For example, deployments might be examples of “citizen science,” where members of the public actively engage in scientific projects addressing real-world problems (Angelidou & Psaltoglou,  2017). These may be initiated by research groups and cultural institutions, arise from concerns of local groups, or be a consequence of publically shared data generated as an integral feature of crowdsourced sensing devices. Balestrini, Diez, Marshall, Gluhak, and Rogers (2015) make a distinction between citizen science and participatory sensing, noting that in the latter case users can appropriate IoT devices for their own situated purposes, not necessarily responding to the specific needs of a community or scientific endeavor. They suggest that crowdfunded participatory sensing initiatives can be less effective for successful use of IoT technologies as community tools unless other support features are implemented, such as local champions, peer-learning (which can strengthen social interactions as well as providing skills), and reward mechanisms. There may also be interactions between different governance strategies both at different levels, and when multiple authorities within an IoT ecosystem each have their own models. The existence of multiple platforms may impact citizens’ rights in ways that may

not be evident at the level of the individual deployment; for example, numerous studies have revealed the potential to re-identify "anonymous" individuals via the use of linked data sets (Douriez et al., 2016; Narayanan & Shmatikov, 2008; Sweeney, 2002). Given distributed data collection and ownership rights in the IoT, the potential for such re-identification of individuals due to inconsistent or conflicting data policies is likely to grow.
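The re-identification risk noted above can be made concrete with a minimal sketch: two releases that are each "anonymous" in isolation are joined on shared quasi-identifiers. The records, field names, and datasets below are invented for illustration and are not drawn from the cited studies; in practice the auxiliary data are far richer, but the joining logic is essentially the same.

```python
# Minimal sketch: linking a de-identified release to a public register via
# quasi-identifiers (postcode, birth year, gender). All data are fictitious.
health_release = [  # names removed before publication
    {"postcode": "AB24", "birth_year": 1985, "gender": "F", "diagnosis": "asthma"},
    {"postcode": "AB10", "birth_year": 1990, "gender": "M", "diagnosis": "diabetes"},
]
public_register = [  # e.g. a register with names attached
    {"name": "J. Smith", "postcode": "AB24", "birth_year": 1985, "gender": "F"},
    {"name": "K. Jones", "postcode": "AB11", "birth_year": 1972, "gender": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "gender")

def key(record):
    """Project a record onto the shared quasi-identifiers."""
    return tuple(record[field] for field in QUASI_IDENTIFIERS)

register_index = {key(person): person["name"] for person in public_register}

for record in health_release:
    name = register_index.get(key(record))
    if name is not None:
        print(f"{name} is linked to diagnosis: {record['diagnosis']}")
# -> J. Smith is linked to diagnosis: asthma
```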

Legitimacy and Representation Representation of different stakeholders at different levels is a critical component of the ideals of democratization. Weber (2013) argues that an IoT that was within the power of a specific private or public authority would not comply with the principle of legitimacy being granted by democratic participation, “since the outcome should reflect the values of the stakeholders represented” (p. 345). Citizens’ rights must be protected, especially given that there are serious implications of these technologies in areas such as privacy. Maheswaran and Misra (2015) suggest that governance schemes that are beneficial for all stakeholders and therefore likely to see good compliance are ones where there is effective information flow between the user and corporate network operators. How much control citizens have in various governance models is also critical. A traditional top-down high level governance model in which a designated authority implements a governance plan may appear to have limited scope for direct citizen engagement, such as the development of Songdo, South Korea (see the case study under that subheading). Citizen governance may be introduced later, such as through the invitation of representatives to sit within a consultation group (as with the Chicago Array of Things in the United States: see under the Chicago, United States subheading); however, there may still be limited control. Similarly, the level of authority available to citizens is variable within bottom-up participatory governance models even if citizen involvement appears to be more direct. Balestrini, Diez, Marshall, Gluhak, and Rogers (2015), in discussing crowdfunded IoT initiatives, note that the ability for individuals to perceive they are making a meaningful contribution is important. Grass-roots, bottom-up efforts may find it difficult to effect change because of resource, infrastructural, or technological limitations. This does not mean that effective participatory governance is not possible, just that it is not always a consequence of increasing participation. Current projects underway to examine such systems of gov­ ern­ance include the Jam & Justice project, which is mapping instances of participatory urban governance. By crowdsourcing examples of initiatives including active citizen participation, they are building a collection which aims to offer insights to understand the range of practical efforts to achieve more inclusive governance, not only listing examples but “exploring what makes them exciting, what they are aiming to achieve, what makes them effective, and how they might contribute to more just outcomes.”2 Open systems which allow control by people directly affected by the implementation of technology may provide flexibility, but this openness may depend on factors of accountability and transparency.


Accountability

Accountability is a theme that much of the literature highlights as particularly important given the implications for privacy of the data gathering and monitoring capabilities of many IoT systems. Weber (2011) acknowledges the complexity of the concept of accountability in different contexts, but argues that "a general definition incorporating the main elements of accountability is directed to the obligation of a person (the accountable) to another person (the accountee), according to which the former must give account of, explain and justify his actions or decisions in an appropriate way" (p. 134). This definition presupposes that both the accountable and accountee are known, and that the order of operations that takes place is understood by both parties. The potential for these assumptions to be violated may often cast doubt upon the ability to ensure accountability in the IoT—a situation that requires further exploration of the concept. Weber and Weber (2010) suggest that placing IoT systems wholly within the power of a specific private or public authority risks decreasing legitimacy and democratic participation. In order to mitigate these risks, they suggest systems should be designed to promote accountability by ensuring that formal requirements about how rules are made, interpreted, and applied are inherent to the framework of the system. The legitimacy of organizations is also heightened when all stakeholders concerned with the IoT are included, generally by some form of reasonable representation, and thus Weber and Weber suggest multipolar and decentralized policy within an institutional setting. Almeida, Doneda, and Monteiro (2015) point out that boundaries between IoT and surveillance may blur or vanish if initiatives to protect privacy are not put in place. They include accountability as one of the four principles on which rules and norms must be built to ensure this (the others being: notice and choice, data minimization, and access to personal data). Weber (2011, p. 134) identifies key elements of accountability serving as a basic guideline for what must be included when establishing a legal framework for accountability measures for governing bodies: standards that hold governing bodies accountable at the organizational level; readily available information, including consultation procedures enabling active rather than passive information flow to concerned recipients; and the ability for beneficiaries of accountability to impose sanctions if standards are not met, which requires adequate participation schemes through direct voting channels and indirect representation schemes.

Transparency

Much of the literature related to IoT governance highlights transparency as a critical aspect. For stakeholders to be able to follow up on governance actions, they need to know what actions exist or are possible, and understand the details. If procedures are transparent, citizens have the capacity for active involvement and a certain level of control over the decision-making process, allowing for "a certain level of 'democratic' legitimization and predictability" (Weber & Weber, 2010, p. 75). This may be of particular

importance given that in the IoT there may be private actors involved in deployments, and private entities responsible for governance of some aspects. Regulation to enforce transparency must therefore also extend to rules governing manufacturers, which, given the global marketplace, may require a global view being applied to the governance process (Almeida, Doneda, & Monteiro, 2015). This includes transparency in the data collection process, so that data owners can understand what data is being collected (or has already been collected), where it goes, and who has access. This is related to accountability and protection of rights. In the case of the IoT, Weber and Weber (2010) identify several different categories of transparency. Transparency in this context must include both the functioning of the governance system and consequences of actions within it. For this reason, they include the category of procedural transparency, which covers the disclosure to the public of rules and procedures in the operation of organizations, and the necessity of governance and lawmaking being made both accessible and comprehensible. Other categories of transparency identified are decision-making transparency, which allows scrutiny of the decision-making purpose and the reasons decisions were made, and substantive transparency, in which rules are established to avoid arbitrary or discriminative decisions, and may include requirements of rationality and fairness. Weber and Weber also note that transparency can operate in different "directions"—superiors might have insight into the actions of subordinates (transparency upwards), and vice versa (transparency downwards), as well as transparency between those inside and outside organizations in both directions. Maheswaran and Misra (2015) suggest that transparency is important so that users can understand and have input into policies and their selection, and can understand which policies are operating in different spaces (to allow navigation between them). To enable this, they propose a novel management framework for the IoT called Social Governance. The literature indicates that transparency should be in place throughout the lifetime of projects, and be adaptive, allowing consultation by users during both the preparation and launch of projects through feedback mechanisms. In addition, final decisions of governing bodies should be published alongside the decision-making process and considerations that led to them (Weber & Weber, 2010). This, it is suggested, will "allow the participants in the process to understand how their insights and expertise have influenced the policy outcomes" (p. 85).
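One hypothetical way to make this kind of transparency about data collection concrete is a machine-readable notice published alongside a deployment, stating what is collected, for what purpose, with whom it is shared, and for how long it is kept. The schema, field names, and values below are illustrative assumptions only; they are not a standard proposed in the literature reviewed here.

```python
# Purely illustrative sketch of a published "data collection notice" for an
# IoT deployment. Field names and values are hypothetical placeholders.
from dataclasses import dataclass, asdict
import json

@dataclass
class DataCollectionNotice:
    deployment: str
    data_collected: list
    purpose: str
    data_controller: str
    shared_with: list
    retention_days: int
    opt_out_contact: str

notice = DataCollectionNotice(
    deployment="High Street air quality sensors",
    data_collected=["NO2 (ppb)", "PM2.5 (ug/m3)", "timestamp", "sensor location"],
    purpose="Monitoring street-level air quality",
    data_controller="City Council Environmental Services",
    shared_with=["open data portal (aggregated hourly)"],
    retention_days=365,
    opt_out_contact="dataprotection@example.org",
)

# Published alongside the deployment, e.g. as JSON at a well-known URL.
print(json.dumps(asdict(notice), indent=2))
```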

Use of Themes

The literature overview we have outlined has identified four themes, which provide an initial exploratory framework that might be used when considering the principles of governance models in place for public IoT deployments. Ideally, governance models should exhibit legitimacy and representation, accountability, and transparency, and operate at a suitable level or levels for the deployment in question, which may span several layers of governance.

Table 23.1  Key Emergent Themes and the Case Studies to Which They Particularly Relate

Emergent governance theme and focus case study:

• Level at which governance model functions: Comparison of top-down national policies and local implementations
• Citizen representation and agency: Chicago, United States
• Accountability of governance: Songdo, South Korea
• Transparency of governance: New York, United States

With these principles of governance in mind, the following section of this review will explore in further detail some case studies of operational governance models with the aim of understanding the current landscape and how these themes are or are not met. Although the themes have relevance to all case studies, first the importance of considering the level of governance is uncovered by examining different national and regional examples, and then each theme is illuminated through three city case studies. Table 23.1 summarizes the key theme for each case study.

Case Studies: Regional/National IoT Governance

National and regional governmental bodies are increasingly acknowledging the importance of implementing policies to manage IoT governance, reflected in frameworks developed by a range of bodies. Looking at the similarities and differences between some of these approaches provides insight into the origins and current state of policy and regulation. We present a selection of case studies, while noting that these are examples rather than a comprehensive list. Each emergent governance theme highlighted previously within the literature review will be presented alongside a relevant case study.

Top-down IoT Governance in the European Union

The European Commission was the first supranational body to attempt to create an IoT governance framework (Weber, 2013). The European Union has actively included IoT policies in strategies such as the Digital Agenda for Europe 2020, and has promoted IoT research and innovation (OECD, 2015). In 2013, the European Commission released a report containing the results of a public consultation on the governance of the IoT which had been carried out in 2012, consisting

of more than 600 responses to an online questionnaire including stakeholders from academia and industry, and members of the public (European Commission, 2013). In this report, there appears to have been a contrast between the views of those representing industry, and citizens and consumer organizations, particularly with regard to privacy. The former suggested that the current Data Protection framework was sufficient, while the latter desired greater focus on privacy and data protection, with emphasis on control of data remaining with data subjects. Industry representatives were cautious of the stifling effect which inappropriate governance might have on a fledgling industry, in contrast to non-industry respondents who stressed the importance of ensuring fundamental rights are maintained. In several points of the survey it was emphasized that governance models should preserve these rights, giving people the ability to choose whether to be part of an IoT system, the right to "disconnect" at any time and the ability to ensure that their data is not used without explicit consent, so that they would have control of their data at all times. Despite the fact that a bottom-up multi-stakeholder approach is suggested for defining the ethical framework, governance was considered by many respondents to be necessary beyond self-regulation, with calls for independent public-sector organizations to provide regulatory oversight and governance including regular audits (European Commission, 2013, p. 9).

In January 2015, a report produced by the CEN-CENELEC-ETSI Smart and Sustainable Cities and Communities' Coordination Group (SSCC-CG) was released, providing a first overview of IoT standardization at the European level. Shortly after this, in March 2015, the European Commission launched the Alliance for Internet of Things Innovation (AIOTI). This organization is intended to develop and support dialogue and interaction among European IoT players, and create "a dynamic European IoT ecosystem to unleash the potentials of the IoT" (European Commission, 2016b). This organization has several working groups focusing on different aspects of European IoT development.

In April 2016 the European Commission released a Staff Working Document entitled Advancing the Internet of Things in Europe (European Commission, 2016a). It highlights data ownership as a critical issue currently without consistent practice in standards across the Union, and that a lack of common technical standards for IoT platforms, including architecture and data models, may cause obstacles for the proper operation of systems across EU member states. The report recommended action under three pillars of IoT advancement (European Commission, 2016a, p. 4):

1. A single market for the IoT: IoT devices and services should be able to connect seamlessly and on a plug-and-play basis anywhere in the European Union and scale up across borders.

2. A thriving IoT ecosystem: open platforms used across vertical silos will help developer communities innovate. As a kick-start, IoT deployments in selected lead markets will be supported.

3. A human-centered IoT: the IoT in Europe is to respect European values, empowering people along with machines and businesses, thanks to high standards for the protection of personal data and security, visible notably through a "Trusted IoT" label.

The report also identified a list of challenges that must be addressed for users' trust in the IoT to be obtained, including "trustworthy identification both of users and devices in a distributed environment, where governance structures are not always clear" (European Commission, 2016a, p. 29). This highlights that lack of transparency in governance may be (at least partially) overcome by transparency of actors and systems. The General Data Protection Regulation (GDPR), which came into force in May 2018, is highlighted as a means by which privacy protection and thus trust can be increased, as it requires "data protection by design and by default" (GDPR Art. 25).3 The proposition of a "Trusted IoT Label," as mentioned in the third of the three pillars (see preceding list), is expanded as being similar to the EU energy labelling scheme (EU Directive 92/75/EC), which rates energy efficiency of appliances on a scale from A to G. Such an IoT trust label might function as a demonstration of compliance with measures taken to effectively meet these challenges. Similar initiatives to develop such a labelling scheme are underway by other bodies, such as the IoT Trustmark initiative (Bihr, 2017). In November 2016, Working Group 4 of the Alliance for Internet of Things Innovation (the working group focused on European IoT policy) followed this up with a recommendations report related to the digitization of industry. The report points out that IoT security is not easily measured and thus implementation of such a labelling system may be difficult. The working group, instead, suggests the development of a Trust Charter, which is proposed as being led by industry. Such a charter would outline best practice, identify security issues for consideration, and support compliance. Table 23.2 summarizes how the themes relate to the case study.

Table 23.2  Mapping of Themes in EU Governance

Emergent governance theme and mapping from the case study:

• Level at which governance model functions: Supranational top-down policy; multi-stakeholder approach
• Citizen representation and agency: Public consultation process; balance between maintaining rights and encouraging innovation; inclusion of consent and control highlighted, "right to disconnect"
• Accountability of governance: Regulatory framework with independent oversight suggested; technical standards
• Transparency of governance: Transparency noted as increasing trust; Trusted IoT Label suggested, but difficult to implement; development of a Trust Charter proposed


Top-down IoT Governance in the United States The United States has also recognized the potential economic and social impacts of IoT technologies. Several organizations within or supporting the government have examined issues surrounding the IoT and provided advisory information. One of these is the Technological Advisory Council, which was formed in 2010 to provide technical advice to the Federal Communications Commission (FCC) and is comprised of academic and industry experts appointed by the FCC Chairman. In December 2014 this body provided a recommendations report to the FCC that focused on maintaining security and privacy, and included the following: monitoring network traffic, ensuring availability of network capacity and spectrum space for IoT growth, defining the role of FCC in this context, participating in IoT security activities with other government stakeholders, and conducting consumer awareness campaigns and internal scenario exercises to be able to respond to “widespread consumer events related to IoT” (OECD, 2015). In 2016, the United States passed a bill which required “the establishment of a working group tasked with identifying proposals meant to facilitate IoT growth.” The bi-partisan Developing Innovation and Growing the Internet of Things (DIGIT) Act requires the following: The working group must: (1) identify federal laws and regulations, grant practices, budgetary or jurisdictional challenges, and other sector-specific policies that inhibit IoT development; (2) consider policies or programs that encourage and improve coordination among federal agencies with IoT jurisdiction; (3) implement recommendations from the steering committee; (4) examine how federal agencies can benefit from, use, and prepare for the IoT; and (5) consult with nongovernmental stakeholders.4

In January 2017 the US Department of Commerce released a “green paper” which identified four areas of engagement related to IoT on which to focus: enabling infrastructure availability and access, crafting balanced policy and building coalitions, promoting standards and technology advancement, and encouraging markets. This report was based on analysis of comments gathered during a consultation process that included a wide variety of stakeholders (from private sector, academia, government, and civil society). The executive summary suggests four key principles to define US IoT policy: ensuring an IoT environment that is inclusive and widely accessible to consumers, workers and businesses; recommending policy and taking action to support a stable, secure and trustworthy IoT environment; advocacy for an open and globally interoperable IoT environment built on industry-driven, consensus-based standards; and encouraging growth and innovation by reducing barriers to entry for expanding markets and by convening stakeholders to address public policy challenges (US Department of Commerce, 2017, p. 2). US policy is currently positioned to encourage private sector leadership in the area of technology development. The role of government in this area is identified but remains light-touch, reducing barriers for expanding markets rather than imposing strict regulation. The green paper indicates that “The Department will also advocate

against attempts by governments to impose top-down, technology-specific 'solutions' to IoT standardization needs" (p. 13). The need for flexibility and the fostering of an "innovative and adaptive environment" is highlighted. However, coordination across US Government partners is also mentioned as a potentially important area to develop because of the complex nature of the IoT landscape. Certain areas of IoT development are highlighted as warranting particular policy attention, including smart cities, "due to the investment and cooperation required to help communities realize the benefits of connectivity" (p. 8). Standards development is also included as an area on which to focus, highlighting that while no single organization has the resources or expertise to develop the standards necessary for this developing field, the US could play an important role in the development of international standards that are "voluntary, consensus-based, and open to participation by interested stakeholders" (p. 48). As with EU policy, interoperability is a key concern. Initiatives described cover a range of methods to optimize IoT availability, digital inclusion and utility across the United States, including "empowering communities to become smart cities" (p. 21). Such programs support the development of smart municipal infrastructure, including broadband provision. Questions of privacy, security, and data ownership, as well as standards development, are also addressed by the proposed initiatives; however, while comments from stakeholders who took part in the consultation are noted as highlighting transparency in both the decision-making process and in data handling, it is not clear exactly how some of these will be implemented. Table 23.3 summarizes how the themes relate to the case study.

Table 23.3  Mapping of Themes in United States Governance

Emergent governance theme and mapping from the case study:

• Level at which governance model functions: National policy; light-touch top-down governance, reducing market barriers; co-ordination across US Government partners; consultation with non-government stakeholders; private-sector leadership
• Citizen representation and agency: Policy promoting inclusive and widely accessible systems; digital inclusion highlighted, including "empowering communities to become smart cities"
• Accountability of governance: Focus on development of international IoT standards that are consensus-based and open to participation by interested stakeholders; otherwise limited focus on accountability
• Transparency of governance: Transparency mentioned as a concern of stakeholders for addressing privacy concerns; implementation of policies not necessarily clear


Top-down IoT Governance in the United Kingdom The UK government has invested significantly in IoT, with funding allocated under InnovateUK, the Catapult schemes, and through the Department for Digital, Culture, Media, and Sport.5 Many cities around the United Kingdom are implementing IoT technology solutions and aiming to become “smart cities.” This has included the identification of 22 “superconnected cities” and a specific program of investment to these between 2010 and 2015, with up to £150 million to develop cities’ digital infrastructure, focusing primarily on broadband capacity and provision of public WiFi. In December 2014, the UK Government’s Chief Scientific Advisor Sir Mark Walport released a report, the Blackett Review on the Internet of Things, discussing potential impacts and risks of IoT technology. In this report, the Internet of Things was highlighted as critical to the United Kingdom’s economic growth, with discussion of the importance of developing national or international standards to enable a consistent infrastructure, cross-connectivity and interoperability, as well as to “ensure the system is trustworthy and trusted” (Walport, 2014, p. 8). Security is a key concern raised in maintaining trustworthy devices and data systems, with standards recommended to provide “security by default.” For this to be implemented, there is a need to consider both the devices and the system(s) in which they are embedded. In May 2015, a briefing paper highlighted the Internet of Things as one of the key issues for Parliament.6 IoTUK, a collaboration between the Digital Catapult and Future Cities Catapult, was set up to manage and coordinate UK IoT investment and growth, supporting government funded projects.7 Additionally, an all-parliamentary group on smart cities was set up. The British Standards institute has also published several guidance documents and standards, including in 2015 an overview of the smart cities landscape,8 and Publicly Available Specification (PAS) documents in 2017 giving practical guidance for developing smart city strategies.9 This is in addition to a policy paper on UK digital strategy published in March 2017, which includes a short section on the Internet of Things (Department for Digital, Culture, Media and Sport, 2017). By 2016, IoTUK was reporting large amounts of UK IoT investment, including £42m privately invested into IoT companies since mid-2015 (IoTUK, 2016). The fast growing scope of the industry indicates the criticality and urgency of introducing appropriate policy and regulation. The same year, a study designed to support policy feedback for IoT development in the United Kingdom was commissioned by IoTUK and BCS, The Chartered Institute for IT, and carried out by policy research organization RAND Europe. The nature of IoT governance, and specifically data governance, was a question raised in the report, with indication from case studies that there has been a lack of consistency in this area, with businesses creating their own data governance structures in an ad hoc manner to suit their needs. Some of those organizations that contributed to the report saw data governance as an ethical rather than a technical question, relating to who should have access to personal data. The report also highlighted that because there

642   Naomi Jacobs ET AL. has not yet been much experience of privacy breaches, technology providers “rarely considered potential wider privacy implications when developing and implementing their solutions” (Gunashekar et al., 2016, p. 30). The report also considered the importance of citizen representation and involvement, stating that “the significance of involving consumers of technology in informing IoT policy and in decision making cannot be overestimated” (p. ix). It did, however, conclude that clearly defined data governance procedures were, in general, seen as “the key to joining up systems, facilitating seamless interoperability, and, consequently, enabling successful IoT implementations” (p. 23). The UK Digital Strategy published in 2017 highlights three examples of funding for research and innovation in IoT: NHS test beds for health-related IoT innovations,10 the PETRAS IoT research hub,11 and the Manchester Cityverve smart city demonstrator.12 Manchester was well-placed to obtain the large Cityverve project, as a pioneer of new devolved government structures in the United Kingdom, giving greater local power and accountability through an elected mayor via the 2014 Devolution Agreement (GMCA, 2018). This project has enabled the region to join up public services and centralize data systems, facilitating “open data” access to regional datasets, coordinated by the Greater Manchester Data Synchronization program (GMDSP).13 Speaking at the All-Parliamentary Group on Smart Cities, Minister for International Trade Greg Hands stated a commitment to supporting the Internet of Things and being a worldleader in smart cities by co-operating to present a single UK smart city offer (Say, 2017). It is not clear, however, what support is being offered at the individual city level beyond the larger projects mentioned earlier. Table 23.4 summarizes how the themes relate to the case study.

Table 23.4  Mapping of Themes in UK Governance

Emergent governance theme and mapping from the case study:

• Level at which governance model functions: National policy; ad hoc governance models developed by businesses; regional devolved governance policies for IoT and open data
• Citizen representation and agency: Citizen representation highlighted as critical in decision making
• Accountability of governance: Studies to examine the nature and consistency of IoT governance; local devolution provides greater power and accountability regionally
• Transparency of governance: Necessity of clearly defined data governance procedures highlighted; PAS documents include guidance for developing smart city strategies


Case Studies: Local IoT Deployment Many cities in many countries around the world are taking up opportunities offered by national government initiatives, aiming to become a part of the “smart city” trend by introducing IoT technology in a variety of contexts. As already described, it is difficult to find consensus on the definition of a smart city, and there is great variation in what makes these cities “smart,” the level of implementation, and the governance strategies that are employed. One approach in comparing smart city implementation is that taken by digital market research organization Juniper Research, which defines a smart city as “an urban ecosystem that places emphasis on the use of digital technology, shared knowledge and cohesive processes to underpin citizen benefits in vectors such as mobility, public safety, health and productivity” (Sorrell, 2017, p. 2). Based on metrics including technology, transport, energy, open data, apps and the economy, Juniper Research14 releases an annual report with a ranking of “top smart cities.” The top ranked cities in 2017 were Singapore, London, New York, San Francisco, and Chicago. Previous years’ lists also included high rankings for Barcelona and Oslo, but there are numerous other examples of smart city initiatives worldwide. Popular deployments that different cities are implementing include smart lighting, waste management, traffic management, air quality sensors, and services associated with free WiFi. Long range, low power wireless technologies such as HaLow and LoRaWan15 promise capacity for a wide deployment of sensors and other devices, expanding the capabilities and versatility. In the last few years, many applications and services have been implemented at the local level by councils or other city governance bodies. Some of these may be deployed individually through smaller initiatives, while others are part of concerted efforts to create connected smart city environments. IoT deployments are generally implemented with the aim of providing benefit to the ­people, often under programs with broadly defined goals such as “reducing harm” (Aberdeen City Council, UK). Three short case studies of IoT deployments in Chicago (US), Songdo (South Korea), and New York (US) will now be presented. While relatively similar in their profile as large cities, they demonstrate differences that provide an indication of the wide variance in terms of governance models in urban spaces, and each provides a focus on one of the three principles of governance identified previously.

Chicago, United States

The city of Chicago has a long history of data collection for public use. A data portal,16 set up in 2010, requires by executive order the provision of data from every city agency, and provides public access to wide-ranging data sets, from metrics of sanitation and graffiti removal, to building permits issued, and individual taxi trips taken in the city. The Open Data Executive Order (No. 2012-2) specifies unprecedented transparency, honesty and accountability as objectives for the city. The Chicago Array of Things project began in 2014, and received funding as part of the United States' national Smart Cities Initiative. It grew out of a smaller project initiated in 2013 by Charlie Catlett and the Computation Institute's Urban Center for Computation and Data, to teach high school students about data collection in cities (Mendelson, 2015). The project describes itself as a "fitness tracker" for the city, with the goal of measuring detail of the city "to provide data to help engineers, scientists, policymakers and residents work together to make Chicago and other cities healthier, more livable and more efficient."17 The intention was that devices would be deployed at a variety of locations in the city, placed on municipal lampposts, housing a variety of sensors chosen based on information from local citizens and stakeholders about what would be of use (Moser, 2014). For example, Catlett was contacted by residents and local services who wanted air quality data, prompting the inclusion of sensors to measure nitrogen dioxide, ozone, carbon monoxide, hydrogen sulfide and sulfur dioxide levels. The original deployment was managed by the University of Chicago and Argonne National Laboratory, in collaboration with the City of Chicago. This work encountered issues finding appropriate locations, power sources, and data connectivity to deploy air quality sensors, despite support from government officials. However, it laid the groundwork for a much larger initiative. The subsequent collaboration to develop the Array of Things aimed to deploy a wide range of sensors across the city. Initially proposing a rollout in 2014, the project experienced some controversy and pushback from the public, who proved to be wary about potentially intrusive technological installations and the privacy implications that might result (Mosendz, 2014; Moser, 2014). Concerns raised in press coverage included the fact that the devices would collect information on WiFi and Bluetooth devices in order to gather data on pedestrian numbers. There were also questions about the decision-making and governance process. For example, Alderman Robert Fioretti said that the project team should have obtained permission from the City Council before going ahead with the project (Byrne, 2014). This was due not only to the potential privacy issues raised by the installation, but also to questions of data ownership, and to accusations of potential lost income from not charging a private data collection service for using public utility poles, rather than permitting free use to a public-academic collaboration. There was significant media interest in the project and the implications for the public, which led to a delay while the privacy policies and technical infrastructure were redeveloped to address some of the concerns raised. Deployment recommenced in 2016 with more attention to the governance of the project, in part as a response to the public pushback noted previously. Initial privacy policy and oversight arrangements were made public on June 13, 2016, followed by a three-week "listening period" during which residents were invited to provide feedback. This included two public listening sessions at libraries, managed by the non-profit organization Smart Chicago Collaborative (Elahi, 2016a).
Additionally, input was sought via public forums and a digital tool called MyMadison.io, created by the OpenGov Foundation and described as "a government policy co-creation platform that opens up laws and legislation previously off-limits to individuals and the Internet community."18 Responses to these policy events and questions asked by the public are available online19 and consider several key aspects of privacy, governance, and accountability. It was agreed that the first nodes in the system would not be installed until finalization of the policies, two or three weeks after the completion of the public consultation. The final policy would also be approved by employees from the city's Department of Innovation and Technology, as well as the law department, and communities would be able to make suggestions on the placement of sensor nodes using the project website. The governance and privacy policies were released in their final form on August 15, 2016 and are available on the project's website.20 These include details of an Executive Oversight Council responsible for oversight of the program, co-chaired by the Commissioner of the City's Department of Innovation and Technology and Charlie Catlett (Director of the Urban Center for Computation and Data, and the original founder of the project). This committee is made up of representatives from academia, industry, non-profits and the community, and supported by additional specialist groups such as the Technical Security and Privacy Group and Scientific Review Group, who will evaluate operations and give recommendations, as appropriate. The project now presents itself as "community technology" (Mendelson, 2015) and highlights its focus on transparency and openness. The devices do not collect data on individuals, and have been modified to process and analyze the data directly, reducing security and privacy concerns by removing the necessity to store or transmit data. The devices themselves have been designed to be large and mounted at eye level "to be more visible and less mysterious" (Mendelson, 2015). Some people, however, were still concerned about issues of privacy even in this second iteration of the project. For example, Stella (2016) questions the implementation of the new privacy policy, commenting that the algorithms do still retain some photographic data for later access despite the privacy policy stating that images will only be used for a short time for analysis. The data itself is owned and thus controlled by the University of Chicago. Some also criticized the privacy policy as initially written for being too brief in its descriptions of how data will be collected and used, with Ray Everett, a founding board member of the non-profit International Association of Privacy Professionals, noting that, unlike a consumer-purchased device that individuals place in their own home (such as an Amazon Echo), it is not possible to opt out when you are walking in a public space. Due to the nature of the deployment, it would affect many citizens in Chicago, even those who might be unaware of it, and a greater level of scrutiny is needed (Elahi, 2016b). The Chicago Array of Things is unusual in its direct academic origins and the nature of its implementation, its delay due to public pushback, and its incorporation of consultation. The controversy that the project experienced led to changes in implementation, both in terms of governance and the functioning of the devices themselves.

This case study demonstrates the importance of transparency and accountability, since the initial iteration of the project raised concerns due to deficits in these key areas; however, it is particularly useful for examining issues of citizen representation and agency. When the project began, citizens' views were incorporated in terms of where sensors may be most usefully placed, and what data might benefit the city and its residents. However, a lack of public consultation and transparency meant that concerns were raised about the purpose and nature of the deployment. By incorporating new governance structures that more heavily incorporated public consultation, the project aimed to more fully provide representation and agency. Despite this, there are still concerns with deployments of this nature where citizens have limited control over whether or not their data is collected, since "opting out" is difficult when IoT devices are deployed in public spaces.

Songdo, South Korea

South Korea has a reputation as an extremely technologically advanced nation. The capital, Seoul, initiated the Smart Seoul 2015 project to capitalize on existing developments and bring together many different initiatives to boost the city's smart infrastructure (Hwang & Choe, 2013). Much of this smart city development work is government-facilitated. However, there have also been initiatives that are public-private collaborations, such as an NFC-based mobile payment system, and citizen developed initiatives such as the School Newsletter Application, which sends alerts to parents (Hwang & Choe, 2013). In 2003, construction began on a purpose-built smart city development close to Incheon, around 40 miles from Seoul, on reclaimed land (Carvalho, 2015). This created the Incheon Free Economic Zone, made up of three areas, including the city of Songdo, which was intended to foster international business relating to key technological areas such as information technologies, nanotechnologies and biotechnologies. This purpose-built "from scratch" smart city development was "planned in greenfield areas with almost no former residents or infrastructure, with purposely loose and flexible regulations" (Carvalho, 2015, p. 44). The initial development project to create a new Smart City was due to be completed in 2015, but still has some ongoing construction, with investment plans until 2022 (Lee, Kwon, Cho, Kim, & Lee, 2016). There are a variety of different smart city IoT implementations in the Songdo city infrastructure, including traffic control, emergency response co-ordination, abnormal sound monitoring, energy saving, and citizen communication. Songdo's development as a "ubiquitous city" underwent implementation planning in 2008–2009, and a private and public joint corporation (U-City Corporation) was established to undertake a pilot project to achieve some of the goals. The largest single shareholder is Incheon Metropolitan City (28.6 percent), with the rest shared between private corporations, including Incheon IT Corporation (Lee et al., 2016). While Lee et al.'s examination of the city's development includes a section on governance, it focusses primarily on the organizational structure of this corporation and the strategies for sharing information and, ultimately, data produced from the integrated technology being implemented. The case study notes that the city will be guaranteed independence from standard governance structures in Korea, which it suggests "will make the daily lives of citizens more convenient and simplify decision making procedures when problems arise thus making swift actions possible" (p. 31). However, there are no details provided about how these simplified decision-making structures will operate and how they will incorporate citizens' views and needs, or what other risks may arise from reduced governance structures. Some have noted similarities between the governance exemptions here and in other spatially limited, highly controllable regions such as Singapore, which, although portrayed technically as a democracy, is dominated by a single party with a high level of control of systems (Maxwell Watts & Purnell, 2016). Carvalho includes a little more detail on this flexible governance structure, including the fact that exemptions were granted from national and local policy frameworks such as procurement and building height regulations. They argue that this provided opportunities to test new services and business models which would not fit with existing regulations and practices, such as foreign and privately supplied IT solutions which may be partly monopolistic (Carvalho, 2015). The private party in question is largely Cisco, who gained the contract to implement the city structures, and aims to offer "cities as a service" whereby a single, Internet-enabled utility will charge residents to provide bundled urban necessities such as water, power, traffic and telephony (Lindsay, 2010). Carvalho also criticizes aspects of the implementation, suggesting that technology was "largely pushed from corporations to residents, in a rather inflexible fashion" (Carvalho, 2015, p. 50). Carvalho is concerned that Cisco's solutions have been developed outside of the country and brought in, leaving limited possibilities for input from local expertise of users and stakeholders. In fact, as Shwayri (2013) points out, the city was planned in large part with foreign investors and immigrants in mind. There is also a suggestion that solutions being offered are at the level of individual consumption and not structural, providing ease of use but not infrastructural benefits to improve wider city efficiencies. This case represents a public-private partnership model with a consortium of organizations, and demonstrates the purpose-built "smart city." This is distinct from governmental strategies to transform existing cities with infrastructure technology. The high flexibility of governance afforded to the city by its centralized government also provides particularly favorable circumstances for rapid smart city development, but may be difficult to replicate elsewhere, and may lead to industry influence with reduced transparency in their governance. Additionally, it seems that uptake of new residents to the area has been limited, with a struggle to find those who want to live and work there, and the city seems empty and soulless (as described by Borowiec, 2016, and James, 2016). It is difficult to say yet whether this will prove to be a successful smart city initiative. This case study raises questions around transparency and citizen representation and agency, but primarily highlights accountability as an issue.
While facilitated by government, implementation is led by private companies. This may have a significant impact on accountability, particularly if their involvement has been enabled by suspending normal standards and regulatory processes.


New York, United States

In January 2016, New York City began to install free WiFi across the city (Lufkin, 2016), through the installation of "links"—hub kiosks that deliver high-speed gigabit WiFi as well as phone calls, device charging, and city services via a touchscreen interface. These LinkNYC devices were developed and provided by a company called CityBridge, a consortium of companies which ultimately is part of Google's umbrella corporation, Alphabet (Pinto, 2016); their aim is to roll out 7,500 units over 12 years.21 The units are not only free to users but provided to the city free of charge, funded by "advertising, sponsorships and partnerships" (as described on the project website). The provision of the technology by a private company means that CityBridge is responsible for its management and implementation. Sottek (2014) points out that the commercial motivations of the system make it likely that its rollout will be influenced by profit rather than public service, and that poor neighborhoods may be unfairly discriminated against when choices are made as to where to install the devices, as they will be placed preferentially based on predicted advertising revenues. This is an issue given that the system was promoted to the city in part based on solving the digital divide by the provision of free municipal WiFi. The level of service provided is also affected by this, as reported by Smith (2014), who found that while kiosks were being deployed throughout the city, WiFi speeds were up to 10 times slower for those kiosks without advertising. These slower kiosks make up a much larger proportion of those in poorer areas of the city, potentially increasing rather than reducing the digital divide, and demonstrating segmented service in a similar manner to that enabled by the decision of the FCC to no longer enshrine Net Neutrality (Fung, 2018). There have also been issues in the deployment with regards to governing the usage of the devices. When initially deployed, the LinkNYC kiosks included the facility to operate as a web browser, allowing users to view internet content. In September 2016, however, the web browser functions were removed after complaints from residents, businesses, and elected officials that they were being abused by users spending excessive time at them, or using them to view pornography (McGeehan, 2016). LinkNYC released a statement describing changes made in response to this and other community concerns, such as limiting maximum volume at night, suggesting that the system was designed to be flexible "so we can learn how people use LinkNYC, how they want to see it improved, and make adjustments over time" (Citybridge, 2017). Another issue which has arisen concerns the potential of the devices to infringe upon privacy. They require a login, and collect MAC and IP addresses which, since they are associated with personal mobile devices, can be used to identify and track individuals and connect this with their web browsing. Pinto (2016) raises this concern but also questions the right of the city to surrender personal data on behalf of citizens by agreeing to the private contract for the service, in effect selling citizens' privacy to a for-profit company. They also are not reassured by the company's response that they are reviewing their privacy policies to reflect actual practices, since the privacy policy was "written before we knew exactly how the network would operate." In Pinto's view this indicates CityBridge is "making this up as they go along." Other companies are now competing with LinkNYC. For example, LQD WiFi, bought by telecommunications company Verizon in November 2016, has developed a similar kiosk system called Palo. It appears that LQD WiFi is planning to pilot these in at least one area of New York, New Rochelle (Lunden, 2016). This may introduce competition, or it may mean that the city must introduce further governance and regulations regarding the deployment. This case study provides a contrast to those discussed so far, as it is a wholly private service deployed in the city at no point-of-use cost to the city or citizens, providing technology infrastructural services but supported by commercial concerns of advertising and data sales. This leads to particular issues with regard to transparency, as the private company does not make clear the details of their governance, nor underlying motivations which may impact upon the service and its function, such as the discriminatory impact of advertising-led service. As a privately led deployment that is authorized by the city, this also places limits on accountability, and citizen agency or representation.

Conclusion

The case studies presented each show elements of how different IoT governance principles can affect or interact with the nature and outcomes of IoT deployments. They also represent different levels of governance in operation, particularly between the top-down national governance and more bottom-up city-level deployments that may include different stakeholders. Currently there are many different governance strategies for IoT deployments, as the number of these interventions steadily grows. Without proper governance, there are risks that these deployments can have consequences for the public, affecting privacy and security, especially given the extreme rate at which the use of these devices and systems is growing (Umeh, 2014). This applies at the national and supranational level in terms of top-down legislation, regulation and national policy, and also at the local deployment level where currently there seems to be a wide range of governance models and strategies that do not necessarily appear to be guided by specific national policy. This is understandable given that said national policies are often limited in terms of guidelines for local implementation. Looking at the top-down governance models we examined of the European Union, the United States, and the United Kingdom, we can see that they share several common features. Economic growth and innovation are often priority areas of national interest, and funding packages and incentives for private sector leadership are encouraged. All three also highlight the importance of trust, privacy, and security, and include the need for regulation and ethical standards to prevent exploitation of the public. But despite this strong support from central government for IoT investment, implementation at the city level is often less structured and inconsistent. There appears to be a gap between the type of top-level recommendations and regulations that are communicated by governing bodies, and the day-to-day governance structures that are required locally for a successful implementation. The three examples of local (city-level) deployments indicate a small fraction of the range of different models that might allow deployments, including public-academic partnerships (in Chicago), public-private partnerships (in Songdo), and fully private initiatives supported by local government (in New York). We must also consider the critical role played by fully grass-roots, citizen-led deployments, which can improve democratization. These are becoming more common with the availability of cheap, off-the-shelf IoT devices and components (Surman & Thorne, 2016), and open-source code bases that allow "tinkering" by citizen groups and individual hobbyists to contribute to IoT infrastructure and address shared concerns (Kortuem & Kawsar, 2010). The examples given also show that pushing out a deployment rapidly without full consideration of governance, privacy implications for the public, and the long-term implications of such technologies can have negative impacts such as increasing public mistrust (as in the case of Chicago) or introducing discriminatory practices in provision of "public" services (in New York). While it is not uncommon that such systems encounter challenges and need revising after their deployments occur, this may be exacerbated by the urgency with which these projects are being implemented. The perceived benefits of being able to implement smart cities technology, and the numerous organizations offering "solutions," mean that there has to date been a rush to be included and seen as successful in doing so.22 Thus, we have seen around the world a large number of pilot projects, demonstrator initiatives, and rapid deployments,23 which are generally discussed in terms of their positive outcomes, yet undertake limited information sharing about the issues that are encountered, the barriers to scaling, and other limitations that prevent such pilots being rolled out more widely in a sustainable manner. In the United Kingdom, for example, while there are many such initiatives, several of which are world-leading, there is not a self-sustaining private sector market that can function effectively without public sector involvement, and, as Altabev (2017) notes, "we're still very experimental, we have a situation where we seem to be forever stuck in pilot phases and loving grant funded projects." Supporting citizens in IoT deployments requires addressing questions raised by each of the themes we have identified from the literature. This may be supported by effective consultation and inclusion, transparency of governance, and the closing of the governance gap identified here between top-down policy and city or citizen level implementation. We should also carefully consider representation and individual agency—whether citizens have any say in what IoT devices they interact with, where they are installed, what data is collected and how it all comes together in the background—including education about why this matters. These are both critical areas for ongoing research. As IoT technology becomes more ubiquitous, the data collected becomes ever more powerful in what it can "say" about individual habits, practices, behaviors and beliefs.
Developing a trusted IoT ecosystem will require the realization of transparent and accountable systems. These should not only make citizens aware of deployments which impact them, but also enable querying of the system to provide an appropriate level of transparency for their needs, and evaluation of risk. This is a critical area for ongoing research. Further, reviewing the literature leads us to recommend that governance of these systems should be an evolving process rather than a static decision, adaptable to the changing technologies involved, and to the needs of stakeholders including industry, citizens and localities.

Notes

1. http://www.urbantransformations.ox.ac.uk/project/jam-and-justice-co-producing-urban-governance-for-social-innovation/
2. http://ontheplatform.org.uk/article/mapping-participatory-urban-governance.
3. Though a number of the articles do not yet have corresponding laws, making it a document in development.
4. https://www.congress.gov/bill/114th-congress/senate-bill/2607
5. For a list of UK investments see Appendix E of Gunashekar et al. (2016).
6. https://www.parliament.uk/business/publications/research/key-issues-parliament-2015/technology/internet-of-things/
7. https://iotuk.org.uk/
8. PD 8100, Smart cities overview—Guide, summarized in the document Making cities smarter: Guide for city leaders.
9. PAS 183, 184, 185.
10. https://www.england.nhs.uk/ourwork/innovation/test-beds/
11. https://www.petrashub.org/
12. https://cityverve.org.uk/
13. http://gmdsp.org.uk/
14. https://www.juniperresearch.com/home
15. https://www.lora-alliance.org/what-is-lora/technology
16. https://data.cityofchicago.org/
17. https://arrayofthings.github.io/index.html
18. http://opengovfoundation.org/the-madison-project/
19. https://arrayofthings.github.io/policy-responses.html
20. https://arrayofthings.github.io/final-policies.html
21. https://www.link.nyc/faq.html#when
22. This may be due to pressure from national bodies and funders, but also is likely to be exacerbated by the tone of numerous workshops, symposiums and conferences aimed at officials around smart cities, which use language that consistently seems to imply that public service has to be offering these technologies or they are not modern.
23. For example, see https://www.nominet.uk/list-smart-city-projects/ for a non-comprehensive list of such projects.

References

Almeida, V. A., Doneda, D., & Monteiro, M. (2015). Governance challenges for the Internet of Things. IEEE Internet Computing, 19(4), 56–59.
Altabev, D. (2017). Citycast [Podcast], Episode 6. https://soundcloud.com/cityverve/citycast-episode-6-with-david-altabev
Angelidou, M. (2014). Smart city policies: A spatial approach. Cities, 41, S3–S11.
Angelidou, M., & Psaltoglou, A. (2017). Enhancing urban sustainability through social innovation: Citizen environmental sensing for air quality monitoring. Sixth international conference on environmental management, engineering, planning and economics (CEMEPE) and SECOTOX, Thessaloniki, Greece. https://www.researchgate.net/publication/318014463
Ashton, K. (2009). That "internet of things" thing. RFID Journal, 22(7), 97–114.
Balestrini, M., Diez, T., Marshall, P., Gluhak, A., & Rogers, Y. (2015). IoT community technologies: Leaving users to their own devices or orchestration of engagement. EAI Endorsed Transactions on Internet of Things, 1(1). http://dx.doi.org/10.4108/eai.26-10-2015.150601
Bihr, P. (2017). A TrustMark for IoT. Thingscon. https://www.thingscon.com/report-a-trustmark-for-iot/
Borowiec, S. (2016). Skyscrapers? Check. Parks? Check. People? Still needed. Los Angeles Times (May 31). Retrieved from http://www.latimes.com/world/asia/la-fg-korea-songdosnap-story.html
The Bristol Approach. (2018). Project: Damp Homes. Bristol, UK: The Bristol Approach. http://www.bristolapproach.org/wp-content/uploads/2018/05/Portrait-Resource-DAMPHOMES.pdf
Byrne, J. (2014). Alderman wants hearing on Emanuel deal to install light pole sensors. Chicago Tribune (June 23). Retrieved from http://www.chicagotribune.com/news/local/politics/chialderman-wants-hearing-on-emanuel-deal-to-install-light-pole-sensors-20140623-story.html
Carvalho, L. (2015). Smart cities from scratch? A socio-technical perspective. Cambridge Journal of Regions, Economy and Society, 8(1), 43–60.
Chen, L. J., Hsu, W., Cheng, M., & Lee, H. C. (2016). LASS: A location-aware sensing system for participatory PM2.5 monitoring. In Proceedings of the 14th annual international conference on mobile systems, applications, and services companion (p. 98). ACM. http://dx.doi.org/10.1145/2938559.2938560
Citybridge. (2017). Update about our services. Citybridge. Retrieved from https://www.link.nyc/service-update.html
Department for Digital, Culture, Media and Sport. (2017). Policy paper: UK Digital Strategy. https://www.gov.uk/government/publications/uk-digital-strategy
Douriez, M., Doraiswamy, H., Freire, J., & Silva, C. T. (2016, October). Anonymizing NYC taxi data: Does it matter? In IEEE international conference on data science and advanced analytics (DSAA), 2016 (pp. 140–148). IEEE.
Elahi, A. (2016a). City seeks input on privacy policy for Array of Things sensor network. Chicago Tribune (June 10). Retrieved from http://www.chicagotribune.com/bluesky/originals/ct-array-of-things-privacy-policy-bsi-20160610-story.html
Elahi, A. (2016b). City needs more detail in Array of Things privacy policy, experts say. Chicago Tribune (June 20). Retrieved from http://www.chicagotribune.com/bluesky/originals/ct-expert-array-of-things-privacy-policy-bsi-20160621-story.html
European Commission. (2013). Report on the Public Consultation on IoT Governance. European Commission. Retrieved from https://ec.europa.eu/digital-single-market/en/news/conclusions-internet-things-public-consultation
European Commission. (2016a). Commission staff working document: Advancing the Internet of Things in Europe. European Commission. Retrieved from https://ec.europa.eu/digital-single-market/en/news/staff-working-document-advancing-internet-things-europe
European Commission. (2016b). The Alliance for Internet of Things Innovation (AIOTI). European Commission. Retrieved from https://ec.europa.eu/digital-single-market/en/alliance-internet-things-innovation-aioti
Fung, B. (2018). The FCC's net neutrality rules are officially repealed today. Here's what that really means. Washington Post (June 11). Retrieved from https://www.washingtonpost.com/news/the-switch/wp/2018/06/11/the-fccs-net-neutrality-rules-are-officially-repealed-today-heres-what-that-really-means/?utm_term=.7af6cb96fdc4
Furness, A. (2012). Foundations for IoT Governance. In I. G. Smith (Ed.), The Internet of Things 2012: New Horizons. CASAGRAS2. http://www.internet-of-things-research.eu/pdf/IERC_Cluster_Book_2012_WEB.pdf
Future Cities Catapult. (2017). Smart City Strategies: A Global Review 2017. Future Cities Catapult. https://futurecities.catapult.org.uk/wp-content/uploads/2017/11/GRSCS-Final-Report.pdf
Gartner. (2017). Leading the IoT: Gartner insights on how to lead in a connected world. Gartner. https://www.gartner.com/imagesrv/books/iot/iotEbook_digital.pdf
GMCA (Greater Manchester Combined Authority). (2018). Devolution. https://www.greatermanchester-ca.gov.uk/homepage/59/devolution
Gunashekar, S., Spisak, A., Dean, K., Ryan, N., Lepetit, L., & Cornish, P. (2016). Accelerating the Internet of Things in the UK. Santa Monica, CA, and Cambridge, UK: Rand Corporation.
Hammersley, B. (2013). When the world becomes the Web. Wired (July 4). Retrieved from http://www.wired.co.uk/article/when-the-world-becomes-the-web
Hill, N., Gibson, G., Guidorzi, E., Amaral, S., Parlikad, A. K., & Jin, Y. (2016). Scoping study into deriving transport benefits from big data and the Internet of Things in smart cities: Final report for Department of Transport. Didcot, UK: Ricardo Energy & Environment.
Hwang, J. S., & Choe, Y. H. (2013, February). Smart cities Seoul: A case study. ITU-T Technology Watch Report. Seoul. https://www.itu.int/dms_pub/itu-t/oth/23/01/T23010000190001PDFE.pdf
IoTUK. (2016). IoTUK Industry Insights: UK IoT Investment by VCs, Angels and the Crowd. IoTUK. https://iotuk.org.uk/wp-content/uploads/2016/10/UKIoTInvestment.pdf
James, I. (2016). Songdo: No Man's City. Korea Expose (October 14). Retrieved from https://koreaexpose.com/songdo-no-mans-city/
Jiang, Q., Kresin, F., Bregt, A. K., Kooistra, L., Pareschi, E., Van Putten, E., Volten, H., & Wesseling, J. (2016). Citizen sensing for improved urban environmental monitoring. Journal of Sensors, 2016, Article ID 5656245. http://dx.doi.org/10.1155/2016/5656245
Kortuem, G. & Kawsar, F. (2010). User innovation for the internet of things. Proceedings of the Workshop: What can the Internet of Things do for the citizen (CIoT), Eighth International Conference on Pervasive Computing (Pervasive 2010), Helsinki, Finland.
Lindsay, G. (2010). Cisco's big bet on new Songdo: Creating cities from scratch. Fast Company (February 1). Retrieved from https://www.fastcompany.com/1514547/ciscos-big-bet-new-songdo-creating-cities-scratch
Lee, S. K., Kwon, H. R., Cho, H., Kim, J., & Lee, D. (2016). International case studies of smart cities: Songdo, Republic of Korea. Washington, US: Inter-American Development Bank.
Lufkin, B. (2016). NYC's new public wi-fi is obscenely fast. Gizmodo. Retrieved from http://gizmodo.com/nycs-new-public-wifi-is-obscenely-fast-1753825735
Lunden, I. (2016). Verizon buys LQD WiFi to expand its IoT strategy into "smart cities." Techcrunch. Retrieved from https://techcrunch.com/2016/11/14/verizon-buys-lqd-wifi-to-expand-its-iot-strategy-into-smart-cities/
McGeehan, P. (2016). Free Wi-Fi kiosks were to aid New Yorkers. An unsavory side has spurred a retreat. New York Times (September 15). Retrieved from https://www.nytimes.com/2016/09/15/nyregion/internet-browsers-to-be-disabled-on-new-yorks-free-wi-fi-kiosks.html
Maxwell Watts, J. & Purnell, N. (2016). Singapore is taking the "Smart City" to a whole new level. The Wall Street Journal (April 24). Retrieved from https://www.wsj.com/articles/singapore-is-taking-the-smart-city-to-a-whole-new-level-1461550026
Maheswaran, M. & Misra, S. (2015, December). Towards a social governance framework for Internet of Things. In IEEE 2nd world forum on Internet of Things (WF-IoT), 2015 (pp. 801–806). IEEE.
Mendelson, Z. (2015). Chicago's array of things may give big data boost to urban planning. Next City (October 15). Retrieved from https://nextcity.org/daily/entry/array-of-things-chicago-smart-cities-data-sensors
Molloy, F. (2018). Fridges that order milk? How the IoT will ease everyday drudgery. The Lighthouse (February 13). Retrieved from https://lighthouse.mq.edu.au/article/fridges-that-order-milk-how-the-internet-of-things-will-change-everyday-life
Moser, W. (2014). What Chicago's "Array of Things" will actually do. Chicago Magazine (June 27). Retrieved from http://www.chicagomag.com/city-life/June-2014/What-Chicagos-Array-of-Things-Will-Actually-Do/
Mosendz, P. (2014). Chicago gets a new surveillance system straight out of a video game. The Atlantic (June 24). Retrieved from https://www.theatlantic.com/technology/archive/2014/06/chicago-gets-a-new-surveillance-system-straight-out-of-a-video-game/373339/
Narayanan, A. & Shmatikov, V. (2008, May). Robust de-anonymization of large sparse datasets. In IEEE Symposium on security and privacy, 2008. SP 2008 (pp. 111–125). IEEE.
OECD. (2015). OECD Digital economy outlook 2015. Paris: OECD Publishing. http://dx.doi.org/10.1787/9789264232440-en
Pinto, N. (2016). Google is transforming NYC's payphones into a "Personalized Propaganda Engine." The Village Voice (July 6). Retrieved from http://www.villagevoice.com/news/google-is-transforming-nycs-payphones-into-a-personalized-propaganda-engine-8822938
Say, M. (2017). Minister calls for "single UK smart city offer." UK Authority. Retrieved from http://www.ukauthority.com/smart-places/entry/6861/minister-calls-for-single-uk-smart-city-offer
Shwayri, S. T. (2013). A model Korean ubiquitous eco-city? The politics of making Songdo. Journal of Urban Technology, 20(1), 39–55.
Smith, I. G. (Ed.) (2012). The Internet of Things 2012: New horizons. CASAGRAS2. http://www.internet-of-things-research.eu/pdf/IERC_Cluster_Book_2012_WEB.pdf
Smith, G. (2014). Exclusive: De Blasio's Wi-Fi plan gives slower service to poorer neighborhoods. New York Daily News (November 24). Retrieved from http://www.nydailynews.com/new-york/exclusive-de-blasio-wi-fi-plan-slower-poor-nabes-article-1.2021146
Stella, R. (2016). Chicago's innovative new streetlights will monitor the city's every move. Digital Trends. Retrieved from http://www.digitaltrends.com/home/city-of-chicago-adding-sensors-to-its-streetlights/
Sorrell, S. (2017). Smart cities: Strategies, energy, emissions & cost savings 2017–2022. Juniper Research. https://www.juniperresearch.com/researchstore/iot-m2m/smart-cities/strategies-forecasts-in-energy-transport-lighting
Sottek, T. C. (2014). New York City's ambitious free Wi-Fi plan sounds great, unless you live in a poor neighborhood. The Verge (November 24). Retrieved from http://www.theverge.com/2014/11/24/7275567/nyc-public-wifi-is-rich
Surman, M. & Thorne, M. (2016). We all live in the computer now: A Netgain paper on society, philanthropy and the Internet of Things. Netgain Partnership. Retrieved from https://netgainpartnership.org/internet-of-things/
Sweeney, L. (2002). k-anonymity: A model for protecting privacy. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 10(05), 557–570.
Umeh, J. (2014). Internet of Things: Exciting yet scary. BCS (September 19). Retrieved from http://www.bcs.org/content/conBlogPost/2357
US Department of Commerce. (2017). Green Paper: Fostering the advancement of the Internet of Things. The Department of Commerce Internet Policy Taskforce & Digital Economy Leadership Team, National Telecommunications and Information Administration. Retrieved from https://www.ntia.doc.gov/other-publication/2017/green-paper-fostering-advancement-internet-things
Walker, T. (2018). Smart refrigerators that let you know when the milk is on the turn, toothbrushes that keep track of dental hygiene and tennis rackets that help you play better: Welcome to 'the internet of things'. The Independent (January 12). Retrieved from https://www.independent.co.uk/life-style/gadgets-and-tech/news/smart-refrigerators-that-let-you-know-when-the-milk-is-on-the-turn-toothbrushes-that-keep-track-of-9053681.html
Walport, M. (2014). The Internet of Things: Making the most of the second digital revolution. A report by the UK Government Chief Scientific Adviser. The Government Office for Science. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/409774/14-1230-internet-of-things-review.pdf
Weber, R. H. (2011). Accountability in the Internet of Things. Computer Law and Security Review, 27(2), 133–138.
Weber, R. H. (2013). Internet of things: Governance quo vadis? Computer Law and Security Review, 29(4), 341–347. https://doi.org/10.1016/j.clsr.2013.05.010
Weber, R. H. (2016). Governance of the Internet of Things: From infancy to first attempts of implementation? Laws, 5(3), 28. https://doi.org/10.3390/laws5030028
Weber, R. H. & Weber, R. (2010). Internet of Things. Berlin: Springer Berlin Heidelberg.
Weiser, M. (1991). The Computer for the 21st Century. Scientific American, 265(3), 94–105.

section 9

SYNTHESIS

chapter 24

ESRC Review: Future Research on the Social, Organizational, and Personal Impacts of Automation: Findings from Two Expert Panels

Simeon J. Yates and Jordana Blejmar

Introduction

This chapter, like the final chapter (chapter 25), is concerned with future research challenges. Whereas the final chapter is based on all elements of the ESRC project and the non-ESRC chapters, this draws on the two workshops undertaken during the project that specifically focused on the social impacts of artificial intelligence (AI) and automation:

• UK Economic and Social Research Council and UK Defence Science and Technology Laboratory (ESRC-DSTL) Workshop: "The automation of future roles"
• UK Economic and Social Research Council and US National Science Foundation (ESRC-NSF) Workshop: "Changing work, changing lives in the new technological world"

Both workshops sought to explore the key future social science research questions arising from ever-greater levels of automation, use of artificial intelligence, and the augmentation of human activity. Both workshops consisted of a set of domain experts brought together to review these issues in a structured setting. The chapter provides an overview of the issues identified by the two workshops. Though both workshops followed a similar format and were supported by the same facilitation team, the approaches taken and outcomes, though highly complementary, were slightly different. Both workshops developed a broad structure within which to explore questions or issues ranging from global or societal issues through to personal or individual foci. The ESRC-DSTL workshop produced less of a focus on individual and behavioral aspects compared to the ESRC-NSF workshop. The ESRC-DSTL workshop developed very specific questions for each level, whereas the ESRC-NSF workshop generated a set of questions potentially addressed at one or more levels. Unlike the other ESRC chapters in this book, there is no extensive literature review underpinning the results. The chapter begins with an overview of the context in which the workshops were developed, followed by an outline of the methods used to facilitate the workshops. Then it considers how the workshops defined their scope—this proved broader than the initial starting points. Next is an overview of the identified research areas integrating the outcomes from both workshops. This is followed by more detailed descriptions of the issues and questions identified in the workshops. The chapter concludes with some reflections on the key issues and overlaps with the rest of the ESRC review, and provides a detailed Appendix of all the issues, themes, and questions arising from the two workshops.

The impact of digital technologies on work and work experience has been extensively discussed, across multiple academic disciplines, for much of the last half-century: from the more utopian visions of an "end to work" (Gorz, 1985), to detailed critical studies of the impact on organizations (Zuboff, 1988), and of course more contemporary accounts and predictions (Anderson & Rainie, 2010; see also chapters 1, 12, and 23). The issue also attracts considerable policy attention in the United Kingdom and the United States, where it is likewise both a major concern for industry (Chui et al., 2015; Manyika, 2017) and a recurring topic of discussion in the media and popular writing (e.g., Ford, 2015). For academic researchers, however, it is an area fraught with challenges, with many of the key unanswered questions seeming to require interdisciplinary and multidisciplinary approaches. With these issues in mind, the aims and objectives of the workshops were to

• Assess the social and behavioral research challenges of understanding the impacts of automation, AI, and augmentation
• Assess major knowledge gaps and discuss how research could help in addressing these gaps
• Identify priority areas for research and potential research collaborations
• Identify and assess prior academic and stakeholder predictions of the impact of new technologies on human tasks, roles, and jobs
• Identify and assess methodologies by which impacts and effects can be assessed, in particular:
◦ Tasks, roles, and jobs
◦ Human knowledge, skills, and attributes
◦ Organizational structures and cultures
◦ Organizational development
◦ Workforce training, recruitment, engagement, and motivation
◦ Decision making in organizations.

These issues were organized into thematic questions as starting points for the workshops:

• What impacts might the automation of the future workforce bring?
• What new forms of work emerge because of digital technologies?
• How do we live with and trust the algorithms and data analysis used to shape key features of our lives?
• How do we construct the digital to be open to all, sustainable and secure?
• How does digital technology affect our autonomy, agency, and privacy?
• Does digital technology help or hinder us in participating at individual and community levels?
• What new forms of community emerge because of digital technologies?
• Will digital technology make us healthier, better educated, and more productive?

Social and Economic Context

Concerns and questions about the impact of automation and augmentation have been with us throughout history, often focusing on how and where technologies were implemented. There is the often-quoted example of why Heron of Alexandria's "invention" of steam power in 50 AD never got used—the arguments being the presence of "cheap" and reliable slave labor or the higher reliability of water power. Over time, automation of previously manual work and the augmentation of human performance with machines has been at the heart of the "industrial revolution," from the Spinning Jenny through to present-day digital technologies. This automation and augmentation underpins the constant increase in human productivity (see Figure 24.1).

Figure 24.1  Productivity graph. [Index, 1953 = 100, for 1950–2015: Labour Productivity, Private Employment, Real GDP, Median Household Income.]

Why then the current concerns over automation and augmentation? A quick review of recent media coverage would point to issues of

• Robotics
• Artificial intelligence
• Data analytics

In each case the concern is for human roles being replaced by computing and technology solutions, from the military (drones), through manufacturing (robots), to office work (AI and algorithms). Notably, the issue of human augmentation with technology is not extensively addressed—unless the use (or claimed excess use) of digital media by children and young people may be considered an example of media panic over an augmenting technology (see, for example, chapters 4 and 9). These discussions reflect and are reflected in popular science and technology books and articles that have often focused on the negative consequences for work and employment (e.g., Ford, 2015; see also chapter 1).

The debate in public media reflects the ongoing debate in academic work around automation, with very different conclusions being drawn over the likely impact of automation and augmentation on work and society. Arguments about the more negative aspects are often balanced with the more positive or "business as usual" models. The common thread in much of this debate is the question of the impacts upon levels of employment that automation and augmentation may engender. Long-term structural unemployment due to technological innovation has been a concern of economists since Ricardo (1821/2009). During the majority of the 19th and 20th centuries, the more positive argument was that in the long term "compensation effects" in the economy would generate new jobs that would match or outweigh short-term "technological unemployment." Compensation effects include the manufacture of new machines, investments in new products, wage changes (either increases, which raise spending, or reductions, which lead to increased reemployment), and lower prices and higher demand. As a result, levels of wage growth, employment, and productivity have grown together over the long term—though short-term technological unemployment and deskilling have happened (see Figure 24.1). More recently, economists have argued that compensation effects are no longer keeping pace with the impacts of high-performance computing; thus, technological unemployment has become a structural feature of the economy (Brynjolfsson & McAfee, 2014; Ford, 2015). This has been popularized as the "hollowed out middle," where higher-skilled "blue collar" and lower-skilled "white collar" work has become automated, such as through robots in the factory and algorithms in the office (Ford, 2015; Madland, 2015). These debates are of course of central importance to economics as a discipline and to long-term social and economic policy. At the same time, this debate does not address the myriad of specific issues and consequences (intended or otherwise) that come from increased automation and augmentation. Identifying and exploring this broader set of issues was the goal of the ESRC-NSF workshop. The workshop did not seek to answer these questions—rather it sought to identify the broad topics where social and behavioral sciences could provide important novel or further insight.

Method and Project Context

The two two-day workshops funded and supported by the ESRC, DSTL, and NSF in 2017 included a mix of academics from the United Kingdom and the United States from a range of social and behavioral sciences (see Table 24.1). The workshops formed the final part of the ESRC "Ways of Being in a Digital Age" review project, the main results of which are presented in the ESRC review chapters (2, 3, 8, 11, 14, 16, 18, and 22). The ESRC-NSF workshop had a stronger focus on workplace and socio-economic issues both by design and as a function of attendees. The ESRC-DSTL workshop had a broader remit and included some computer science and industry colleagues.

Table 24.1  Expertise Represented at the Two Workshops

ESRC-DSTL:
• Artificial intelligence, agents and autonomous systems
• Computer science
• Computer science
• Cyber security
• Digital culture and digital inequalities
• Digital cultures and automation
• Digital humanities, automation and media archaeology
• Employment research
• Employment studies
• Human sciences
• Information systems and information management
• Interaction design methods, human computer interaction
• Law and regulation
• Management and digital organizations
• Management systems and digital economies
• Management, work, technology, and globalization
• Operations and process management
• Personnel development
• Philosophy of mind, philosophy of language and cognitive science
• Robotics
• Robotics and autonomous systems
• Social and occupational psychology
• Sociology and computer science
• Sociology of contemporary work
• Sociology of inequalities
• Sociology of work and organizations

ESRC-NSF:
• Anthropology
• Behavioral science affecting markets for technology products
• Cognitive science
• Community in virtual environments and other emerging social media
• Computer science
• Computer science and human performance
• Consumption and use of new ICTs and the coevolution of technology and culture
• Data science
• Design and creative technology
• Developmental psychology
• Digital cultures
• Digital globalization and feminist labor theory
• Digital humanities
• Digital humanities
• Digital technology and travel
• Economics and technology diffusion
• Hispanic studies and digital cultural memory
• Human computer interaction
• Information studies
• Information systems
• Organization studies and human-centered technology
• Organizational communication, diffusion of innovations, environmental communication
• Psychology
• Psychology, human geography and science and technology studies
• Robotics, neurobiology of the sensory motor system and computational neuroscience
• Situated decision making and technology mediated mobility
• Social change and digital culture
• Social informatics, human factors, and interactive computer systems design
• Sociology of work and organization

Both workshops were facilitated discussions designed to collectively develop a set of consensus positions through documenting interactions and debates. In this sense the workshops functioned as a condensed Delphi consultation (see Linstone & Turoff, 1975; chapter 2). In both cases the workshops involved participants engaging in a range of activities designed to break down potential issues and questions, gain consensus on the key issues, and then restructure these into themes. This also included work on likely timelines and challenges. Much of the work involved applying visual sorting and clustering methods to topics generated through the interactions and debates. The methods used were designed by the facilitators and built on prior work designed to generate ideas for and develop the details of interdisciplinary research projects. Figures 24.2 to 24.4 provide examples of the clustering work undertaken at the workshops (providing two examples of the Post-it Notes prepared and initially organized by the teams, and re-organized through general discussion; and an example of how those were re-analyzed to derive a more condensed and clear structure). The primary outcomes from the workshop are a set of future research areas broadly defined, as well as specific research questions, to which the attendees thought that social and behavioral sciences could or should make a significant contribution. Within these, a large number of specific research questions and topics were identified and are detailed in the Appendix. The "Ways of Being" team built on the clustering and linking work undertaken in the workshop to collate these questions and topics into coherent themes. The themes from both workshops show considerable overlap, and the following sections present overviews of these themes.

Figure 24.2  Clustering of ideas: ESRC-NSF workshop.

Figure 24.3  Political, economic, social, technical, legal, and environmental clustering: ESRC-NSF workshop.

Figure 24.4  Final research topic template: ESRC-DSTL workshop.

Definitions

In both of the workshops the definition of key terms and scope was an important initial discussion point. Though the declared foci of the workshops were on workplace technologies, especially automation and AI, separating automation, AI, augmentation, algorithms, and digital technologies in general proved problematic, as did the distinction between workplace and broader socio-economic issues. As a result, at its broadest the discussion addressed the social and global impacts of digitization, but also addressed narrower questions of human-machine interaction. The discussions especially reflected on

• Automation versus or alongside augmentation?
• How to distinguish between "automation" and "digital"

It was noted that "Automation" and "Augmentation" marked points on a spectrum of technological interventions in human action. Some systems remove all human intervention and are fully "automated"; others "augment" human abilities through the automation or enhancement of aspects of a task or activity. It was strongly argued that separating these in an arbitrary manner was unhelpful. Further, it was noted in the workshops that ideas of "digitization" and "automation" appeared to both overlap and sometimes be synonymous. Importantly, it was recognized that digitization provides the opportunity for considerable automation of tasks or augmentation of human actors, as it opens the data or activity to computational processing. For the purposes of this chapter, the focus is therefore on systems that automate tasks or augment human action—where digitization or artificial intelligence is in many cases a necessary but not sufficient condition for the automation or augmentation.

Proposed Research Areas Both workshops produced priority lists of areas that the participants believed needed to be the focus of further research work. The ESRC-NSF workshop in particular generated 10 broad research areas from the clustering work of the workshop and further analysis of the workshop materials by the research team:

1. Technology development and adoption
2. Trust
3. Complexity and the scale of the topic
4. Evidence and methods
5. Global environments
6. Education, skills, and employment
7. Inequalities
8. Embodiment and cognitive demands
9. Ethics
10. Impactful social science

Reviewing these areas, the team noted that each included, to a greater or lesser extent, the following issues:

• User issues—such as psychological capacity or system interactions
• Citizen issues—such as questions of rights or individual consequences
• Social issues—cultural, political, or economic, including policy
• Global issues—such as environmental impact or the global economy
• Methodological issues—how to undertake relevant social and behavioral research

Not all topics were seen to intersect with all issues, although taken together the two lists produced 29 intersecting topic–issue pairs (see Table 24.2). These topics and issues are very similar to the outcomes of the ESRC-DSTL workshop. The eight main topics identified by the workshop were

1. Social and cultural attitudes to automation
2. Community and social issues
3. System design for being (in)digital
4. Organizations, professions, and work
5. Trust and accountability
6. Meaningful life roles
7. Oversight and governance
8. Research methods

Table 24.2  ESRC-NSF Workshop: Topics by Issues

Topics | User Issues | Citizen Issues | Social Issues | Global Issues | Methods Issues
Technology development and adoption | X | X | X | X |
Trust | X | X | X | |
Complexity and the scale of the topic | | | X | X | X
Evidence and methods | | | | | X
Global environments | | X | X | X |
Education, skills, and employment | | X | X | |
Inequalities | | X | X | X |
Embodiment and cognitive demands | X | X | | |
Ethics | | X | X | | X
Impactful social science | X | X | X | X | X

As with the ESRC-NSF workshop, questions about impacts were grouped into three levels:

• Wider social impacts
• Community and organizational level impacts
• Individual experiences and understandings

All topics were deemed to intersect with these, creating 24 potential starting points for research. Finally, the workshop noted some specific concerns that were relevant to particular topic–level intersections. Table 24.3 provides the topics, levels, intersections, and those specific concerns.
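To make the two cross-tabulations concrete, the illustrative sketch below is not part of the original review; it simply represents each workshop's grid as a plain data structure (the topic and issue names are taken from Tables 24.2 and 24.3) and re-derives the counts quoted above: 29 topic–issue pairs for the ESRC-NSF workshop and 8 × 3 = 24 topic–level starting points for the ESRC-DSTL workshop.

# Illustrative only: re-deriving the intersection counts quoted in the text
# from the marks shown in Tables 24.2 and 24.3.

# ESRC-NSF workshop (Table 24.2): topic -> issues marked with an X.
nsf_topics = {
    "Technology development and adoption": {"User", "Citizen", "Social", "Global"},
    "Trust": {"User", "Citizen", "Social"},
    "Complexity and the scale of the topic": {"Social", "Global", "Methods"},
    "Evidence and methods": {"Methods"},
    "Global environments": {"Citizen", "Social", "Global"},
    "Education, skills, and employment": {"Citizen", "Social"},
    "Inequalities": {"Citizen", "Social", "Global"},
    "Embodiment and cognitive demands": {"User", "Citizen"},
    "Ethics": {"Citizen", "Social", "Methods"},
    "Impactful social science": {"User", "Citizen", "Social", "Global", "Methods"},
}

# ESRC-DSTL workshop (Table 24.3): every topic intersects all three levels of impact.
dstl_topics = 8
dstl_levels = 3

nsf_pairs = sum(len(issues) for issues in nsf_topics.values())
dstl_pairs = dstl_topics * dstl_levels

print(f"ESRC-NSF topic-issue pairs: {nsf_pairs}")              # 29
print(f"ESRC-DSTL topic-level starting points: {dstl_pairs}")  # 24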

Table 24.3  ESRC-DSTL Workshop: Topic Areas by Level of Impact

Topic | Wider social impacts | Community and organizational | Individual experiences and understandings
Social and cultural attitudes to automation | X | X | Impacts on beliefs about and experiences of technologies
Community and social issues | X | X | X
System design for being (in)digital | X | Development and implementation of systems | X
Organizations, professions, and work | Overall socio-economic impacts | Government and organizational policy and strategy | X
Trust and accountability | X | X | X
Meaningful life roles | X | X | X
Oversight and governance | Government and organizational policy and strategy | X | X
Research methods | X | X | X

Looking at the content and discussion underlying these lists from the two workshops, we find very similar issues. We have combined them into 11 core topic areas, which are examined in the next section:

1. Social and cultural attitudes toward AI and automation
2. Technology development, system design, and adoption
3. Trust in automated systems; Oversight and governance
4. Complexity and the scale of the topic
5. Evidence and research methods
6. Global environments
7. Education, skills, and employment; Organizations, professions, and work
8. Inequalities; Community and social issues; Social impacts
9. Embodiment and cognitive demands; System design for being (in)digital
10. Ethics
11. Impactful social science


Identified Research Topics

Social and Cultural Attitudes toward AI and Automation

This issue probably underpins the whole debate. Although it shows up across a variety of issues in all of the following discussions, it was only separately identified in the ESRC-DSTL workshop. Table 24.4 highlights the potential research questions that the workshop identified for each level. These clearly overlap with issues around trust and around technology development. They are research questions that can be addressed by a variety of disciplines, from the humanities (especially historical, media, and cultural studies) to the social sciences, political science, and economics. The topic reflects the larger question of how societies understand, discuss, and evaluate AI and automation technologies.

Table 24.4  ESRC-DSTL Workshop: Social and Cultural Perceptions by Level of Impact

Wider social:
• How do attitudes to automation vary by social class, age, and ethnic background?
• Do differences between national cultures affect attitudes to automation?
• What can we learn about historic debates and controversies about automation?

Community and organizational:
• How do attitudes towards technology and automation shape the development and implementation of technology (acceptance/rejection)?

Individual experiences and understandings:
• Does everyone benefit equally from automation?
• What can we learn from social/cultural anxieties about automation concerning regulation and accessibility of automated systems?

Technology Development, System Design, and Adoption

All the workshop discussions included very high-level, broad questions about the social, personal, and psychological impacts of automation. Much of this discussion focused on issues of technology adoption. Technology adoption (see chapter 13) is a well-researched area (Davis et al., 1989; Venkatesh, 2000; Venkatesh & Davis, 2000; Venkatesh et al., 2003) with many case-study-based analyses of specific technology implementations. Within the analysis, this category functioned as a catch-all for general questions about the impacts of automation. These often reflected broader questions posed in media coverage and popular accounts. Examples included

• How do we prepare for a world of intelligent computational aspects (beyond work issues)?
• What are the risks and benefits of emerging technologies?
• Will we be blindsided by some new technology, such as a brain-computer interface?
• How do you conduct research concerning citizens who do not want to adopt technology?
• People working longer with technology. Will new technology allow people to work more efficiently or have more leisure time? Is there an absence of utopian thinking about the future of work?

The ESRC-DSTL workshop identified "system design" as a key topic area, though the underlying questions clearly overlap with topics from the ESRC-NSF workshop that focused on embodiment, cognitive demands, and ethics. Table 24.5 details the questions identified in the ESRC-DSTL workshop.

Table 24.5  ESRC-DSTL Workshop: Technology Acceptance and Systems Design by Level of Impact

Wider social:
• How can we design ways of being digital that respect alterity and difference?
• What are the effects on "being" in digital spaces?

Community and organizational:
• How do we design "roles" in an automated society that can withstand or resist commodification and the profit motive?
• What ethical considerations should be "built into" systems prior to automation?

Individual experiences and understandings:
• Why should tasks (as opposed to roles) be automated in a socio-technical system?
• What are the benefits of replacing or augmenting or evading automation of tasks?
• What are the theoretical and practical contingencies in the move from operator to operated?

Trust in Automated Systems; Oversight and Governance

This was a key discussion point throughout the workshop. There appeared to be three themes to this cluster in the ESRC-NSF workshop:

• Citizen knowledge and skills
• Understanding and addressing human-machine trust
• Social impacts and consequences of trust in algorithms

The first theme focuses on the knowledge and skills of citizens and their ability to assess (or not) technology; that is, to what extent trust in technology is based on substantial understanding. Do citizens have the digital knowledge and skills to evaluate automated, algorithmic, and augmented systems at work and in everyday life? Do they understand what data they are sharing? How can research help them to understand this? What research is needed to highlight the understandings and assumptions citizens have about automated systems?

• Implicit in these questions was a need to enhance citizens' digital literacy so as to support their ability to evaluate how much to trust automated systems and services.

The second theme focused more on the relationship between citizens and automated systems. In particular, there was a focus on how to manage and develop a trusting relationship with automated systems and to mitigate conflict with them. Questions included:

• How is trust established in and through use?
• How do we research the process by which trust is established in human-machine relationships?
• How do we ensure that trust is based on knowledgeable decisions by citizens?

The third theme concerned the social and political impacts of automation and trust in automated systems. This included issues of challenging these new technologies as well as the consequences of not challenging them. Overall, there was a focus on the social consequences (positive or negative) of the ever-increasing automation of systems and services (whether well designed or not). Key questions were

• How do we assess and evaluate consequences?
• What cultures and norms around the trust of technologies have developed or are developing?

The trust issues raised at the ESRC-DSTL workshop clearly overlap with the topics from the ESRC-NSF workshop, though they were presented differently (see Table 24.6). One of the key concerns at the ESRC-DSTL workshop was that of the governance and oversight of AI and augmentation. At all three levels of impact the following question was asked:

• What are the appropriate oversight and governance models for an automated world?

Table 24.6  ESRC-DSTL Workshop: Trust and Automation by Level of Impact

Wider social:
• What automated systems need to be certified in the future to ensure true accountability for autonomous entities?
• What is the responsibility versus accountability of automated systems?

Community and organizational:
• How will technology transform organizations, their tasks, and decision-making processes in the light of AI and automation?
• Who within the organizations will lead or will do this?

Individual experiences and understandings:
• Trust and accountability and organizations, professions, and work?
• How does trust in automated systems develop?
• Do the human-to-human trust models translate to the human-to-artificial intelligence interactions?

Complexity and the Scale of the Topic

A recurring theme in the discussions was the "complexity and scale of the topic." Many of the points made were not "research questions" but rather "meta" questions or proposals on how to manage and deliver research in a complex and quickly changing technical and social landscape. As already noted, the workshop found it difficult to focus simply on automation or on workplace technology interaction. The implications of greater automation across work domains and non-work contexts could not be easily unpacked. Digital technologies were seen to be pervasive and to have multiple interlinked consequences. This led to concerns and questions about how research topics could be managed, about interdisciplinarity, the speed of change, the slow processes of research funding and execution, the need to obtain and provide data for policy "quickly," and the relevance of academic work to a fast-moving industry. These were grouped into four clusters:

• Inter- and multidisciplinary issues
• Managing scale and rate of change
• Undertaking responsive and timely research
• Policy, or managing the social impacts and asking relevant questions

Interdisciplinarity was a consistent theme in the discussions. This was of course a consequence of a strongly multi-disciplinary meeting and event. The need to utilize multiple disciplines in addressing the complexity of the issues was noted, especially across the arts-social science-engineering spectrum. A number of consequences of this were also noted, especially the need to recognize that topics new to one discipline may have already been examined (though potentially in different ways) by other disciplines. This prior work might have already generated data sets available for re-analysis. This pointed to a need for disciplines to support each other, including the sharing of tools, methods, and knowledge, as well as the re-analysis of data.

As noted earlier, the workshop participants emphasized the scale of the impacts of automation and digital technologies. This awareness raised a set of practical questions about how to manage research in such a dynamic context. A key one was undertaking responsive and timely research. This arises from practical concerns about dealing with a dynamic and changing evidence base, the timeliness of research, the need to develop responses to support policy and action quickly, and the contrast of these timelines with those for academic publishing.

This cluster included a mix of thoughts and comments by participants with regard to both the practical and the more theoretical challenges for research on automation that is

relevant to policy or that itself has broader social implications. In particular, ethical questions were raised about what should or should not be researched or developed. Practical concerns focused on short- to medium-term policy questions about appropriate research and development. More theoretical questions considered the utopian and dystopian implications of research and development around AI and automation.

Evidence and Research Methods

Cross-cutting all discussions were questions about what types of evidence we need in order to understand the social impacts of automation and what methods are needed to gather and assess these data. A recurrent issue in these discussions was the pace of change. The perennial question in the study of the social impacts of technology, "Do we need new methods for studying technologies?", was repeated throughout the discussion in both workshops. This is of course a double-edged question, as new technologies also provide both new methods and new data sets.

Both workshops highlighted the importance of valuing different disciplinary positions. This was especially the case in the ESRC-NSF workshop, where participants were drawn from a broad range of disciplines covering all the social and behavioral sciences, from anthropology and psychology, through economics and sociology, to human–computer interaction (HCI) and information studies (see Table 24.1). The issues of interdisciplinary and multidisciplinary research were therefore prevalent throughout the discussion, with a key theme being the need to respect different disciplines' approaches to the same topic. This includes both quantitative and qualitative work, especially in the context of studying technologies. Both the tension between and the complementarity of these approaches and their underlying disciplines arose because studies supporting the development of these technologies were predominantly quantitative, while the impacts of these technologies were sometimes examined through more qualitative case studies. Thus the ESRC-DSTL workshop proposed a simple general question:

• Which are the most appropriate methods to address questions at the societal, community, organizational, and individual levels?

Both workshops pointed out that this may also reflect disciplinary foci (see Tables 24.2 and 24.3). Psychological methods, for example experimental analysis of person–machine interaction, may be more appropriate at the individual or organizational level. Potentially, however, psychological tools to measure attitudes may be used to address wider social concerns. Sociological survey and ethnographic work might be more likely to be used for social and community studies. Economics is appropriate for modelling wider impacts in such areas as jobs and productivity. Cross-disciplinary work was also thought to be key to a full understanding of the issues; for example, psychological studies of purchasing behavior online, or ethnographic studies of workers in highly automated workplaces (or even of workplaces undergoing automation).

The need to support methods that deliver for policy or for commercial research and development was also noted by participants in both workshops. Given the important practical and policy implications of potential research in this area, the participants pointed out that research needs to reach out beyond academia. These questions are not just about how knowledge can be transferred, but about how academic social science can and should play an integrated role in shaping the consequences of new research and development in the areas of AI and automation.

Concerns were raised both about methods that use AI or big data and about the use of such analytic methods in society without proper thought and review. Many cases of bias, discrimination, and misinformation resulting from the poor use of AI and data analytics technology across society were highlighted (see chapters 7, 18, 19, and 23). The uses and abuses of personal and social media data were particularly noted. This pointed to a broader social and ethical concern for digital research methods: the need to be highly vigilant for bias in data sets, or for bias developed from the use of specific data sets (see chapter 20). It was noted that society can no longer follow the cultural assumption that technology is neutral; rather, all digital systems, including AI, are a product of human action and social processes. Therefore, as researchers we need to be highly aware of the potential bias in data gained from, around the uses of, or provided about AI and automated systems. We also need to help educate the public about such potential bias.
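As a concrete illustration of the kind of vigilance called for here, the short sketch below is not drawn from the original review; the records and the 0.8 threshold are purely illustrative assumptions. It shows one simple check a researcher might run on an automated decision system's outputs: comparing positive-outcome rates across demographic groups and flagging large disparities for further investigation.

# Illustrative sketch: a simple group-level disparity check on the outputs of an
# automated decision system. The records and the 0.8 ratio threshold are
# hypothetical; a real audit would use the system's actual decision logs and a
# threshold justified by the relevant legal or ethical framework.
from collections import defaultdict

# Each record: (demographic group, whether the automated system gave a positive outcome)
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

def positive_rates(records):
    """Return the share of positive outcomes per group."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, outcome in records:
        totals[group] += 1
        positives[group] += int(outcome)
    return {group: positives[group] / totals[group] for group in totals}

rates = positive_rates(decisions)
lowest, highest = min(rates.values()), max(rates.values())
ratio = lowest / highest if highest else 1.0

print("Positive-outcome rate by group:", rates)
print(f"Lowest-to-highest rate ratio: {ratio:.2f}")
if ratio < 0.8:  # illustrative threshold only
    print("Warning: large disparity between groups; investigate the data and model.")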

Global Environments

Workshop participants did not feel that the impact of automation was confined either to human-machine interaction or to individuals, groups, and organizations. The ESRC-NSF workshop in particular repeatedly highlighted the impact of automation across a range of global issues, including

• Environment
• Policy
• Culture
• Economics

In terms of the environment, the focus was on the long-term impacts, whether beneficial or detrimental, that the generation of ever more automated systems may have on the global environment, whether these systems are intended for use by citizens or as tools for cheaper, faster mass production (see chapters 7, 15, 16, and 17). The team considered the extent to which some of the research questions around automation may themselves be cultural, both in how they are framed and in which topics are selected. Importantly, might the focus of current research be driven by Western, Global North, US-UK-EU concerns? At the same time, technologies might alter aspects of global culture around intercultural communication or the nature of relationships (see chapters 8 and 10). The likely global economic impact of automation was stressed at a number of points in the discussion. Automation technology as a sector is

itself already global and may not remain centered mainly on the United States, the United Kingdom, and the European Union. As economies around the globe are affected by productivity changes due to automation, we are also likely to see changes in the flows of capital, people, goods, and services. Social research needs to document and understand these changes so as to support policymakers, organizations, and citizens. Following on from the concerns noted earlier were questions about how social science research can support and evaluate global policy to address these environmental, cultural, and economic concerns.

Education, Skills, and Employment/Organizations, Professions, and Work

The ESRC-NSF workshop highlighted six areas where social research could contribute to debates over education, employment, and skills:

• Education
• Training and skills
• Careers and employment
• Nature of digital work
• Workers' rights, rewards, and trust
• Disruption and change

In relation to education, there are questions about its role in an increasingly automated society. What it will mean to be a student, and how we understand learning in a digital environment, may need to change. Following on from broader education questions are concerns around the skills citizens will need. Two questions were posed:

• How can social research support and help us understand what skills are needed and how?
• What are the social and economic challenges of ensuring skilled workers in an ever-changing work setting?

A key role for the social study of work and employment at all scales—global, national, organizational, and individual—has to be that of understanding new work and career pathways (see chapters 10, 11, 12, and 13). The workshop proposed this key question:

• How will work identities, relations between workers and employers, and career progression function in ever more automated workplaces?

A key area for social science to study, and one where there has already been research, is the changing nature of "work" as an activity. This includes

• The places and spaces where we work
• The effects of the separation and experience of work "time"
• Changes to productivity and types of employment (including work "beyond the organization")
• How we separate issues of work and non-work in a digital context

These concerns lead to questions about workers' rights, especially in regard to trust in both the technologies and the organizations involved. It was argued that this relationship appears to be potentially threatened by AI and automation. This may be heightened where technologies are involved in the monitoring and assessment of performance. Of course, the most difficult task is that of predicting where and how AI and automation will make significant disruptive impacts. Social research can in part help us to understand how disruptions have affected and may affect markets and employment (see chapters 12 and 13). Research may also help examine how organizations can react to and manage disruptive technological change. The ESRC-DSTL workshop covered very similar ground (see Table 24.7), focusing mainly on wider social and on organizational issues.

Table 24.7  ESRC-DSTL Workshop: Work and Organizational Topics by Level of Impact

Wider social:
• How will technology transform organizations, their tasks, and decision-making processes, including their implications?
• How can we include the non-human in our social theorizing?

Community and organizational:
• How does technological change impact existing jobs, leading to the emergence of new ones and the disappearance of others?
• How can one manage (semi-)automated teams?
• How do these technological changes affect the boundaries between professions?

Inequalities/Community and Social Issues/Social Impacts

The pressing social research question posed multiple times in both workshops was

• Who are the winners and the losers in an increasingly automated world?

In unpacking this question, the ESRC-NSF workshop highlighted further issues, such as the unequal distribution of costs, risks, and benefits. Other areas of concern included growing disparities in the uses of technology, in access to digital opportunities, and in the distribution of wealth. The questions were grouped into three categories:

• Growth in inequalities
• Using digital technologies to address inequalities
• How to support all citizens post-automation

Overall, there appeared to be consensus that a major consequence of further automation may be further growth in social inequalities in access to work, wealth, education, and well-being. These socio-economic processes and outcomes therefore need to be extensively studied. At the same time, a role was seen for social science in providing data, analyses, and insight that might help ensure that digital technologies alleviate or address issues of inequality. A final concern was how societies might support a growing population of under-employed citizens, or enable more productive citizens to support the wider population. These concerns are reflected in the outcomes of the ESRC-DSTL workshop (see Table 24.8).

Table 24.8  ESRC-DSTL Workshop: Areas of Inequality by Level of Impact

Wider social:
• How will automation affect inequality? Is automation going to make inequalities worse?
• Which communities are going to be most affected?
• Addressing impacts on places?
• Are there intersectional impacts (gender, age, class, ethnicity, etc.)?
• Are there differential impacts on domestic and work roles?

Community and organizational:
• How will automation affect community and identity?
• Understanding in context of social challenges/issues?
• Might automation free up people to focus on social actions?
• What parts of community/social eco-systems are damaged by automation?

Individual experiences and understandings:
• How is life changing in an automated society?
• How will automation differentially affect identity?

Embodiment and Cognitive Demands/System Design for Being (in) Digital

Much of the preceding discussion has focused on inequalities and automation. A parallel issue is that of augmentation—the adaptive use of technologies by human beings (as we noted earlier, there is a gradient from augmentation through to full automation). The focus here for the social and behavioral sciences is on the physical and cognitive capacities of people, their ability to utilize augmentations, and the impacts on them. This discussion had two clear themes:

• The impact and assessment of cognitive demands from high levels of digital technology use
• The understanding of human behavior to enhance the capabilities and usability of digital technologies

There is a clear role for the social and behavioral sciences in helping to understand the impact of digital technology use on (and by) human cognition, both beneficial and detrimental. Understanding these issues could help address the everyday and specific demands that citizens face in using such technologies, especially in the workplace but also in everyday life. While research on our use of technologies may provide insight into human behavior and capabilities, understanding human behavior and cognitive abilities may, at the same time, also help us to better develop relevant technologies.

Ethics

Ethical issues appeared throughout the discussions in both workshops. We classified these into two groups:

• Ethics and social consequences of automation
• How do we research and develop an ethics for automation?

These questions extensively overlap with issues of inequalities, rights, and values noted earlier. However, they were framed in the workshops in terms of the ethical implications of these factors. A deeper set of questions in both workshops addressed how we develop relevant ethical frameworks through research, debate, and social engagement.

Impactful Social Science

As a matter of course, the participants were concerned about how to understand and improve the social, economic, political, community, and citizen impacts associated with automation. This includes questions about how best to engage policymakers and organizations. The ESRC-DSTL workshop formulated this as a set of research questions, detailed in Table 24.9. Again, different questions arose at the three levels of impact: wider social, community and organizational, and individual experiences and understandings.

Table 24.9  ESRC-DSTL Workshop: Research Impact Questions by Level of Impact

Wider social:
• How are different groups of people affected by automation, e.g., those whose jobs are replaced versus those interacting with the automated system?
• How do we ensure that future AI is not unknowingly biased in its analysis—ensure systems behave in an unbiased manner?

Community and organizational:
• To what extent, and what elements of work, do we want to be automated?
• How will the value of human labor change in an automated economy?
• How do we predict that people may "game" the system?
• What could possibly go wrong? How do we anticipate potential crises?
• What is the relationship between quantity of data and quality of decision making? And understanding?

Individual experiences and understandings:
• What does it mean to have a meaningful/fulfilling life in an automated world?
• To what extent is it desirable or acceptable to have automated systems make "objective" or "value-free" decisions about daily lives?
• How do we verify a system that is self-learning?
• What can we learn from those who resist the imposition, deployment, or use of automated systems?

Conclusion

Given the nature of the work reported here, there is no single clear conclusion to draw. Both workshops highlighted the need for both breadth and depth of research on the impacts of automation, AI, and augmentation. In both cases the experts could not fully separate these issues from the broader question of the social impacts of digital media and technologies, especially as in nearly all cases these involve some form of automation or augmentation of human practices. Though these issues are clearly in the public eye, especially given the media focus on AI and automation, the workshop participants argued strongly that many research gaps remain to be addressed.

Appendix 1  Detail of the ESRC-DSTL Research Clusters and Questions

1.1  Social and Cultural Attitudes to Automation

1.1.1  Question Set 1: Social Benefits and Attitudes

• Does everyone benefit equally from automation?
• How do attitudes to automation vary by social class, age, and ethnic background?
• Do differences between national cultures affect attitudes to automation?
• What can we learn about historic debates and controversies about automation?
• What can we learn from social and cultural anxieties about automation concerning regulation and accessibility of automated systems?

1.1.1.1  what evidence will this generate? what could this be used for?

• Policy regarding education and skills training
• Design insight into automated systems that need to oppose e.g. across national cultures
• Inform public debates about accountability of automation
• Regulation and investment decisions regarding automation

1.1.1.2  which disciplines need to be involved?

• Computer science
• Design
• Economics
• HCI
• Law
• Media and communications
• Social history
• Sociology

1.1.2  Question Set 2: Technology Implementation Attitudes

• How do attitudes towards technology and automation shape the development and implementation of technology (acceptance versus rejection)?

1.1.2.1  what evidence will this generate? what could it be used for?

• Inform business about how technology is made and how it can be more inclusive

1.1.2.2  which disciplines need to be involved?

• Business
• Design
• Economics
• Information systems/computer science
• Media and communications
• Sociology

1.2  Community and Social Issues

1.2.1  Question Set 1: Macro-Level Issues (Society)

• Is automation going to make inequalities worse?
• Which communities are going to be most affected?
• Addressing impacts on places?
• Are there gender, age, and other impacts?
• Also: domestic versus/and work roles

1.2.1.1  what evidence will this generate? what could this be used for?

• Social and economic impacts
• Data for policy planning

1.2.1.2  what disciplines need to be involved?

• Economics
• IMF/CS
• Social policy
• Sociology
• Urban regeneration

1.2.2  Question Set 2: Meso-Level Issues (Community)

• Understanding in context of social challenges/issues?
• Might automation free up people to focus on social actions?
• What parts of community and social eco-systems are damaged by automation?

1.2.2.1  what evidence will this generate? what could this be used for?

• Social policy
• Resilient communities
• Ideas for social action

1.2.3  Question Set 3: Micro-Level Issues (Individuals and Workplaces)

• Understanding roles, employee perceptions of roles, and what can/should be automated?
• Understanding workers' perceptions of automation impacts?
• Are there variations in perceptions by occupation?
• What remains of value to the human/of the human?

1.2.3.1  what evidence will this generate? what could it be used for?

• Help design
• Policy
• Consequences
• P/R
• Job design
• Workplace conduct

1.2.3.2  what disciplines need to be involved?

• Information studies/CS
• Management
• Occupational psychology
• Sociology
• Studies

1.3  System Design for Being (in)Digital

1.3.1  Question Set 1

• Why should tasks (as opposed to roles) be automated in a socio-technical system?
• What are the benefits of replacing or augmenting or evading automation of tasks?
• How do we design "roles" in an automated society that can withstand or resist commodification and the profit motive?
• How can we design ways of being digital that respect alterity and difference?
• What ethical considerations should be "built into" systems prior to automation?
• What are the effects on "being" in digital spaces?
• What are the theoretical and practical contingencies in the move from operator to operated?

1.3.1.1  what evidence will this generate? what could this be used for?

• Design rules and approaches, and the consequences of such design with socio-technical systems
• Inform industry practice by highlighting visible as well as invisible automations of roles
• Insight into the effects of automation in/between tasks and roles

1.3.1.2  which disciplines need to be involved?

• Artificial intelligence
• Data science
• Designers
• Human geography
• Information systems
• Legal
• Management and business
• Marketing and consumer science
• Philosophy of technology
• Psychology

1.4  Organizations, Professions, and Work

1.4.1  Question Set 1

• How will technology transform organizations, their tasks, and decision-making processes?
• How can one manage (semi-)automated teams?
• How does technological change impact existing jobs, leading to the emergence of new ones and the disappearance of others?
• How do these technological changes affect the boundaries between professions?
• How can we include the non-human in our social theorizing?

1.4.1.1  what evidence will this generate? what could this be used for?

• Inform workforce policy and planning
• Update professional jurisdiction
• Align education with (new) job market demands
• Better understand organizational and team boundaries and their processes
• Update our theoretical "toolkits"

1.4.1.2  which disciplines need to be involved?

• Anthropology
• Data/computer science
• Economics
• Industrial relations
• Information systems
• Organization science
• Organizational psychology
• Philosophy
• Sociology

1.5  Trust and Accountability

1.5.1  Question Set 1

• How does trust in automated systems develop?
• Do the human-to-human trust models translate to the human-to-artificial intelligence interactions?
• What automated systems need to be certified in the future to ensure true accountability for autonomous entities?
• What is the responsibility versus accountability of automated systems?

1.5.1.1  what evidence will this generate? what could this be used for?

• Allow us to engineer systems where trust develops appropriately; enable us to develop legal and governance processes and procedures for future automated systems

1.5.1.2  which disciplines need to be involved?

• Cluster
• Computer scientists
• Educationalists
• Law
• Manufacturers (including software engineers)
• Policymakers
• Psychology
• Regulators
• Safety engineering
• Science providers
• Sociology

1.6  What Is Human?—What Is the Role of Humans in a Future Society?

1.6.1  Question Set 1

• How are different groups of people affected by automation? For example, those whose jobs are replaced versus those interacting with the automated system
• What does it mean to have a meaningful, fulfilling life in an automated world?
• To what extent and what elements of work do we want to be automated?
• How will the value of human labor change in an automated economy?
• To what extent is it desirable and/or acceptable to have automated systems make "objective," "value-free" decisions about daily lives?

1.6.1.1  what evidence will this generate? what could this be used for?

• Could be used to inform policy and governance around automation
• Facilitate public conversation around automation
• Improve efficiency and productivity through workforce optimism

1.6.1.2  which disciplines need to be involved?

• Anthropology
• Economics
• Management and business studies
• Medicine
• Philosophy
• Political science
• Psychology
• Sociology

1.7  Technological Limitations

1.7.1  Question Set 1

• How do we predict that people may "game" the system?
• How do we ensure that future AI is not unknowingly biased in its analysis—ensure systems behave in an unbiased manner?
• How do we verify a system that is self-learning?
• What could possibly go wrong? How do we anticipate potential crises?
• What is the relationship between quantity of data and quality of decision making? And understanding?

1.7.1.1  what evidence will this provide? what could this be used for?

• Data that can help us understand how fair/good outcomes are; feedback loops, checks, and balancing
• Inform engineers on how systems can improve

1.7.1.2  which disciplines need to be involved?

• Computer/data scientists
• Educationalists
• Law
• Manufacturers
• Policymakers
• Psychology
• Regulators
• Safety engineering
• Service providers
• Sociology

1.8  "Refuse-nicks"

1.8.1  Question Set 1

• What can we learn from those who resist the imposition, deployment, or use of automated systems?

1.8.1.1  what evidence will this generate?

• Everything

1.8.1.2  which disciplines need to be involved?

• Everyone

2  Detail of the ESRC-NSF Research Clusters and Questions

2.1  Trust

This was a key discussion point throughout the workshop. There appeared to be three themes to this cluster:

1. Citizen knowledge and skills
2. Understanding and addressing human-machine trust
3. Social impacts and consequences of trust in algorithms

2.1.1  Citizen Knowledge and Skills

• Implicit in these questions was a requirement to enhance citizens' digital literacy so as to support their ability to evaluate how much to trust automated systems and services.

original questions

• How can we get citizens to understand the data they give to companies and the consequences of handling the data?
• How can people be trained to interpret algorithms?
• Intelligibility: how do people navigate the world? Understand and use and correct the systems around them?
• How can people understand what underlies technological decision making?
• How do we expose the fact that tech is not value-neutral?
• How do we expose tech presuppositions?
• What is the level of the public's understanding of new technologies?
• Digital empowerment?
• What shapes people's misconceptions about personal data?

2.1.2  Understanding and Addressing Human–Machine Trust

original questions

• How do we resolve human–robot conflict?
• Should we trust technology and implementation?
• How do we avoid blind trust?
• How do we establish trust? Especially in algorithms.
• How do we understand the trust people have in machine learning?

2.1.3  Social Impacts and Consequences of Trust in Algorithms

original questions

• Trust in outcomes?
• When do we challenge the digital?
• How do people resist change? Why and when, versus accepting?
• Is there a connection between automation, migration, and regressive politics?
• Have we been misled by social media?
• Trust in the use of personal data?
• Relationship between trust and risk: culture, social norms.

2.2  Complexity and the Scale of the Topic

We have broken the issues down into four clusters:

• (Inter/Multi) Disciplinary issues
• Managing scale and rate of change
• Undertaking responsive and timely research
• Policy or managing the social impacts and asking relevant questions

2.2.1  (Inter/Multi) Disciplinary Issues

original questions

• Disciplinary issues: HCI physics, what mechanisms have been developed?
• How do we cope with the complex interlocking and networking effects of AI? Breaking silos

2.2.2  Collating Data, Cases, and Methods

original questions

• Develop resources to support knowledge sharing and tool management
• Possibility of re-analysis of data
• New methods, practices to share

2.2.3  Managing Scale and Rate of Change

original questions

• How do we navigate the hugeness of the topic?
• What would happen if we only looked at home enhancement?
• How do we keep pace with tech that moves faster than research perspectives?
• How can our research remain relevant to industry?
• How do we cope with such a fast-moving environment?

2.2.4  Undertaking Responsive and Timely Research

original issues

• Recognizing emergent issues during projects (by scoping)
• Less rigorous evidence base - expectations to change
• Half-answered for immediate issue with longer term development
• Quasi-results
• Academic outputs often take a long time

2.2.5  Policy or Managing the Social Impacts and Asking Relevant Questions

original questions or comments

• Unintended messages from research?
• Policy implications—a given technology not to be used for what purpose?
• Patent issues and restrictions?
• What shouldn't be done?
• Human centered is profit centered
• Creating collaborative workspaces for humans with intelligent machines for utopia experiments
• How is utopia being framed?
• Extension of machine and human at same time augmented both ways to enhance them together?
• "Designing for humanity" appeals to engineers
• Are automated vehicles and ride sharing creating dystopian/utopian issues?
• How can we use our developing knowledge to solve the big global problems? AI + CO?

2.3  Evidence and Methods

2.3.1  Methods

The perennial question in the study of the social impacts of technology is

• Do we need new methods for studying technologies?

Original questions or comments

• Research methods: Are current methods appropriate for understanding data or AI?
• Social science: Building new technology to undertake research
• How do we develop new approaches to evaluate the impacts of new technology?
• How do we encourage robust evidence around what is happening and what is going to happen?
• How are we able to be creative with technology?

2.3.1.1  Valuing different disciplinary perspectives

original questions or comments

• How can ethnography win respect in an industry dominated by quantitative approaches?
• How is accuracy proved?
• What can be learnt from historical discourses of techno-futures?
• How do we deal with conflicts in terms and definitions between disciplines?
• How do we respect the gaps between disciplines?
• How is this respect linked to gender?
• What can be learnt from contemporary culture about automation?

2.3.2  Supporting Research Policy and Commercial R&D

original questions or comments

• Could we build tools to reason scenarios?
• How can research best guide tech developments and their consequences?
• Who gets to decide research questions—stakeholder, policy members, etc.?
• How can digital technology enable greater user involvement?
• How do we reason possible futures?
• Understanding the role of academia within wider stakeholders; academics as consultants?
• How to create a pro-social agenda?
• Is technical research expected to cure ailments in multiple ways? (it's not happening)

2.3.3  Data and Bias

original questions or comments

• Data etc. inherently with bias—how do we make progress and avoid reinforcing these biases?
• How might we ensure we avoid polarization and spend research time on more strategies to collect representative data?
• Language—technologies both connect people but also contribute to miscommunication
• How do we overcome obstacles related to language technology?
• How do we use existing data when it is programmed by humans? For example, biases in word searches, languages on the Internet, etc.

2.4  Global Environments

The impact of automation was not felt to be confined either to human-machine interaction or to individuals, groups, or organizations. Rather, the meeting repeatedly highlighted the impact of automation across a range of global issues, including:

• Environment
• Culture
• Economics
• Policy

2.4.1  Environment

original questions or comments

• How do we build technologies with social sustainability?
• What is the impact of future work on mobility and transport services?
• Climatic disruption
• How are new ways of living impacting the resources we use?
• Study existing alternatives
• Create resilient products rather than just creative ones

2.4.2  Culture

original questions or comments

• The filter bubble in the physical world
• Reach out of our US-UK-EU bubble?
• Machine translation
• Physical relations

2.4.3  Economics

original questions or comments

• Scenario planning cost/benefit
• Global shifts in labor and automation in the global south
• Transport/logistics in uncertain times
• Envisioning business models for sustaining life/society under extreme conditions
• Embodiment of filter economics
• How do platforms impact economic decisions and access and embodiment?
• How to ensure value is sustained when platforms/systems/countries cease or get shut down?
• How might we understand trans-national north-south implications for labor?

2.4.4  Policy

original questions or comments

• Study scale-bridging (global systems and localized problems or manifestations of global problems)
• We need new categories for planning for (planetary, multi-systemic) instability, uncertainty, crisis?
• How do we build resilient and agile and adaptive infrastructure and cultures of change for an unthinkable future?
• How do we build Intelligence for crisis, not for empire?
• Differential impacts of AI in one part of the world or other parts
• Understanding technologies in the context of multi-systemic instability
• Surveillance access issues: Control (lack of transparency)
• New work/life arrangements to even out regional inequality and for sustainability
• How do platforms interact or intersect with policy or national projects (of surveillance, social protection, etc.)?
• Global political environment
• How are filters playing out in shaping decisions? Choices? Access?

2.5  Changing Education, Skills, and Employment

Although the discussion was set to focus on new employment pathways, it quickly highlighted six areas where social research could contribute:

• Education
• Training and skills
• Careers and employment
• Nature of digital work
• Workers' rights, rewards, and trust
• Disruption and change

2.5.1  Education

original questions or comments

• What will be the role of education and training in the responses to new digital technologies?
• How do we create students? (to value and recognize others)
• We used to understand how learning happens in human-machine interaction
• How to transform education to address social tensions created by automation?

2.5.2  Training and Skills

original questions or comments

• How do we ensure that there are sufficient resources for people to learn new skills?
• How do we help people do research with the focus on how we train people to learn (help) new technologies?
• How do we educate and train future workforce, learners, and students?
• How do we make sure people are not in knowledge and skills silos?
• How do we help people understand their skill level; and how do we help them learn new skills?
• How do we study and cope with skill degradation related to increasing use of AI, automation, and robotics?

2.5.3  Careers and Employment

original questions or comments

• Employability—adaptability, resilience
• Career paths and work
• Work identities
• Matching employers and new recruits. How can we use technology for this matching?
• Which people are enabled to pursue their career paths in the new world of work?
• What will the future of new technological "occupational résumés" be? Records of what we've done in work?
• The digital career coach: How can we enable people to understand their own skills and contributions?
• What is the role of organizations in helping workers adapt to new technologies?

2.5.4  Nature of Digital Work

original questions or comments

• What are effects on productivity?
• How to create new jobs for regular people with tech?
• What is the role of jobs that remain nondigital?
• What are meanings and boundaries of work?
• How are spaces and places where people work changing?
• What is the role of digital technology in experience of time in work?
• How is work becoming less contained in organization?
• What are people's values and views on the meaning of work? Can digital technology augment this?
• What is the role of managers in new work structures? (e.g., flexible work)
• What about "not-work" leisure, but also volunteer work and different kinds of unpaid work, especially in "flexible" times?
• How do we retain dignity, not thinking only in terms of production but also the meaning of labor?

2.5.5  Workers' Rights, Rewards, and Trust

original questions or comments

• Work and reward systems and how they're evolving; how is the need to be rewarded more over time clouded by financial considerations?
• What are the challenges to worker motivations of new digital technologies?
• How might we keep focus on how labor, equality, justice, and technology interact?
• Are new technologies of AI undermining or improving historical values and trust within different sectors and professions?
• How can self-monitoring help people, e.g., in the gig economy?

2.5.6  Disruption and Change

original questions or comments

• How might we tie into the (fast-growing) service industry to make those jobs?
• How might we help people work in a rapid, dynamic gig economy, and work with or around the systems that will mediate that? For example, overcome or work around discrimination, algorithmic bias, and unfair power structures embedded in or mediated by technology.
• How do organizations learn about how new technologies are different and how they are implemented? (unintended consequences)
• Emerging technologies and changing work practices: How are emerging technologies introduced, adapted, or rejected in professions and sectors, e.g., legal, health care and medicine, service sector, business, agriculture?
• How can we describe the innovation in these domains? What explains the factors that underpin the dynamics of innovation?
• How are emerging technologies introduced, adopted, or rejected into work practice?

2.6  Inequalities

The pressing social research question, posed multiple times in the workshop, was

• Who are the winners and the losers in an increasingly automated world?

The questions appeared to split into three categories:

• Growth in inequalities
• Using digital technologies to address inequalities
• How to support all citizens post-automation

2.6.1  Growth in Inequalities

original questions or comments

• How are all benefits, costs, and risks of digital not evenly distributed?
• How does inequality of resources play out as inequalities of meaningful work?
• Are there growing disparities in the use of digital technology?
• How is distribution of wealth affected by growth of digital-technology investments? Is only little investment needed to disrupt or capture markets?
• How does individualization of work (of ratings) of a gig economy (e.g., Uber) affect subsequent work? How to avoid exclusion that could result from this?
• How might we consider race, class, gender, and sexuality in access to, use of, and outcomes of technology?
• Will technology reinforce inequality and polarization?

2.6.2  Using Digital Technologies to Address Inequalities

original questions or comments

• How can we create a social and technical world to provide opportunity, income, and personal fulfilment?
• How can technologies enable fewer people to work? Help us work less? Help more people work more and longer?
• How do we help people make a living?
• Disabled communities have the most to win from inclusive design. How can we get for-profit companies to design for them and not just for the masses?

2.6.3  How to Support All Citizens Post-Automation

original questions or comments

• How do we get people to work longer versus raise productivity of others to support the whole?
• How paid work of 50% supports 100% of population. How numbers change.

2.7  Embodiment and Cognitive Demands

This discussion had two clear themes:

• The impact and assessment of cognitive demands from high levels of digital technology use
• The understanding of human behavior to enhance the capabilities and usability of digital technologies

2.7.1  Cognitive Demands

original questions or comments

• Should we demand unlimited cognitive resources? And how we might study the effects on social resources too?
• Competition from many sources of information gets your attention. How does that affect demands for cognitive resources and organization of information?
• How might we design and manage this overwhelming demand on cognitive resources?
• How might we understand shifting demands from a variety of devices?
• What impact does high screen use have on the developing brain?

2.7.2  Capabilities

original questions or comments

• What neuroscience research do we need to design/mimic the human brain? Will this matter?
• How will technology (and at what point) remove the need to be co-present?
• What does technology tell us about ourselves, and how do we use it?
• Computers recognizing and manipulating emotions.
• How can we design and control machines that read or respond to human emotions to enhance our strengths and minimize our weaknesses?

2.8  Ethics and Research Challenges

Ethical issues appeared throughout the discussions, and we have classified these into two groups:

• Ethics and social consequences of automation
• How we research and develop an ethics for automation

2.8.1  Ethics and Social Consequences of Automation

original questions or comments

• How to have a value set that is not an economic driver of what jobs to automate?
• Do we assume technologies are all advancing the same way everywhere?
• Role of politicians and ethics
• Concentration of wealth: Is it inevitable?
• What's going to happen to the workers (including "think-up" entities, an ASPCA for intelligent machines)?
• AI and use of algorithms in decision making

2.8.2  How Do We Research and Develop an Ethics for Automation?

original questions or comments

• What are the ethics of technology?
• What kind of ethical technology can we build?
• Who is accountable when AI-aided products and services go wrong?
• Use of big data in understanding issues of individuals
• Do we need new philosophical principles for privacy?
• Do we need new philosophical principles for new technology?

2.9  Impactful Social Science original questions or comments

• How we get social science focused from Google? • How can social science be embedded in technology? • How does policy address the inequality? • How do social sciences get a seat at the policy table? • Should technology be used for decisions that are impactful? • How do we social scientists get back our disciplines from Google? • How do we open the category of de-industrialization and social dislocation? But in a socio-tech context? • Creating technologies with universal accessibility built in from the beginning, not as an old one. Potential for accessibility is barriers?

2.10  Technology Development and Adoption
original questions or comments
• Understand better what algorithms are really doing (and how they work): Big data/Big nets
• How do we prepare for a world of intelligent computational aspects (beyond work issues)?
• How can we make systems more intelligent by better adding experts in these domains?
• What are the risks and benefits of emerging technologies?
• Will we be blindsided by some new technology? Human brain technology. Brain–computer interfaces.


• Are there new technologies that will completely change core thinking?
• How do you research citizens who don't want to adopt technology?
• How technology influences sociality.
• What is the process through which innovations are implemented?
• It is important to understand the diffusion of digital technologies.
• What drives the diffusion of technologies? What makes them available?
• Digital and biotechnology: Are we undergoing normalization through ubiquity? How does this affect diversity?
• How can technology incorporate and facilitate diverse perspectives?
• People working longer with technology: Will new technology allow people to work more efficiently or have more leisure time? Is there an absence of utopian thinking about work?


Chapter 25

Conclusion: Cross-Cutting, Unique, and General Themes in the Oxford Handbook of Digital Technology and Society

Ronald E. Rice, Simeon J. Yates, and Jordana Blejmar

Introduction

The range of issues raised by the study of digital technology and society is both broad and deep: moving from research and theory, to design and constant technological developments, through policy and economics, to management, use, adaptation, and effects. Further, these issues are inherently interrelated, as all play a part in shaping, and being shaped by, information and communication technologies. So it should be no surprise that many topics and themes appear in multiple chapters in this Handbook. As a conclusion and review of the themes in the Handbook, we summarize both the common and unique, and the general and specific, topics and challenges presented. We look first to the ESRC Domain chapters, and then to the non-ESRC chapters. Drawing on the chapters and the overarching analysis from the ESRC project, we make some suggestions for areas of medium-term future digital media and society research. Throughout the chapter we will talk about the "social impact of digital"—however, this in no way implies that we are taking a technological determinist stance. As the chapters in the volume clearly demonstrate, the key issues are understanding and unpacking, where possible, the complex web of interactions between digital media systems, the processes of their design and implementation, the different forms of appropriation by users, and the multiple consequences of their use—intended and unintended. It is this complexity we seek to encompass via the shorthand of "social impacts of digital." We are very aware of the challenge of not falling into the trap of what Grint and Woolgar (1997)

called "technism," whereby technological determinism implicitly creeps into social analysis; on the other hand, Grint and Woolgar's Actor-Network Theory approach (see also Latour, 2005) is, of course, not used by the majority of the writers in this volume. We are also aware that in the analysis of actual use and practice, the limiting and determining features of technologies (what are often called "affordances") are key to the analysis—whatever their socio-technical antecedents and the social processes behind their original design.

Cross-Cutting Topics and Challenges in the ESRC Review Chapters

As we have noted before (especially in chapter 1), the interconnections among topics are a key feature of studying digital society, as the very nature of digital systems is to interconnect content, people, programs, and entities, within and across contexts.

Co-occurring Terms and Cross-cutting Topics

To get a sense of the changing prevalence of major concept interconnections, Figure 25.1 shows the most frequent concept pairs from 2000–2004, and Figure 25.2 shows those for 2012–2016.1 Early on, the most frequent concept pairs emphasized the new communication technologies, their users, and their uses. Table 25.1 summarizes those most frequent concept pairs from Figure 25.1 comprising these three foci. By 2012–2016, the most frequent concept pairs had shifted emphasis somewhat. First, technology as a communication medium is the most frequent pair, but communication as a use does not appear much thereafter (compared to the prior period). While combinations among technology, users, and uses still dominate, we now see an increased focus on data, access, and privacy issues (including data collection for research), and on interventions and research (studies or projects applying technology to effect changes, typically in health contexts, analysis, or student or group participation). Table 25.2 summarizes those most frequent concept pairs from Figure 25.2 comprising these five foci. However, looking at the main topics identified in each domain provides a deeper analysis of shared concerns across the domains. The cross-cutting topics can be dealt with in two ways: either as research areas to be addressed, or as key methods and challenges for future research projects within the domains examined in this book. Several topics and challenges appear across the ESRC Review chapters. Table 25.3 details the most common topics and Table 25.4 the most common challenges. To create these lists, the topics and challenges were recoded into a standard format for all domains. Those topics that cross more than three domains are in bold. The highest ranked cross-cutting challenges are common to all the domains.

Figure 25.1  All seven domains 2000–2004: Most frequent concept pairs.

Cross-cutting Research Questions

In regard to research questions, we would argue that there are two research topics that are strongly cross-domain but that also warrant separate consideration. Nearly all the questions one can ask about the social impacts of digital media start with who has access (digital divides) and what uses they can make of, or what they can do with, the media (data and digital literacy):
• Digital divides and digital inequalities—including the two-way interaction between digital inequities and other areas of social inequity
• Data and digital literacy—not just skills but also the depth of understanding citizens have of the systems they use, their use of them, and the uses that are made of citizens' data
All the work we have covered raises questions of digital inequality. Chapter 15 (Yates & Lockley) examines this question in more detail.

Figure 25.2  All seven domains 2012–2016: Most frequent concept pairs.

The key argument is that such inequalities are not just about access. The issue is how variations in access, skills, knowledge, and practice correspond with other aspects of social inequality—such as access to education or health care, wealth, employment, housing, or cultural and community life. Digital technologies have the potential both to reduce and to further exacerbate these issues. Within this research topic there are two broad sets of questions:
• How digital media are incorporated into existing systems and processes that generate inequity, the specifics of how they correspond and interact with systems of inequity and distinction, and how they themselves include levels and forms of inequity and distinction. For example, how does access to digital media affect employment opportunities and outcomes?
• How are digital media used and deployed by citizens, institutions, organizations, and governments to address either digital inequities or broader social inequities? For example, how can digital media be used to improve educational outcomes?

Table 25.1  Main Themes of Concept Pairs, 2000–2004
Concept pairs, classified against the themes Technology, User, and Use: Internet/use; Internet/user; Site/web; Access/Internet; Communication/medium; Group/support; Man/woman; Group/member; Community/member; Communication/interaction; Community/network; Group/participant; Health/support; Child/computer; Communication/relationship; Care/health; Network/support; Information/source; Interaction/relationship; Other/people; Network/tie; Home/internet; Culture/medium; Life/people; Citizen/government; Group/interaction.
Note: Concept pairs listed in decreasing order of frequency, as shown in Figure 25.1.

These are not necessarily new questions, but each iteration of technological change brings new possibilities and new consequences—intended and unintended—that need to be understood. We would also argue that two other topics cut across domains but also need to be considered in and of themselves. These are:
• Governance, regulation, and legislation in regard to digital media—how societies choose to manage (or not) the development, implementation, and uses of digital media.
• Roles and impacts of major platforms—including many corporate platforms (e.g., Google, Facebook, Uber) as well as core technology providers (e.g., Intel, Cisco).
These scenarios are more likely to be dependent on social, national, or technology contexts. Governance issues vary from such things as debates in the United States over net neutrality to issues of Internet censorship of different kinds in various nations.

Table 25.2  Main Themes of Concept Pairs, 2012–2016
Concept pairs, classified against the themes Technology; User; Use; Data, access, privacy; and Interventions, studies: Communication/medium; Internet/use; Care/health; Analysis/datum; Internet/user; Access/datum; Datum/privacy; Access/internet; Participant/student; Intervention/study; Health/intervention; Group/participant; Care/patient; Medium/news; Datum/project; Health/support; Datum/source; Intervention/participant; Information/privacy; Group/support; Network/support; Intervention/trial; College/student; Model/variable; Datum/research; Collection/datum.

These are also driven by the new circumstances of technology use. Some examples include how to legislate for and address hate speech online across jurisdictions, how to define liability for automated and AI systems, or how to regulate national and international markets for digital products and services. Global digital platforms for services have become a norm in the current digital environment. Though the dominant digital platforms may change over time (Myspace to Facebook) or vary between regions (Facebook/Twitter/WhatsApp vs. Weibo/WeChat), it can be argued that we are currently in a period where "platforms" are key to social, cultural, and economic behaviors and outcomes. These platforms have memberships and financial turnover greater than those of a large proportion of the world's countries. We would strongly argue that understanding the role of platforms is key to understanding our current digital world in all the domains addressed in this volume. The dynamics of a platform economy, of identity management via the limiting constraints of a few platforms, or of the political implications of platform-based social networks are questions where the role of platforms is key to three of the domains discussed in this volume. We acknowledge that disruptive technological, political, or economic changes might overturn the importance of platforms as currently understood.

Table 25.3  Cross-cutting Topics in ESRC Themes
Topic                                               Percent
Digital divide                                      8.0
Privacy                                             6.8
Data access and literacy                            6.1
Citizenship                                         4.5
Device, environment and service design              4.5
Participation                                       3.2
Methods                                             3.2
Governance                                          2.9
Education                                           2.6
Role and impact of major corporate platforms        2.6
Mobilization                                        2.6
Talk                                                2.3
Cyber security                                      2.3
Note: Topics in bold cross-cut more than three of the seven ESRC domains.

Table 25.4  Most Frequent Cross-cutting Challenges in ESRC Themes
Topic                                               Percent
Methods                                             38.0
Theory                                              12.0
Ethics                                              9.5
Big data                                            8.7
Co-design                                           5.0
Multi-platform studies                              3.3
Holistic understanding                              2.9
Digital divide                                      2.5
Education                                           2.1
Data access                                         2.1
Interdisciplinarity                                 2.1

Cross-cutting Challenges

In the ESRC project we asked what key challenges researchers face in each of our domains. Looking across the chapters in this volume, it is clear that many of these are cross-cutting and pertinent to any work on the social impacts of digital media. We would therefore argue that all near-term projects on digital technology and society should seek to examine or address each of the following:
Methods innovation. This should include reflection on and evaluation of digital tools, analytic approaches, and the digital representation of results. This could and should

include taking risks based on the efficacy of new tools and methods as they are tried out and tested.
Theory testing and evaluation, with theory development where needed. In all the domains, we found a great variety of theory, but much of it was used as a general backdrop without operationalization or evaluation. For example, many of the sociology-based items reference "Network Society" theory without operationalizing it in any clear manner. In contrast, much of the psychology work directly applies theory, but with extensive variety. We would caution against developing new theory for its own sake. As was noted by participants in the consultation workshops, just because digital technologies are new, they may not need new social science theories to understand their uses and implications. There may be a need for greater clarity on "most relevant" theory and on incremental theory development as opposed to a need for "digital specific" theory development. This makes theory testing, new and old, essential.
Ethics. Ethics, especially around the use of publicly visible social media data, remain a challenge for researchers, though clearer guidance is being provided by academic organizations (e.g., the Association of Internet Researchers and the British Psychological Association). There are also considerable ethical questions around what researchers, government, businesses, and organizations do with the data of respondents, citizens, and consumers. Further, the increasing integration of digital technologies into medical, social, work, and biological contexts raises a wide variety of general as well as novel ethical issues. We would argue that projects should have a component that assesses the ethical challenges faced, in order to help build a knowledge base of best practices and key concerns. In the United Kingdom, the Nuffield Foundation (http://www.nuffieldfoundation.org/), which supports innovative social research, has established the Ada Lovelace Institute to address these research questions and promote ethical practices, importantly by convening "diverse voices to build a shared understanding of the ethical questions raised by the application of data, algorithms, and artificial intelligence." Similarly, the UK government has established a Centre for Data Ethics and Innovation to advise on data ethics policy. Social research can make considerable impact here, both through research about the ethical challenges and by developing ethical research practices.
Big data. Many research funding agencies are currently supporting initiatives that address big data (however that is understood in their disciplines; e.g., chapter 18 (Yates, Robson, Rice, & Carmi), chapter 19 (Hintz), and chapter 20 (Nugent-Folan & Edmond)). We would caution against focusing specifically on this as a methodological issue separate from the context of the discipline (or set of disciplines) within which research is taking place. The key challenge is that many disciplines now face the possibility of documenting, archiving, and analyzing what would have previously been impossibly large amounts of data to assess in analogue format. Separate from methods innovation, we would argue that projects which seek to use "big data" should include a robust element of reflection and evaluation on the usefulness, limitations, tools used to analyze, and representativeness of the examined big data sets.

Conclusion: Cross-Cutting, Unique, and General Themes   707 Multi-platform/holistic studies. Analyses of the literature shows that research has often focused solely on one technology or platform, though with much good work already been done exploring specific technologies—Twitter, Facebook, Google, Uber, mobiles, smart phones, blogs, specific government systems, etc. However, the reviews also note an increasing trend in which research is undertaken in a comparative way between existing and new technologies or platforms. In general people do not use dig­ ital media platforms or technologies in isolation from each other nor separate from other social action (e.g., as covered in most of the Handbook chapters, but especially chapter  4 (Meier et al.), chapter  8 (Yates et al.), chapter  9 (Rice et al.), chapter  10 (Cecchinato & Cox), chapter  15 (Yates & Lockley), chapter  19 (Hintz), chapter  20 (Nugent-Folan & Edmond), chapter  23 (Jacobs et al.), and chapter  24 (Yates & Blejmar)). Such work is necessary to understand the specifics of technologies or sociotechnical contexts. The Delphi responses have strongly argued for the need to look at digital technology use overall, and in combination, to ask broad social science questions and then explore which technologies are relevant to citizens’ actual practice and in what ways. From this we can develop a more holistic picture of the integration of digital into everyday lives (or not, in the case of digital inequalities). This does not preclude single technology studies where this has relevance, but such decisions should have a strong social science basis—not simply one of the accessibility of data or the novelty of a new device or app. For example, there appear to be class differences in the uses of different social media platforms. If this is valid, a case could be made for a project to focus on a specific platform as used by a specific community (i.e., mobile phones in low-income areas). One area where this may be more acceptable would be in the economic domain, as the study of the impact of a platform on a sector might be limited to one technology (e.g., Uber). However, overall, the Delphi and workshop results highlight a contemporary need to “reverse the telescope” and focus on the breadth and depth of citizens’ digital worlds, as they navigate among multiple technologies and platforms. This puts social science questions to the fore within which a mix of digital technologies may play a part.

Missing Areas and Gaps

One of the separate policy workshops brought together scholars from across the disciplines covered by this review, as well as from the UK's media regulator Ofcom, the ICT sector, the UK's Department for Digital, Culture, Media and Sport, and the UK's Department for Work and Pensions. Resonating with the gaps identified in the ESRC Domain chapters as summarized above, the workshop identified two other areas where digital-facing research may inform policy: impact and policy implications, and digital culture. What are the broader policy implications of the results of our social science work on digital media? Research outputs should not remain isolated and can contribute to evidence-based policymaking. This is clearly crucial in all our domains, but especially

areas such as politics, governance, health, economics, and social inclusion. The ESRC project did not look to explore the links between research and policy, though we feel that work documenting the policy relevance and impact of research on digital media is needed, importantly to identify where high-quality research may not be reaching policymakers. Key areas for policy identified in the workshop were: digital inclusion and exclusion; creative and digital industry sector policy and regulation; digital skills; digital and social policy; and arts and culture policy. A related issue was the need for further research on the general use, and the effectiveness, of digital tools that support policymaking, as well as on how digital tools and media may change the methods of policymaking—such as the rise of "agile" policymaking. Finally, more work should be done on how digital media are used, and with what implications, in policy delivery. The other gap is the role of digital media in culture. There is of course a whole body of work on Digital Humanities and a vast body of practice around digital arts. But processes of digital consumption and digital cultural practices clearly cut across social science questions. This includes questions such as how digital consumption corresponds with social inequalities; how digital cultural production (from YouTube to more conventional art forms) intersects with questions of community or the role of platforms; how cultural institutions are addressing the impacts of digital systems; or how arts and cultural governance and funding might intersect with new(ish) digital media formats such as games, virtual reality, and augmented reality. There is a need to ensure such questions are considered from both social science and arts perspectives—that they are not solely about the aesthetics or the value of the practices—especially as variations in taste, consumption, and practice may be key to how people utilize digital media, with positive and negative implications.

Cross-cutting and Unique Topics and General Themes in the non-ESRC Chapters

As with the ESRC Domain chapters, the non-ESRC chapters also manifest a wide range of themes and subthemes, some of which cross-cut more or less frequently across the chapters, and some of which are fairly infrequent and unique. Further, the patterns of relationships among the themes and the chapters provide some basis for identifying larger, more general themes associated with digital technology and society.

Common and Unique Themes

We began with the coding typology developed in chapter 1, based on 89 recent books on digital technology and society. However, we added relevant emergent codes in NVIVO

as we re-read and coded each of the 14 non-ESRC chapters, consolidated "participation, engagement" (non-civic) into C5 Inclusion, exclusion, discrimination, and deleted all of the initial subthemes that did not appear in these chapters. This resulted in 22 themes with 144 subthemes.

Table 25.5  Non-ESRC Chapters: Number of Themes and Subthemes by Chapters and by Total Instances (theme-level counts of chapters and coded instances, with each theme's subthemes listed)
A1 Theory (10 chapters, 45 instances): Actor-network theory; Attitude change, persuasion; Behavior change, motivations, perceptions; Boundary theory; Citizenship theories; Collective action theory; Diffusion of innovations; Digital divide, digital inclusion; Domestication theory; Dramaturgical approach (Goffman); Economic rationalism; Industrialization and capitalism; Media mastery; Mediation theories; Network theory; Practice theory; Public good; Self-determination theory; Social capital; Social comparison theory; Social construction of technology; Social exchange theory; Social identity theory; Social loafing; Socio-technical; Space vs Place; Structuration; Technology acceptance model; Transactive memory theory
A2 Names for new social era (5 chapters, 9 instances): Datafication; Digital age, society, revolution; Digital citizenship; Digital natives, immigrants; Information, knowledge society; Liberation technology; Second machine age
B1 Technology venues (13 chapters, 58 instances): Algorithms; Artificial Intelligence; Computer-mediated communication; GPS, locational; Infrastructure; Intelligent machines; Internet of things; Knowledge sharing systems; Mobility, mobile phones; Multiple media, ICTs; Other; Resource usage feedback technologies; Robots & social robots; Smart homes, cities, e-government; Social media, networking sites; Sustainable HCI; Ubiquitous computing; Visualization; Wearable computing, devices, sensors, smartwatches
B2 Technology characteristics (9 chapters, 17 instances): Affordances; Habitual, familiar, updating; Materiality; Mediation vs. objects, devices, apps; Usage types, forms
C1 Content, creation (9 chapters, 25 instances): Crowdfunding; Design; Emotional content, responses; Humor, memes, hashtags; Knowledge sharing; Online expression; Producers, users, producers, sharing; Storytelling
C2 Big data, data mining, data storage, analytics, user data (7 chapters, 32 instances): Attention industry, marketplace, merchants, customers; Big data, data mining, data analytics, data definitions; Data user, personal, online, digital traces; Privacy, surveillance, security, anonymity
C3 Civic issues (5 chapters, 33 instances): Civic media, citizenship, democracy, public sphere, the news press; Digital countercultures, underground; Engagement, participation civic; Free speech, censorship; Political, politics; Power; Social movements & digital activism (incl. feminist activism, play as resistance), collective action
C4 Ethics, ethical issues (4 chapters, 8 instances)
C5 Inclusion, exclusion, discrimination (8 chapters, 20 instances): Class (social, economic); Digital divide, access; Discrimination; Gender; Inclusion, exclusion; equality, inequality; Participation, engagement
C6 Manage digital experience (4 chapters, 16 instances)
D1 Digitization of self, others (1 chapter, 1 instance): Biosensing, quantified self & animals
D2 Health (6 chapters, 28 instances): Digital health; Healthspan and lifespan; Online information, interventions; Psychological condition or effect (e.g., well-being, depression, stress); Support, coping (psychological, physical)
D3 Relationships (12 chapters, 59 instances): Community, offline and online; Families; Friendship, friends; Identity, selfhood, self-presentation, self-disclosure; Individual, collective; public, private; Intimacy; Social (interactions, relationships, networks, group identity); With machines, devices
D4 User groups (4 chapters, 6 instances): College students; Elderly; User types; Women
D5 Culture, everyday life, education, learning, training (10 chapters, 33 instances): Consumption; Culture—organizational, national; Education, learning, training; Everyday life, practice, contexts; Literacy
D6 Work and organizations (8 chapters, 26 instances): Business models; Innovation, adoption, acceptance; Labor, creative, digital, employment; Media use policies; Organizations & business; Work, work-life boundaries
D7 Law, policy, regulation (7 chapters, 17 instances)
E1 Effects Negative (9 chapters, 47 instances): Addiction, problematic use; Attention, brain, overload, interruptions; Cyberbullying, harassment; Danger, harm, risk; Disconnection (among people), loneliness; Fake news, alternative facts; Fragmented media devices and platforms; Free riding, social loafing; Knowledge sharing costs; Multitasking; Online hate and shaming; Pressure for access, connectedness, response; Work difficulty, processes; Work-Life conflict
E2 Effects Positive (7 chapters, 12 instances): Collaboration, cooperation, sharing; Connectivity, connectedness; Knowledge sharing benefits; Safety; Social capital; Technology interventions (sustainability); Work ease, effectiveness, efficiency, productivity; Work-family enrichment
E3 Effects Societal (8 chapters, 17 instances): Crime; Economy, economics; Environment implications of digital media; ICTs for development; Institutions; Societal impacts
E4 Effects contradictions, paradoxes, tensions, unintended (7 chapters, 15 instances)
F1 Future research, methods (12 chapters, 39 instances): Methods; Research

Table 25.5 presents the number of chapters that included each

theme or subtheme, and the number of instances each theme or subtheme was coded overall (i.e., a given subtheme could be coded multiple times within one chapter, but only if it was presented in a different context). First, we consider common or cross-cutting themes—those (aggregated across their subthemes) appearing in over half (8 or more) of the 14 chapters. These include B1 Technology venues (in all but one chapter); D3 Relationships and F1 Future research and methods (12 chapters); A1 Theory and D5 Culture, everyday life, education, learning, training (10); B2 Technology characteristics, C1 Content, creation, and E1 Effects negative (9); and C5 Inclusion, exclusion, discrimination, D6 Work and organizations, and E3 Effects societal (8). The most unique theme, appearing in only one chapter, was Digitization of self & others. Other infrequent and thus unique themes were C4 Ethics, ethical issues, C6 Manage digital experience, and D4 User groups (4 each), and A2 Names for new social era and C3 Civic issues (5 each). The themes that appear more frequently across the chapters can be interpreted as more pervasive or central themes, while those appearing less frequently can be seen as more unique and less central themes associated with digital technology and society. This distribution does not necessarily imply that the less frequent themes represent "gaps" or under-researched areas; some themes and subthemes may be just more specialized or more narrowly focused (e.g., digitization of self, ways of managing one's digital experience, particular user groups), or more relevant to books about the macro issues of digital technology and society (such as particular names or terms for the era), or of particular interest to only a few of the authors. However, the overall distribution of more or less frequent themes can be useful in identifying more common themes as a guide for literature reviews, or less frequent and more unique themes as opportunities for more in-depth and novel research. For example, we should expect that F1 Future research and methods reasonably appears in nearly all chapters, and B1 Technology venues in all but one. Particularly interesting is that all but two chapters also discuss D3 Relationships. This frequent and common focus highlights that the inherent nature of information and communication technologies and systems involves relationships within individuals (e.g., well-being, identity and self-presentation), among users (e.g., caregivers and the elderly, or online communities), among users and technologies/devices (e.g., between individuals and their smartphones or smart homes), among users, their media, and third parties (e.g., big data and the attention economy), among different social and cultural groups (e.g., power, digital divides, inclusion/exclusion, literacy levels), and among systems and devices (such as the Internet of Things). Also, instances of negative, positive, societal, and contradictory effects were all discussed by about the same number of chapters (from 7 to 9), indicating again that neither a fully utopian nor a fully dystopian perspective toward relationships between digital technology and society seems justified (for example, see Katz & Rice, 2002, for an explication of the synthesizing Syntopian perspective, which rejects both utopian and dystopian approaches).
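For readers who want to reproduce this kind of count from their own coding, the sketch below shows the two tallies reported in Table 25.5: the number of chapters containing at least one coded instance of a theme, and the total number of coded instances. It is a minimal illustration under stated assumptions, not the authors' NVIVO workflow; the (chapter, theme) pairs are hypothetical stand-ins for an export of coded instances.

```python
# Minimal sketch: tally coded instances per theme and the number of
# distinct chapters containing each theme. Data below are hypothetical.
from collections import Counter, defaultdict

# One (chapter, theme) pair per coded instance, as might be exported
# from qualitative coding software
coded_instances = [
    ("ch04", "A1 Theory"), ("ch04", "A1 Theory"), ("ch05", "A1 Theory"),
    ("ch04", "D2 Health"), ("ch06", "D3 Relationships"),
]

# Total instances per theme (a theme can be coded several times in one chapter)
instances_per_theme = Counter(theme for _, theme in coded_instances)

# Number of distinct chapters with at least one instance of each theme
chapters_per_theme = defaultdict(set)
for chapter, theme in coded_instances:
    chapters_per_theme[theme].add(chapter)

for theme, total in instances_per_theme.items():
    print(theme, len(chapters_per_theme[theme]), total)
```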

Perhaps more fundamental is that, as described in chapter 1, the seven guiding ESRC Domains resulted from a comprehensive analysis of the literature designed to identify possible new research areas for funding, and these 14 non-ESRC chapters were submitted to the follow-up conference, and selected, largely as complements to those seven themes. Thus, some topics appearing in recent books simply were not included in the overall project's domain. Certainly, the literature associated with digital technology and society is vast and would include a much wider array of topics than are included in this Handbook. These would include more technological, computer, software, and engineering aspects; more contexts related to arts and humanities; more legal and policy issues; more business and economic dimensions; more context from other countries and cultures; other use contexts such as gaming, virtual reality, and wearable and embedded media; more focus on the needs and uses of groups such as low-income, rural, differentially abled, ethnic, feminist, activist, and LGBTQ, among many others; more consideration of cultural differences in access, meaning, use, and implications; and more focus on qualitative and case studies (for just a few examples, see Borah, 2017; Lee, Ho, & Lwin, 2017; Rice & Fuller, 2013; Rice & Leonardi, 2013; Röhle, 2005).

More General Themes Emerging from Relationships among the Chapters

In addition to assessing common and unique themes, we can also identify how the non-ESRC chapters relate to each other, reflecting a more general view of shared foci in this Handbook. Table 25.6 indicates which themes appeared in which chapters. All but one chapter included from 10 to 16 (out of 22) themes. Thus, most chapters portray over half of the themes, with chapters 6 (Petrie & Darzentas: Digital Technology for Older People), 9 (Rice, Zamanzadeh, & Hagen: Media Mastery for College Students), 7 (Green, Comber, & Kuznesof: A Digital Nexus), 5 (Wagg, Cooke, & Simeonova: Digital Inclusion and Women's Health . . . ), and 12 (Coombs, Hislop, Taneva, & Barnard: Changing Nature of Work with Intelligent Machines) involving the most. Clearly, we would not expect chapters by different authors to engage the same themes, much less all of them. However, partially as a byproduct of the bounded number of themes (22) across the chapters (14), partially because of the interrelatedness of these themes within most any treatment of digital technology and society, and partially because of the focused nature of the ESRC project and the conference call for papers, there should be considerable commonality. Figure 25.3 displays a hierarchical clustering of the 14 chapters, based on the Jaccard similarity coefficients derived from the co-occurrence of the 22 theme codes among all the themes (and their aggregated subthemes) across the chapters (provided through NVIVO 11). We can see two large clusters. The top cluster represents more macro and conceptual or definitional issues, emphasizing knowledge, citizenship and free speech, data, and digital technology venues (sustainable HCI, intelligent machines, and the Internet of Things).

Table 25.6  Non-ESRC Chapters Including at Least One Instance of Each Theme
Chapters (columns): 04, 05, 06, 07, 09, 10, 12, 13, 15, 17, 19, 20, 21, 23.
Themes (rows): A1 Theory; A2 Names for new social era; B1 Technology venues; B2 Technology characteristics; C1 Content, creation; C2 Big data, data mining, data storage, analytics, user data; C3 Civic issues; C4 Ethics, ethical issues; C5 Inclusion, exclusion, discrimination; C6 Manage digital experience; D1 Digitization of self & others; D2 Health; D3 Relationships; D4 User groups; D5 Culture, everyday life, education, learning; D6 Work and organizations; D7 Law, policy, regulation; E1 Effects Negative; E2 Effects Positive; E3 Effects Societal; E4 Effects contradictions, paradoxes, tensions, unintended; F1 Future research, methods; Total themes for each chapter.
Total chapters including each theme: A1 10; A2 5; B1 13; B2 9; C1 9; C2 7; C3 5; C4 4; C5 8; C6 4; D1 1; D2 6; D3 12; D4 4; D5 10; D6 8; D7 7; E1 9; E2 7; E3 8; E4 7; F1 12.
Notes: 04 = MeierDomahidiGuentherCMCMentalHealth; 05 = WaggDigitalInclusionWomensHealth; 06 = PetrieDarzentasDigitalTechOlderPeople; 07 = GreenComberKuznesofDigitalNexus; 09 = RiceZamanzadehHagenMediaMastery; 10 = CecchinatoCoxBoundariesCommTechsFinal; 12 = CoombsHislopTanevaBarnardChangingNatureOfWorkIntelligentMachines; 13 = YatesLockleyDigitalCultureAtWorkAndTheUptakeOfDigitalSolutions; 15 = YatesLockleySocialMediaSocialClass; 17 = LeeScottBaumannDigitalEcologyFreeSpeech; 19 = HintzDigitalCitizenship; 20 = NugentFolanEdmondDataDefinitions; 21 = HocevarAbeytaRiceMotivationsOnlineKnowledgeSharing; 23 = JacobsEdwardsCottrillSaltGovernanceAndAccountabilityInIoTNetworks

Figure 25.3  Hierarchical clustering of non-ESRC chapters based on co-occurrence of coded themes. (Chapters in dendrogram order: 21, 17, 19, 20, 07, 12, 23, 05, 13, 15, 10, 06, 04, 09.)

The bottom cluster emphasizes more contexts and implications of digital technology uses, at both social (class, inclusion, digital divides) and psychological (work-home boundaries, older users, mental health, and media mastery) levels. At least two implications for readers and researchers follow from the clustering. First, based on the extent to which the chapters shared codes from the more general coding framework derived in chapter 1, the chapters in this Handbook could be regrouped and resequenced from the current table of contents and sections. Possible sections could be concepts, venues, social issues of class and inclusion, and psychological issues of health, well-being, and media mastery. Second, researchers focusing on any of these general areas may wish to use the relevant chapters as initial literature reviews and foundations for further research.
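For readers who wish to reproduce this kind of grouping from their own chapter-by-theme coding matrix, the sketch below is one way to do it. It is a minimal illustration, not the NVIVO 11 procedure used here: the chapter labels and binary theme profiles are hypothetical, and scipy's Jaccard distance and average-linkage clustering stand in for whatever settings the software applies.

```python
# Minimal sketch: cluster chapters by co-occurrence of coded themes,
# using Jaccard similarity and agglomerative (average-linkage) clustering.
import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import pdist

chapters = ["ch04", "ch06", "ch19", "ch20"]   # hypothetical chapter labels
presence = np.array([                          # rows = chapters, cols = themes
    [1, 1, 0, 1, 0],                           # 1 = theme coded at least once
    [1, 1, 1, 1, 0],
    [0, 1, 1, 0, 1],
    [0, 0, 1, 0, 1],
], dtype=bool)

# Jaccard distance = 1 - (shared themes / themes coded in either chapter)
distances = pdist(presence, metric="jaccard")

# Build the cluster tree over those distances
tree = linkage(distances, method="average")

# Leaf order of the dendrogram; with matplotlib, dendrogram(tree, labels=chapters)
# draws a tree like Figure 25.3
print(dendrogram(tree, labels=chapters, no_plot=True)["ivl"])
```

The leaf order places chapters with similar theme profiles next to each other, which is what produces the two broad clusters described above.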

Conclusion

The ESRC project was defined as a scoping review to identify key areas for future research. One potential outcome from such a project would have been to identify one or two pressing research concerns. The reality, however, is that digital media pervade nearly all aspects of contemporary society and have touched nearly all aspects of everyday life, and conversely are shaped and adapted by a wide range of actors and contexts. Even those who do not use digital media are directly and indirectly affected, as lack of access or skills creates substantive disadvantage in societies where services and even everyday interaction are predominantly undertaken via digital devices. In each of the main domains we have sought to identify key future research questions and challenges—those issues that we need to better understand in order to get a clearer and more

comprehensive picture of the contexts and implications of digital media use. Focusing on just one area within a domain, or claiming that just one domain has priority, would be both limiting and a false assertion. Having said this, we can see some key commonalities in the overall results, and we would argue that bringing these together provides a broad set of themes that might serve as a medium-term framework for exploring digital media and society. We believe that combining the overlaps between the following areas creates two substantive and relatively integrated domains of study in regard to digital media:
• Communication and relationships with communities and identities
• Citizenship and politics with governance and security
Based on discussions in the workshops, we would argue that a third major medium-term area of study has to be
• Social, economic, and cultural impacts of automation, augmentation, and virtual reality
We would then suggest four smaller focused areas that could stand alone or cut across these three main areas:
• Digital economy, with a focus on the impact of major platforms
• Data and digital literacies
• Health and well-being, focused on workplace, everyday, and governance issues
• Digital divides and digital inequalities, including the two-way interaction between digital inequities and other areas of social inequity
We would also strongly emphasize the need for projects that address the following:
• Multi-platform/holistic studies: to ask broad social science questions and then explore which technologies are relevant to citizens' actual practice and in what ways, and to develop a more holistic picture of the integration of digital into their lives (or not, in the case of digital inequalities).
• Methods innovation: including risk taking on digital tools—with a strong methods evaluation component.
• Theory development, testing, and evaluation: we are agnostic on the need to inherently develop new theory to understand the everyday uses and impacts of digital technologies. The literature content analysis has found little evidence of consistent dominant theory in the area. There may be a need for greater clarity on "most relevant" theory and on incremental theory development as opposed to a need for "digital specific" theory development.
• Ethics: this needs to cover both ethics with regard to methods and wider social ethics around social, commercial, and government use of data, systems automation, and human augmentation.

The work we have undertaken here has highlighted the very large amount of research and scholarship dedicated to understanding the impacts and roles of digital media in contemporary society. At the same time, it is clear that much work remains to be done. It is a concern that funding (from government and industry) remains focused on technology development and implementation—with the clear policy goal being economic growth. For example, the recent UK Industrial Strategy Challenge Fund, the UK Digital Economy program, and the EU Framework and H2020 programs have all funded high-quality work focused on the creation of new technologies and their commercialization. This has included funding for creative and digital industries. Yet funding to support the evaluation of the social impacts of these innovations, their ethics and governance, the process of their adoption, and the social process of their creation remains far more limited. We also have some concern that some social science colleagues continue to view digital issues as secondary to, or as "add-ons" to, formal social research questions—rather than seeing these as fundamentally integrative to contemporary social research. Understanding contemporary interpersonal interaction or understanding UK benefits policy is not separate from the digital technologies being used to create and maintain these relationships or manage these services. Finally, and most importantly, the key point we draw from the chapters in this Handbook is both exciting and challenging—namely, the vast range of research opportunities that the study of digital media still provides and will continue to generate. This includes not only the range of questions that still need answering and the innovative new methods and data sets being developed and becoming more accessible, but also, importantly, the chance to be part of understanding and influencing some of the most historically important social, political, economic, and cultural changes taking place in contemporary society.

Note

1. As part of the review, the Digital Humanities Institute at the University of Sheffield applied concept modelling techniques to a curated corpus of 1,900 journal articles from the period 1968 to 2017. Concept modelling is a computational linguistic process that involves identifying the emergence of concepts, or key ideas, via lexical relationships. For the purposes of the review, lexical relationships were limited to high-frequency co-occurrences of terms as pairs and trios. The process is entirely data driven and resulted in 2 million rows of data. The website https://www.dhi.ac.uk/waysofbeingdigital/ provides access to the top 50 most frequently occurring pairs and trios through a series of data visualizations. Click on View Data Visualisations at the top. Then check/submit which of the seven ESRC domains you are interested in (including all). Then choose the visualization. These show configurations across selected time frames. Choose bubble chart, tree map, zoomable pack layout, or network diagram, by individual subject or by all seven subjects combined, by document or concept frequency. You can similarly search the analyzed documents (all, by subject, author, concept, concept trio, and year) by clicking on Browse Articles at the top. Also, see https://waysofbeingdigital.com/literature-analysis-interactive-results/ for interactive visualizations with mouse-overs of the main clusters of concepts within each domain, and the relative frequency of concepts associated with each cluster.
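As a rough illustration of the kind of lexical co-occurrence counting described in this note, the sketch below tallies within-document term pairs across a small corpus. It is a minimal example under stated assumptions, not the Sheffield pipeline: the three "documents" and the simple whitespace tokenization are hypothetical stand-ins for the curated corpus and its concept-extraction steps.

```python
# Minimal sketch: count within-document co-occurrences of term pairs across a corpus.
from collections import Counter
from itertools import combinations

corpus = [
    "internet use and internet user studies of access and privacy",
    "communication medium use by each internet user at home",
    "access to the internet and questions of data privacy",
]

pair_counts = Counter()
for doc in corpus:
    terms = sorted(set(doc.split()))              # unique terms in this document
    pair_counts.update(combinations(terms, 2))    # every unordered pair co-occurring here

# The most frequent pairs play the role of the concept pairs in Figures 25.1 and 25.2
for pair, count in pair_counts.most_common(5):
    print(pair, count)
```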


References

Borah, P. (2017). Emerging communication technology research: Theoretical and methodological variables in the last 16 years and future directions. New Media & Society, 19(4), 616–636.
Grint, K., & Woolgar, S. (1997). The machine at work: Technology, work and organization. Cambridge, UK: Polity Press.
Katz, J. E., & Rice, R. E. (2002). Social consequences of Internet use: Access, involvement and interaction. Cambridge, MA: The MIT Press.
Latour, B. (2005). Reassembling the social: An introduction to actor-network-theory. Oxford, UK: Oxford University Press.
Lee, E. W., Ho, S. S., & Lwin, M. O. (2017). Explicating problematic social network sites use: A review of concepts, theoretical frameworks, and future directions for communication theorizing. New Media & Society, 19(2), 308–326.
Rice, R. E., & Fuller, R. P. (2013). Theoretical perspectives in the study of communication and the Internet. In W. Dutton (Ed.), Oxford handbook of Internet studies (pp. 353–377). Oxford, UK: Oxford University Press.
Rice, R. E., & Leonardi, P. M. (2013). Information and communication technology in organizations, 2000–2011. In L. Putnam & D. K. Mumby (Eds.), Sage handbook of organizational communication (3rd ed., pp. 425–448). Thousand Oaks, CA: Sage.
Röhle, T. (2005). Power, reason, closure: Critical perspectives on new media theory. New Media & Society, 7(3), 403–422.


digital literacy and  29, 673, 687 digital restrictions  535–36 DIY 532–33 ecological  199, 208n3 education and  515 empowerment  68–69, 533–35, 540–41 engagement 530 exclusion and  532 feminist 529 governance and  633 governance and security and  610 government and  535 information and  481 international law of  529 Internet and  531–32 involvement and  633, 642 journalism  527, 531 literature analysis  453–60 nationality and  539 norms 237 participation  531, 533, 633 politics and  452, 459 power dynamics and  541 reference points for  534 representation and  642, 646, 647, 649 rights  528, 533, 610 role of  530, 532 sensing 192 smart phones and  414 society and  541 studies 434 surveillance and  535, 542n1, 612–13 theory, method, and approach  460 topics  456, 456t, 458–60, 464–65 Twitter and  458 WordStat tool and  456t CityBridge  648, 649 civic engagement  118, 415, 442, 459, 534, 538 civic infrastructure  629 civic issues  17, 710t civic mobilization  415 civic participation  118 civic well-being  620 civil rights  530 classification, of objects  565 Clementi, Tyler  268 climate change  323

726   index climate control technology  149 closed circuit TV cameras  149 cloud computing  15 clustering algorithms  46 co-designing technology  244 coding typology  708–9 cognition access and  274–75 contradictions and  271–72 distraction and  275 multitasking and  273–74 cognitive computing  348 cognitive demands, of automation  679–80, 695 cognitive evaluation theory  581 collaboration  22, 405, 531 collective action  422, 459, 460, 575 collective costs and benefits  588 collective effort model  586 collectivism 590 college students, Facebook and  614 colonization 187 commodification  334, 536 communality 575 communication  v, 15, 19, 22, 28, 222, 453. See also computer-mediated ­communication; information and communication technologies AI 357 behaviors 245 broadband 614 cancer and  64 channels  390–91, 390f child development and  414 citizenship and  535 communities and identities, media theories and 416 concept pairs  226f, 227f culture and  230 Delphi process for  239–45 density 246 digital literacy and  240t face-to-face 81 of foreigners  539 health and  59, 75, 76 information and  484 intercultural 22

Internet and  515 norms and  240t older people and  144, 145–48, 151 online  478, 539 overload 308 patterns 232 physical aspect of  246 platform affordances and  240t platforms  244, 301 politics and  236–37, 415, 459–60 process of  226 public-by-default nature of  483 relationships and  240t scoping review and  86 smart phones and  266, 307 social relations and  272 studies  49, 466 successful 391 technology, mobility and  304–5 technology, work-home boundaries and 304–11 temporal properties of  311 texting and  277 tools 531 Twitter and  236 work and  312 Communications Decency Act (CDA)  480 communities  v, 28 authenticity of  413 automation and  678–79, 682–83 children and  414 citizenship and  413 concept pairs  408t connectivity and  640 credibility of  413 data and representation and  514–15 Delphi process for  405, 406 digital inclusion and  120–21 dynamics 409 education and  415 empowerment 64 Facebook and  481 formation 441 gender and  414–15 governance and security and  620 identities and  406, 409f, 410 imagined 231

index   727 Internet and  116 knowledge 409 KSS and  580 leadership 409 literature on  407–15 membership to  406, 413, 418t, 422 network 409 online  406, 410–11, 414 participation 409 politics and  529 power relations in  533 of practice  575–76 reciprocity in  583–84 reputation of  588 scoping review for  406 sense of  231 smart cities and  640 smart phones and  414 social capital and  441 social media and  64 topics  407, 407t, 409–10, 409f, 411t, 413–15 Twitter and  231–32 virtual  583, 590 women’s health and well-being in rural  113, 121–22, 125–26, 129 WordStat analysis of  411t communities and identities challenges  420–21, 420t collective action and  422 communications and media theories and 416 Delphi process and  417 future research and scoping review for  417–19, 418t psychological theories and  416 sociological theories and  416 theory, method, and approach to  415–17, 416t computer-mediated communication (CMC)  223, 227, 246–47 definition of  80, 86 growth of  299 identities and  422 platforms 307 social support and  234 temporal patterns of responses in asynchronous 309

computer-mediated communication, mental health and  79 addiction and  82, 89 analytical approach  87–88, 101 articles per discipline  95f changes over time  93 chatting 92 concept operationalization  87t concepts 88–89 core topics  89–93, 90t, 94f CTM 88 cyberbullying 92 databases 87t debate on  83, 98 defining key constructs  80–81 Facebook 90–91 future research agenda for  101 ICT adoption and  92–93 Internet addiction  89 in journals  94f, 95, 95f, 100–101 limitations in sampling and analysis 99–101 manual topic selection and merger  88 mental health concepts  96–98, 96f, 97t, 101 methodology 85–89 network structures and  99 problematic Internet usage  82, 89 publication behavior in field  88, 93–96 publication of research  83–84 publications 100 reduction of  99 relationships and  91–92 research on  81–83, 86, 98, 100–101 results 89–98 sample 86–87 scoping review  85–86, 99 search terms  87t sleep and  93, 99 smart phones and  91 SNS and  82, 90–91 software 89 study on  83–85 texting 92 topic modeling  87, 100 topics  83, 97t work-related 93

728   index Computer-mediated communication in personal relationships (Wright & Web)  26 computers  4, 6 games and mental health  65–66 literacy 125 older people and  141, 143, 144, 153 women and  123 computer science  vi, 522 computing  5, 186 cloud 15 cognitive 348 ethics and  567 high-performance 662 power 15 concept-modelling  46, 47t, 77n1, 247n1, 341n1, 423n1, 467n1, 524n1, 626n1, 718n1 concept pairs  48f all seven domains of  701f citizenship 453f, 455t communication 226f, 227f communities 408t data and representation  503t, 506f Delphi process  325t, 327f economy 327t ESRC project  224t, 226f, 227f governance and security  607t, 610f health and well-being  60f, 61f literature 59t main themes of  703t tree map of  49f conferences, older people and  137–38 confidence at home  380t, 396 knowledge labor and  383 literacy and  440 at work  372, 381t, 396, 400 connective action  460 connectivity  22, 23, 122, 422, 575 communities and  640 conscientious  18, 253 culture of  14–15 FOMO and  271 health and  126, 269 multitasking and  275 problematic Internet usage and  269 social relations and  268–69 Twitter and  231

conscientious connectivity  253 conscious consumption  253 consent 611 conservation, resources and  198 conspiracy theories  478 constant connection  268–69 Constitution, U.S.  484 constraints, media mastery  271–72 consumerism 208n4 consumers 195 consumption  187–88, 207 citizenship and  537 conscious 253 cultural  436, 511 digital 708 domestic energy  193–94, 199, 205, 206 household energy  203 infrastructures of  202 measurement of  194 media and  232 resources and  197, 199, 202 routines 201 sustainability and  192, 199, 201, 208n4 content access 266 analysis 49 creation  16, 710t management, media mastery and  272–74 sharing, freedom of speech and  478–80 viral 478–79 context collapse  551 contextual factors, of KSS  588–90 continuous time  509 contradictions  23, 271–72, 712t Conway, Kellyanne  476 co-occurring terms  700–701 cookies 613 co-operation 337 corporate wellness programs  514 Correlated Topic Model (CTM)  88 costs, of problematic Internet usage  275 counter-terrorism  473, 612 Counter-Terrorism and Security Act  487 country contexts, digital inclusion and  128 creativity  24, 577 credibility, of communities and identities  413 crime 24

index   729 critical thinking  127 cross-cutting topics  700–701, 713 crowdfunding/sourcing  16, 20–21 CTM. See Correlated Topic Model cultural attitudes toward AI  671, 671t toward automation  671, 671t, 681–82 cultural capital  22, 427, 431–32, 435, 437–41, 440f, 445 cultural consumption  436, 511 cultural data  548, 562, 567 cultural identity  565 cultural production  531, 708 culture  20, 429, 479, 711t AI and  253 communication and  230 digital inclusion and  121 inequalities and  445 nature and  187 technology and  676 cyberbullying  92, 268, 271, 274 cybermediaries 333 cyber-optimism 14 cybersalons 534 cyber security, AI and  352–53 cyberspace 532 cyber troops  473 cyborg 19 dangerous ideas  488 data  vi, 29 accountability for  518 acquisition 552 activism 540 algorithms and  518 analytics 710t automation and  501 brokers 536 categorization 563–64 classification systems  566 cleaning  548, 556, 561–62 collected by Facebook  226–28 collection  151–52, 509, 635 context of  556 cultural  548, 562, 567 defining  548, 549, 554–55, 557–61 delimiters 567

digital environments and  556, 558 documentation standards  566 duplicates 552 errors 568 facts and  554–55, 556 false  555, 559 governance 641–42 handling 514 hidden  556, 568 indeterminate 568 informed consent and  514 integration of  569 interpretation  555–57, 568–69 literacy  521, 522, 701 management of  557 manipulation 561 medical 553 methods of data and representation  507–8 minimization 634 mining  16–17, 18, 515, 536, 548, 710t NASA protocol for  562–63 native 550 noise and  553 objectivity and  561 original 558 outdated 552 ownership  610, 640 as performative  566 pre  549–50, 554 pretreatment module  558 processing  548, 554, 555–57, 558 protection 637 provenance record  564 quality 553–54 raw  550–51, 553 rich environments  567 social construction of  518 source 552–54 sources, data and representation  511 storage  16–17, 630, 710t stream 558 structuring 567 surveillance and  481–82 terminology 569 transparency 562 treatment 562–65 truth and  559–60

730   index data (Continued ) types of  560 unprocessed 564 use of  513–14 variants 562–65 visualizations  77n1, 247n1, 341n1, 423n1, 467n1, 509, 510, 524n1, 626n1, 718n1 data and representation big data and  507–8 challenges  518–23, 522t communities and  514–15 concept pairs  503t, 506f continuous time  509 data methods of  507–8 data sources  511 data visualizations  509, 510 Delphi process and  518–21, 519t, 523t ethics and  512, 521 expertise and  509 future research and  518 granularity and  509 heterogeneity and  509, 548 impact and  512 initial comments  501–2 literature analysis of  502–16 mobile and mobilizing technologies and 509 non-coherence of knowledge creation 509–10 objectivity and  510 other domains and  514–16 scoping review and  518 theory, method, and approach  516–18, 516t topics  502–16, 502t, 521t, 522t transactional interactions  508–9 whole populations and  509 WordStat tool  504t databases, access to  100 datafication  5, 517, 527 of behavior  537 citizenship and  536–41 governance and  536–38, 541 implications of  538–39 of life  536–38 pre-data 549 dataism 538 dating 91–92

dating apps  271 DCMS. See Department for Culture, Media and Sport’s dead time  309 debates ix deception 223 Defence Science and Technology Laboratory (DSTL) 43 Delphi process  39, 41–49, 41f, 57, 221 challenges  75–76, 75t, 242–43, 243t, 245t, 339–41, 340t citizenship and  417, 451, 460–65, 461t for communication  239–45 for communities  405, 406 communities and identities and  417–19, 418t concept pairs  325t, 327f data and representation and  518–21, 519t, 523t first round  222t future research and  72–75, 73t, 244–45, 247, 341, 462–63, 462t for governance and security  605 for identities  405, 406 initial comments  323–24 literature analysis  324–36, 324t politics and  417, 451 results 707 scoping review  240t, 241, 337–38, 338t survey response  74t theory, method, and approach  335–36, 335t topics  74, 74t, 241–42, 242t, 324–35, 326t, 338–39, 339t dementia  137, 148, 354 democracy  471, 472, 477–78, 532, 612 democratization  24, 532, 650 Department for Culture, Media and Sport’s (DCMS) 38 Department of Innovation and Technology 645 dependencies  21, 254, 276 depression  79, 90–91, 92, 267, 270, 276 design automation and system  679–80, 683–84 co-designing technology  244 politics of  208 sustainability and  191, 193, 196, 205–6 system 672t

index   731 Developing Innovation and Growing the Internet of Things (DIGIT) Act  639 development 14 Development and access to information 122–23 devices 22. See also smart phones/mobile phones interconnected 508 IoT governances and  645–46 multi-device ecology, work-home spaces and 305–7 privacy and  645, 648 specialization 305 Devolution Agreement  642 DHI. See Digital Humanities Institute dictatorship 471 diet  70, 205 diffusion of innovation  330 Diffusion of Innovation (DOI) Theory  116, 117t DIGIT. See Developing Innovation and Growing the Internet of Things Act digital acts, citizenship and  530–33 digital age  6, 25 childhood in  20 first substantive entry  10t in Nexis Uni (News)  8t in NGram  10t, 11f in Proquest Periodicals Index Online  9t in ScienceDirect  7t in Web of Science  7t digital agenda  614 digital by default  429 digital capital  427, 432–33, 438, 445 Digital Catapult and Future Cities Catapult 641 digital citizenship. See citizenship digital communication. See communication digital consumption. See consumption digital culture. See culture digital development, environmental crisis and 187–88 digital disruption  369–70, 401 digital divides  129, 443, 701 access and  370 in China  331 digital inclusion and  127

digital skills and  332 distinction among  428–29 gender and  111, 122 literature and  115 policy and  118 research on  112, 332 socio-economic aspect of  513 types of  429 digital economy  429 digital efficacy  380–83, 380t, 382t digital engagement  428 digital environments  527, 530, 552, 556, 558 digital era  6 first substantive entry  10t in Nexis Uni (News)  8t in NGram  10t, 11f in Proquest Periodicals Index Online  9t in ScienceDirect  7t in Web of Science  7t digital exclusion  111, 122, 331, 443, 708 digital experience  18, 711t digital feedback systems  196, 205 digital health  29 digital health technology  74, 76 digital humanities  57, 708 Digital Humanities Institute (DHI)  37, 46, 47t, 247n1, 341n1, 423n1, 467n1, 524n1, 626n1, 718n1 digital identities  413, 419 digital impact mediators  331 digital inclusion  17–18, 337, 421, 428, 513, 708, 709, 711t Activity Theory  116, 117t Akshaya e-literacy project  120 citizenship and  532 community and  120–21 country contexts and  128 culture and  121 defining  112, 117–18 digital divide and  127 digital literacy and  127 digital skills and  124, 125 DOI Theory  116, 117t education and  121 elements of  112 frameworks, measurement, and evaluations 123–24

732   index digital inclusion (Continued ) gender and  121–22 government and  120 health and well-being and  125–26 health care and  123 ICTs and  112 illiteracy and  121 information literacy and  111–14, 125–26, 127–28, 129 initiatives  111, 119–25 in journals  127 levels and approaches  119–21 libraries and  121, 123, 128 literature review  115–16 Media Richness theory  117 methods 114–15 mobile technology and  122–23 outcomes-based evaluations of  124 policy  111, 119, 127 rural communities and  119 smart phones and  122–23 social activity and  121 social barriers to  121 terminology 117–18 theory and methods  116–17, 117t, 128–29 3G wireless connections  120 training 124–25 women’s health and well-being in rural communities  113, 121–22, 125–26, 129 digital inequalities. See inequalities digital information  4–5 digital infrastructure  329, 526 digital innovation, wealth and  329 digital interventions  196 digitalization, ethics of  513 Digital Leaders network  39, 43 digital leisure  429 digital lifestyles  76 digital literacy  18, 125, 482, 485, 701 citizenship and  29, 673, 687 communication and  240t cultural capital and  437–38 digital inclusion and  127 digital resources and  433 five central components of  253 inequalities and  427, 428, 440, 444 issues 65

learning 117 need for  20 of women  121 digitally mediated interactions  479 digital media children use of  615–17 education and  443 employment and  702 health and well-being and  71 health care and  65 implementation 699 interpersonal interaction and  45 production 16 role of  718 social class and  428 social interaction and  232 social life and  430 society and  vi, 718 socio-economic impact of  334 use  61, 427, 460, 716–17 weight loss and  69–70 digital networking  5–6 Digital Object Identifiers. See DOIs digital objects  566 digital only  429 digital ownership  334 digital prayer chapels  20 digital products  329 digital resources, digital literacy and  433 digital restrictions, citizenship  535–36 digital rights  539, 540 digital roll-outs experience of  383–88 jobs and  385–87, 385f knowledge labor and  383, 384f negative impacts of  386, 387f organizational challenges and  388–93, 389f organizational size and sector and  374, 375f, 376, 376t, 377f, 388, 389f, 390 positive impact of  386f regression model  397f success of  384f, 391, 392f, 393, 397f, 397t technology acceptance and  384 digital skills  112, 329, 380–83, 380t, 382t, 428, 512 acquisition of  331 age and  382

index   733 digital divide and  332 gender and  124 inclusion and  124, 125 training 129 types of  332 digital society  6, 329, 620 in books  9, 10t, 11, 11f, 12t first substantive entry  10t in Nexis Uni (News)  8t in NGram  10t, 11f in Proquest Periodicals Index Online  9t in ScienceDirect  7t in Web of Science  7t digital solutions attitudes toward  383–84 implementation of  389f organizational culture and  400 organizational size and  389f by organizational size and sector, increase in  377, 378f SMEs and  374 UK organizations and  374–80, 375f, 376t, 377f UK workforce and  399 uptake  370–73, 373t digital storytelling  531 digital technology  6, 27 in books  9, 10t, 11, 11f, 12t, 15–16 control of  151–52 education and  429 first substantive entry  10t governance and  467 health care and  76 interventions 65 negative impacts of  76 in Nexis Uni (News)  8t in NGram  10t, 11f older people and  136–37, 142–45, 151 opportunity structures of  431 policy 614 in Proquest Periodicals Index Online  9t in ScienceDirect  7t social construction of  517–18 society and  31, 714 UK workforce and  387–88 uptake 329–31 in Web of Science  7t

digital terminology first substantive entry  10t in Nexis Uni (News)  8t in NGram  10t, 11f in Proquest Periodicals Index Online  9t in ScienceDirect  7t in Web of Science  6, 7t digital tools  vii, 45–46, 53, 65, 383, 386f, 526 digitization  4–5, 6, 666 of industry  638 intensification of  501–2 of others  18–19 of self  18–19, 711t, 713 disabilities 17 discrimination  17–18, 421, 514, 709, 711t discursive concept  46 disempowerment 540 disinformation  471, 477, 478, 480 display technologies  194 disruption of AI  678 of automation  678 digital  369–70, 401 distractions cognition and  275 health and  275 distribution  194, 195, 629 DIY citizenship  532–33 DOI. See Diffusion of Innovation Theory DOIs (Digital Object Identifiers)  552 domestication theory  254 domestic energy consumption  193–94, 199, 205, 206 domestic utility systems  205 downloads 549 doxa  427, 438, 445 drugs 271 drug trafficking  22 DSTL. See Defence Science and Technology Laboratory Dutton, W.  26 dynamic network analytics  244 Earth Observing System Data Information System (EOS DIS)  563–64, 563t eco-feedback systems  196 ecological citizenship  199, 208n3

734   index ecological governance  201 ecology of transition  267 economic capital  331, 431, 433, 435–37, 436t, 437f, 440f, 444–45 economic change  332–33 economic context, of automation  661–63 economic growth  332–33 economic inequality  427 economics  333, 663, 675 Economics of the Internet 26 economy  vi, 206, 327 concept pairs  327t digital 429 gig 21 information and  333 information resources and  3 political 325 Economy and Sustainability domain  325 ecosystems 196 eco-visualization 194 editability 330 education  14, 20, 113, 711t applications 615 automation and  677–78, 692–94 of citizens  515 communities and  415 digital inclusion and  121 digital media and  443 digital technology and  429 eEducation 128 health 59 identities and  415 opportunities 438–39 technology and  59–60 eEducation 128 effects 25 in books  21–24 of KSS, indirect  587 media 591 negative  21, 712t positive  22, 712t societal  22, 712t effort expectancy  371 eGovernment 128 eHealth  65, 128 EHIL. See everyday life health information literacy

elderly. See older people election  452, 459, 467, 480, 482 electronic intrusion  92 electronic networks of practice  580 electronic pillbox  137 electronic records  549 elitism 488 emails  4, 93, 251, 267, 303, 306, 530 encryption 539 after work  313 emancipation 451 embodied cultural capital  432 emojis 479 emotional contagion  478–79 emotional loneliness  148 emotional support  22, 358–59 emotions 69 empirical approach  71t, 238t, 335t, 416t, 461t, 516t, 620t empirical methods  52 empirical research, on intelligent machines 357 empirical work, theories of  52 employees, productivity of  313 employment automation and  662, 677–78, 692–94 digital media and  702 discrimination 514 opportunity, social networks and  444 empowerment citizenship  68–69, 533–35, 540–41 communities 64 disempowerment 540 of women  121 encounter of knowing  311 encryption  66, 483, 539 energy  202, 327 Energy Saving Trust  195 engagement  17, 458, 709 Engineering and Physical Sciences Research Council (EPSRC)  629 enjoyment and entertainment, KSS  578 environment digital development and  187–88 human desires and impact on  196 environmental agency  200 envy 79

index   735 EOS DIS. See Earth Observing System Data Information System epistemological approach  70t, 238t, 335t, 416t, 516t EPSRC. See Engineering and Physical Sciences Research Council ESRC-DSTL  660, 663–65, 663t, 667f, 668, 670t, 672, 674–75, 678, 678t, 681t ESRC-NSF  660, 663–65, 663t, 665f, 668–69, 669t, 675, 677, 678, 687–88 ESRC project  vii–ix, 26–31, 77n1, 221. See also specific topics analysis concepts ranked  224t approaches for review  40–52 challenges 43 concept pairs  224t, 226f, 227f cross-cutting challenges  705–7, 705t cross-cutting topics  705t definition of  716 Delphi methods  39, 41–43, 41f initial comments  57–58 key authors  42–43 key literature  42–43 key topics  43 literature analysis  223 methods 40 participants 36–39 review chapters  44 scoping review  39–40, 39t, 42, 222t, 323 stakeholder engagement  38–39, 43–44 Steering Group  37t, 41 systematic literature reviews  44–52 team 36–37 theory 40 theory, method, and approach  238–39, 238t topics 223–37 Wordstat analysis of topics  225t, 325 workshops 43–44 ethics  18, 244, 247, 711t, 717 of AI  346, 353 automation and  680, 695–96 computing and  567 data and representation and  512, 521 of digitalization  513, 521 governance and  637 intelligent machines and  353–56, 358–59 Internet and  613

of LAWS  355 older people and  150 social media and  706 workplace 514 EU Framework  718 European Commission  614 European Commission’s Digital Competence Framework for Citizens  124 European Union, top-down IoT governance in  636–38, 638t evaluation, of theories  706, 717 everyday life  20, 434–35, 612, 711t everyday life health information literacy (EHIL) 118 exclusion  17–18, 421, 532, 709, 711t expertise data and representation and  509 KSS 579–80 self-presentation and  276–77 traits and  277 extrinsic motivation  574–75, 577, 590 Facebook  232, 235–37, 246, 267, 270, 308, 333, 334 algorithms 481 API 482 bias 91 college students and  614 community and  481 Community Standards of  481 data collected by  226–28 depression and  90–91, 276 election and  480 loneliness and  79 norms and  69–70 older people and  146 political engagement and  442, 459 privacy 229 privacy breach scandals of  482 surveillance and  17 use 327 face-to-face communication  81, 234, 267 facilitating conditions  371 facts, data and  554–55, 556 fairness 429 fake news  471, 477–78, 487 false data  555, 559

736   index false positives  86, 89, 100 familiarity  151, 152 family older people  148 planning 126 relationships 20 work and  303–4, 306 fascism  471, 487 FCC. See Federal Communications Commission fear of missing out (FOMO)  271, 276 Federal Communications Commission (FCC) 639 feedback, KSS and  580 feminist citizenship  529 feminist rights  528 Fioretti, Alderman Robert  644 First Amendment  480, 484 Five Star Movement  472 flexibility 577 FOMO. See fear of missing out food preparation by robots  153 provisioning  197, 207 waste  197, 204 Food and You 195 4chan 483 Fox, Martha Lane  38 fragmentation 305 freedom of speech  17 central implications of  485–87, 486t challenges 486t content sharing and  478–80 crisis 472 democracy and  477–78 fake news and  477–78 findings about  476–85 future research  487–88 hate speech and  476 keywords  473, 474t memes and  479 political correctness and  476 privatization of censorship  480–82 Supreme Court and  484 tension points about  485 Twitter and  479 universities and  472

free riders  575 fridge-cams  197, 204 friendship 91–92 networks  273, 414 Furie, Matt  479 future research  712t for communities and identities  417–19, 418t for computer-mediated communication, mental health and  101 data and representation and  518 Delphi process and  72–75, 73t, 244–45, 247, 341, 462–63, 462t freedom of speech  487–88 for intelligent machines  356–59 KSS  584, 590–91 on older people  152–53 scoping review and  72–75, 73t gaming communities  253–54 gaps 713 Garvey, E. G.  555 GDPR. See General Data Protection Regulation gender  18, 276 classification 564 communities and  414–15 digital divide and  111, 122 digital inclusion and  121–22 digital skills and  124 equity 122 ICTs and  115, 122 identities and  414–15, 564–65 IM and  230 inequality 487 Internet and  111, 115, 616 smart phones and  111 transgender 564 General Data Protection Regulation (GDPR)  612, 638 geofencing 512 geographic information systems (GIS)  511–12, 514 geotag 512 geoweb 512 Germany 313 gig economy  21

index   737 GII. See Global Information Infrastructures GIS. See geographic information systems global culture  506 global environments, of automation  676–77 global financial crisis  472 Global Information Infrastructures (GII)  333 global information/network society  14–15 globalization  186, 187, 206, 208n2, 528 global markets  332 global position system of satellite (GPS)  22, 137 Global Voices  531 Gmail 308 GMDSP. See Greater Manchester Data Synchronization program goals, of robots  355 Good Things Foundation  431 Google  334, 556, 648 Google Scholar  44 Gore, Al  477 gout management  67–68 governance  vi, 29, 235, 458, 466, 501, 528. See also IoT governance algorithms and  539 citizenship and  633 data 641–42 datafication and  536–38, 541 definition of  630 of digital health technology  74 digital technology and  467 ecological 201 ethics and  637 gaps in  629, 650 grass-roots 632 IoT, accountability and  628–30 IoT levels of  631–33, 632f issues 703–4 models 633 participatory 633 principles of IoT  630–36 social 635 structures of  201 technology and  629 urban 633 governance and security accountability and  622 automation and  617–19

children use of digital media  615–17 citizenship and  610 communities and  620 concept pairs  607t, 610f concepts 606t Delphi process for  605, 621–24, 622t digital systems used by state  610–13 Internet regulation and  613–15 literature analysis of  606–21 personal information and  612 research challenges  624–25, 625t theory, method, and approach to  619–21, 620t topics  606–8, 607t, 610–19, 623t trust and  622 WordStat tool  608t government censorship 476 citizenship and  535 digital inclusion and  120 intervention 329 punishment 476 role of  vii surveillance and  15, 515, 535 GPS. See global position system of satellite GPS monitoring  482 Graham, N.  26 granularity, data and representation and  509 grass-roots governance  632 gratification 272–73 Greater Manchester Data Synchronization program (GMDSP)  642 Great Firewall of China  535 green behavior  205 grey literature  45, 114–15, 121, 124, 126, 129, 473 groups belonging, KSS and  586–87 dynamics, politics and  413 identities  271, 587–88 identity, KSS and  587–88 membership 413 online group behavior  413 SIDE model of online group behavior  223 user  20, 711t H2020 programs  718 habit, social interaction and  273

738   index hacking 22 Haddon, L.  616 HaLow 643 Handbook of children and the media 26 Handbook of digital games 26 Handbook of mobile communication studies 26 Handbook of porous media 26 Handbook of research on computer-mediated communication 26 Handbook of the psychology of communication technology 26 harassment, online  22 hashtags  236, 479 hate crime  22, 459 hate speech  472, 476, 484 HCI. See human-computer interaction health and well-being  vi, 19, 113, 711t choices and  276 communication and  59, 75, 76 concept pairs  60f, 61f connectivity and  126, 269 delivery, privatization of  57–58 digital  28, 29 digital inclusion and  125–26 digital media and  71 distractions and  275 education 59 inequalities 73–74 personal information and  274 privacy and  66, 75 problematic Internet usage and  268, 270–71 in rural communities  125–26 social media usage and  251 studies 57 unintended consequences with  272 vulnerability and  270–71 health apps, social media and  67 health care digital inclusion and  123 digital media and  65 digital technology and  76 evidence-based 65 literature 60–64 relationships 67 robots and  354 smart phones and  66–68

social media and  64 social support and  68–69 health information literacy (HIL)  118 healthism 67 healthy living  67 Heider, Fritz  301 heterogeneity, data and representation and  509, 548 hidden data  556, 568 high culture  440 higher education  522 high-performance computing  662 HIL. See health information literacy hipster fascism  487 Hogg, David  478 homes access at  370, 381t confidence at  396 smart  143, 149, 186, 207, 629 technology and  399 horizon scanning  v horizontal information seeking  557 household energy consumption  203 HRP Socio-Economic Classification  437f human-computer interaction (HCI)  138, 188–94, 190t, 196–97, 200, 203–7, 299–300, 675 human relations  508 intelligent machines and  346, 347, 349–51, 357–58 resources and  201 with robots  352 humans capital 124 computer/object relationships  203 desires, environmental impacts of  196 geography, big data and  512 intelligence, AI and  347 life 187 machines and  22, 346, 349, 353–56, 661, 666, 676 resource relationships  207 rights  473, 530 robot dynamics  357 robot hybrid teams  351 robot interaction  349–50 robots and  359

index   739 role of  685–86 social relationships  203, 208n2 society and  509, 685–86 technology and  357 human-to-human interaction  350 hyperlinks 552 ICTs. See information and communication technologies identities  v, 14, 19, 28 authenticity and  406, 413 centrality 302 centrality, boundaries and  302 children and  414 CMC and  422 communities and  406, 409f, 410 credibility of  413 cultural 565 Delphi process for  405, 406 digital 413 education and  415 experimentation 252 faceted 307 gender and  414–15, 564–65 group  271, 587–88 online  252, 418t, 422 self and  307 self-expression and  483 smart phones and  414 theft 22 verification of  413 virtual 413 IEBT. See Internet and e-business technologies IHDs. See In-Home Displays illiteracy, digital inclusion and  121 illness 65 IM. See instant messaging imagined communities  231 Impact Factor  138 impacts of automation  43, 323–24, 328, 674 data and representation and  512 social  678–79, 699, 718 implementation 703 digital media  699 of digital solutions  389f

IoT governance  649–50 perceived stress of  396–98, 397t smart cities  643 inattention management  311 incentives, KSS  581 Incheon Free Economic Zone  646 inclusion. See digital inclusion independence, of adolescents  233 indeterminate data  568 India, smart phones in  117 individual benefits and costs  582 individualism  459, 534, 590 industrialization  192, 194 industrial revolution  3–4, 14–15, 661 Indymedia 531 inequalities  29–30, 421–22, 512–13, 701–2 automation and  678–79, 679t, 694–95 culture and  445 definition of  427, 428–29 digital literacy and  427, 428, 440, 444 in everyday life  434–35 formation of  435 forms of  427 health and well-being  73–74 impact of  433, 443 patterns 431 policy and  430 political 453 research 430–35 social distinctions and  426, 431–32, 434 structural 331 information capital  427, 438, 445 citizenship and  481 commodification of  334 communication and  484 digital 4–5 disinformation  471, 477, 478, 480 divide  428, 512 economy and  333 encoding of  4 gaps 428 misinformation  4, 476, 477 norms 482 overload  vii, 22, 45 quality 582

740   index information (Continued ) research 45 resources 3 science 118 search 582 seeking  268, 557 sharing  576, 629 societies 3 studies  236, 522 technology 208 warfare, of Russia  485–86 information and communication technologies (ICTs)  80, 83, 85, 92–93, 99, 331, 699 access 118 big data and  547 digital inclusion and  112 divide 428 gender and  115, 122 information literacy and  118 infrastructure 124 positive aspects of  250–51 work-home boundaries and  300 information-based rights framework 128 information literacy  482, 485, 532 categorization of  114–15 defining 112–13 digital inclusion and  111–14, 125–26, 127–28, 129 ICTs and  118 in journals  118 rural communities and  114 informed consent, data and  514 infrastructure civic 629 of consumption  202 digital  329, 526 ICTs 124 technology 647 In-Home Displays (IHDs)  197, 205 innovation  21, 329, 333 diffusion of  330 methods 717 social impact of  718 insomnia 267 Instagram 79

instant messaging (IM)  230, 308 awareness cues and  310 gender and  230 relationships and  230–31 role of  230–31 instant personalization  515 institutionalized cultural capital  432, 435, 440 integration-segmentation continuum  302–3 intellectualism 488 intelligent machines acceptance of  352–53, 358 adoption of  349, 352–53, 358 applications of  355 augmentation of  358 context of  355, 356–57 cross-cutting requirements of  356–57 defining 346 empirical research on  357 ethics and  353–56, 358–59 future research agenda for  356–59 human relations and  346, 347, 349–51, 357–58 human-robot hybrid teams  351 human-robot interaction  349–50 jobs and  344–45 literature 348–49 multi-disciplinary approach  356 responsibility and accountability for 355–56 safety and risks of  354–55 subversion via practice  358 technology acceptance of  358 intention, behavior and  198 Interactive topic modelling graph  50f, 51f interconnected devices  508 interdisciplinary research  84 interdisciplinary views  vi–vii intergenerational interaction, with older people 147 intermediality 254 International Bibliography of the Social Sciences 44 International encyclopedia of digital ­communication and society (Mansell et al.)  26 internationalization 528

index   741 International World Wide Web Standards Community W3C  568 Internet  4, 5, 79. See also problematic Internet usage access 116 addiction  89, 98, 275 adoption  120, 126, 128 in Africa  111 broadband access  128 censorship 704 children and  615–17 citizenship and  531–32 communication and  515 community and  116 connection quality  275 ethics and  613 filtering 334 gender and  111, 115, 616 governance and security and regulation of 613–15 interventions 61–62 literacy 125 loneliness and  233–34 mental health and  82 older people and  144, 153 openness of  334 perceived benefits  329 politics and  516 rural communities and  117 self-expression and  531 skills 125 smart phones and  123 social class and  438f social resources and  82 of Things  250 usage, socio-economic aspect of  616 use  82, 252 women and  122 Internet and e-business technologies (IEBT) 329 Internet studies 26 interpersonal interaction, digital media and 45 The intersectional Internet (Noble & Tynes) 26 intersex 564 Intertopic Distance Map  50f

intimacy  19–20, 23, 268 intrinsic motivation  574–75, 577, 581, 590 investments digital infrastructure  329 venture capital  329 invisibility 482–85 involvement, citizenship and  633, 642 IoT governance accountability and  628–30, 633–34 advancement 637 case studies of regional and national  636–42, 649–50 challenges 631 in Chicago  643–46 democratization and  650 deployments 632 devices and  645–46 European Union top-down  636–38, 638t implementation 649–50 legitimacy and representation  633 levels  631–33, 632f local deployment  643–49, 650 in media  629 in New York, United States  648–49 policy  631, 640 principles of  630–36, 649 requirements 632 social 635 in Songdo, South Korea  646–47 standardization  637, 640 surveillance and  634 theme usage  635–36, 636t tools  631, 632f transparency and  634–35, 638, 651 trust and  638 UK top-down  641–42, 642t United States top-down  639–40, 640t IT skills gap  370 Jam & Justice project  633 jobs AI and  352, 353 digital roll-outs and  385–87, 385f intelligent machines and  344–45 journalism, citizen  527, 531 Journal of Big Data  558–59, 559t

742   index journals 25 CMC, mental health and in  94f, 95, 95f, 100–101 digital inclusion in  127 information literacy in  118 older people in  137–38 psychological 84 Juniper Research  643 Keen, Andrew  477 keyword assignment  566, 568 keyword searches  556–57 Kids Online (Livingstone & Haddon)  616 K-Means clustering  381, 382t knowledge 569 bias and  548 communities 409 creation, non-coherence  509–10 labor  4, 345, 348, 383, 384f organizational knowledge management system 586 societies 3 knowledge sharing systems (KSS) altruism 578–79 anonymous 587 behavior 590 collective costs and benefits expected  588 community and  580 components of  574 contextual factors of  588–90, 592 cost reduction with  573 digital 591 distinctions  574–75, 592 enjoyment and entertainment  578 expected individual benefits and costs  582 expertise 579–80 feedback and  580 framework 574–76 frequency 578 future research  584, 590–91 group belonging  586–87 group identity  587–88 incentives 581 indirect effects of  587 literature 589 motivations for  573, 592f online 584

other-oriented motivations  582–88 public goods and  575–76, 590 quantity 583 reciprocity  583–84, 590 reputation and  580–81 rewards 586 self-efficacy and  588–89 self-oriented motivations  576–82 social comparison  584–85 sociality and  586–87 social loafing  575, 582, 584, 585–86 socially-translucent 586 technical aspects of  574 trust and  589 venue of  589–90 labor knowledge  4, 345, 348, 383, 384f market  332, 338 of robots, slave  661 landlines 232 language, thought and  46 laws  21, 528, 605, 712t LAWS. See Lethal Autonomous Weapon System leadership  371, 390, 391, 392f, 400 communities 409 private sector  639, 649 by sector  393f learning  20, 711t digital literacy  117 of machines  344–45 older people and  143, 144 social media and  271–72 legislation  359, 703 legitimacy and representation, IoT governance 633 Lethal Autonomous Weapon System (LAWS) 355 leveraging on technology  314 lexical relationships  77n1, 247n1, 341n1, 423n1, 467n1, 524n1, 626n1, 718n1 LGBTQ 18 LGBTQ rights  528 liberal media  485 liberation technology  530–31 libertarianism 480

index   743 librarians 568 libraries, digital inclusion and  121, 123, 128 life datafication of  536–38 opportunities 438–39 satisfaction, social media and  269 work and  21, 23 lifestyles  76, 198–200, 207 limited users  439t linguistic differences  223 linguistic DNA  46 LinkNYC 648 literacy  222, 427, 440 literature analysis 58–70 analytic approach  71t citizenship 453–60 on communities  407–15 concept pairs  59t concepts ranked  58t of data and representation  502–16 Delphi process  324–36, 324t digital divide and  115 digital inclusion  115–16 empirical approach  71t epistemological approach  70t ESRC project  42–52, 223 grey  45, 114–15, 121, 124, 126, 129, 473 health care  60–64 intelligent machines  348–49 measures and measurement  64–66 media mastery  256 smart phones  66–68 social support  68–69 theory, method, and approach  70–72, 71t topics  58–70, 58t, 62t volume of  vii weight loss  69–70 live-research 516 Livingstone, Sonia  615, 616 local deployment, IoT governance  643–49 Logan, Jessica  268 loneliness Facebook and  79 Internet and  233–34 of older people  148–49 LoRaWan 643

loss 273 lurking 586 machines. See also intelligent machines; robots humans and  22, 346, 349, 353–56, 661, 666, 676 learning of  344–45 networked 344 technology 350 malinformation 477 Mansell, R.  26 manual topic selection and merger  88 manufacturing, robots and  15 marketing 613 mass media  472 materiality  201, 202–3, 206, 569 Matilda 350 measures and measurement  64–66 media buzzwords 473 comparison, social influence and  277 consumption and  232 dual nature of  252 effects 591 hybrid systems  460 ideologies 254 IoT in  629 liberal 485 literacy  253, 485 manifold 254 mass 472 moral panic and  232 polymedia 254 studies 452 medialife 254 media mastery  250 access and  266 adolescents and  252 balance and  251–52 boundaries and  268–71 complexity of  278 concept 252–56 constraints 271–72 content management  272–74 context of  252–53, 257 definition of  251–52

744   index media mastery (Continued ) factors 257 literature 256 materials and coding  256–66, 257t, 260t obstacles to  274–75 paradoxes, tensions, and contradictions in  252, 712t sample description  259, 260t, 266 structurational theory and  251–52 typology  257, 257t usage awareness  275–78 Media Mastery Project  255 mediapolis 254 Media Richness theory  117 mediatization 254–55 medical data  553 medical information systems  60 memes 479 mental health  275. See also computer-mediated communication, mental health and AICPs and  354 CMC, mental health and  96–98, 96f, 97t, 101 CMC and  79–80 computer game and  65–66 definition of  84 impairment of  84–85 Internet and  82 of older people  153 SNS and  91 mental illness  80–81, 85 mental thriving  81 meta-attention 253 metadata  100, 557, 565–68 methods innovation  717 microboundaries 314–15 microprocessors via integrated circuits  5 millennials  399, 400, 472 mindfulness 253 misinformation  4, 476, 477 missing areas  707–8 mobile phones. See smart phones mobile technology data and representation and  509 digital inclusion and  122–23 mobility communication technology and  304–5 work-home boundaries and  304–5

Mohammed, Shaheed Nick  477 monetization 536 monitoring technology  149–50, 151 mood management  85 moral panic, media and  232 mothers 122 motivated attention  253 motivational theories  577 motivations 198 extrinsic  574–75, 577, 590 intrinsic  574–75, 577, 581, 590 for KSS  592f other-oriented motivations KSS  582–88 rewards 581 self-oriented KSS  576–82 MUDs (multi-use dimensions)  414 multi-device ecology, work-home spaces and 305–7 multidimensional networks  330 multidimensional scaling  50f multi-disciplinary approach  356 multimodality  142, 243–44 multi-organization structures  333 multi-platform ecologies, work-home boundaries and  307–9 multi-platform/holistic studies  707, 717 multiplayer games  414 multitasking  272–74, 278 multi-use dimensions. See MUDs musculoskeletal symptoms  272 MyMadison.io 645 Myspace 614 narrative exchange  531 NASA protocol, for data  562–64 national borders  534 national consciousness  528 nationality, citizenship and  539 National Science Foundation  30, 43, 324 native data  550, 566 native speakers, of chatrooms  234–35 natural phenomena, measuring  564 natural resources, depletion of  187 nature, culture and  187 negative effects  21, 712t negative impacts of digital roll-outs  386, 387f of digital technology  76

index   745 neoliberalism 529 Netflix 333 Net Inclusion Summit  127 net neutrality  515, 704 networked individualism  534 networked machines  344 networked privacy  229 networking 15 networks  200–203, 452, 454 communities 409 dynamics 459 friendship  273, 414 of practice  576 sociability 407 society  416, 458, 460, 706 theory  14, 330 New Public Management  620 new social era  709t New York, United States, IoT governance in 648–49 Nexis Uni (News)  8t NFC-based mobile payment system  646 NGram Viewer  9, 10t, 11f Nix, Alexander  477 Noble, M.  26 noise, data and  553 nomophobia 91 non-coherence of knowledge creation 509–10 non-ESRC themes  659, 699, 703–14, 709t, 715t, 716f. See also specific themes non-users  438–39, 438t, 440 nonverbal cues  478 norms 198 citizenship 237 communication and  240t Facebook and  69–70 information 482 work 302–3 notifications 576 NRS social grades  436t, 438f NS-SEC classifications  436, 436t, 437f Nuffield Foundation  706 obesity 69–70 objectified cultural capital  432 objectivity big data and  547

data and  561 data and representation and  510 objects, classification of  565 occupational psychology  315 Occupy Data  540 Occupy Wall Street  459 OECD. See Organization for Economic Co-operation Development Ofcom Media Literacy Survey  381, 436, 438 offline, online and  433, 485 older people  20 alienation of  144 assisted care of  149–50 communication and  144, 145–48, 151 computers and  141, 143, 144, 153 conferences and  137–38 data collection and  151–52 digital technology and  136–37, 142–45, 151 ethics and  150 Facebook and  146 facilitating interaction with  146 familiarity and  151, 152 family 148 interaction with mainstream technology  139–42, 151 intergenerational interaction with  147 Internet and  144, 153 in journals  137–38 learning and  143, 144 loneliness of  148–49 mainstream technology and  139–50 mental health of  153 monitoring technology and  149–50, 151 motivations for interaction with  146–47 multimodality and  142 physical health of  153 physical interaction with technology  141 PSR 136 quality of life in  143, 149–51, 618 reminiscing of  148 research on  137–39, 138t, 140t, 150–54 robots and  350 search engines and  144 security and  142, 145 self-efficacy of  144–45 smart phones and  141, 143, 148 SNS and  145–46, 151 social interaction and  145–48, 151

746   index older people (Continued ) social networking and  145 spoken dialogue interaction and  141–42 stigma and  145 tablets and  141 technology acceptance and  143–44 technology types and  143 technology value and  144 television and  153 terms for  139t working age and  136 online collaboration 531 communication  478, 539 communities  406, 410–11, 414 expression  16, 23 group behavior  413 harassment 22 identity  252, 418t, 422 interaction 223 KSS 584 leisure activities  266 news sites  552 offline and  433, 485 privacy 229 public sphere as  17 radicalization 476 research 64–65 security issues, older people and  145 self 233 social interaction  69 third spaces  472 Open Data  540 Open Data Executive Order  644 OpenGov Foundation  645 open source software  581 opportunity structures, of digital technology 431 optical character recognition  550 orality 222 organizational challenges digital roll-outs and  388–93, 389f of SMEs  389 technology acceptance and  372 organizational culture digital solutions and  400 measures of  394, 394t, 396

positive 398 strategy and  399–400 technology acceptance and  372 transformation of  401 organizational factors  396 organizational knowledge management system 586 organizational rewards  586 organizational size, digital solutions and 389f organizational size and sector digital roll-outs and  374, 375f, 376, 376t, 377f, 388, 389f, 390 digital solutions increase  377, 378f Organization for Economic Co-operation Development (OECD)  124 organizations  20–21, 712t original data  566 origins 27–28 othering 472 others, digitization of  18–19 outdated data  552 overjustification hypothesis  581 overuse 267 Oxford handbook of Internet psychology 26 The Oxford handbook of Internet studies 26 Oxford Internet Institute  473 Palmer 195 Palo 649 paradoxes 23 parallelism 305 parental monitoring  20 parents 615–16 Parkland school shooting  478 Paro  354, 356 participation  14, 17, 23–24, 458, 709 citizenship  531, 533, 633 civic 118 communities 409 in democracy  532 political  459, 533 participatory governance  633 partisan interaction  459 PAS. See Publicly Available Specification

index   747 patient safety  64 PDA (Personal Digital Assistant)  314 peer-to-peer file and bookmarking system 585 perceived boundary control  302, 306 perceived stress, of implementation  396–98, 397t performance expectancy  371 persistence 330 personal availability  309–11 personal coaching  70 Personal connections in the digital age (Baym) 26 Personal Digital Assistant. See PDA personal experience  372 personal health  67 personal information governance and security and  612 health and  274 self-presentation and  274 sharing 228 persuasion 208n3 PETRAS IoT research hub  642 phones. See smart phones phubbing 91 physical boundaries  301 physical health  81, 153 physical interaction with technology, older people and  141 physical robots  348 picture-based authentication  142 piracy 268 place, space and  305 platforms 704 affordances, communication and  240t capitalism 536 economics 333 multi-platform ecologies and work-home boundaries 307–9 multi-platform/holistic studies  707, 717 sharing 536 social media  228–29, 232 plausible accounting  310–11 plausible deniability  310–11 Pointwise Mutual Information  46 polarization 23–24

policy  21, 466, 660, 712t, 714 AI 359 automation 674–75 digital divide and  118 digital inclusion  111, 119, 127 digital inequalities and  430 digital technology  614 economics 663 evidence-based 707–8 implication 707–8 IoT governance  631, 640 privacy  334, 613, 645 research 430 work-home boundaries  313 political capital  433 political correctness  472, 476 political debate  531 political economy  325 political engagement  442, 452, 459, 526 political inequality  453 political interventions  531 political participation  459, 533 political parties  472 political science  466 political studies  452 politics  v, 29 automation and  673 citizenship and  452, 459 climate of  471 communication and  236–37, 415, 459–60 communities and  529 Delphi process and  417, 451 of design  208 group dynamics and  413 Internet and  516 radicalization of  471 social media and  478 truth and  477 Twitter and  458, 467, 516 polymedia 254 polysemy 557–61 population demographics  193 population growth  187 populism  471, 487, 488 pornography 274 positive effects  22, 712t positive impact, of digital roll-outs  386f

748   index postmodernism 483 Potential Support Ratio. See PSR power dynamics, citizenship and  541 Powering the nation (Palmer & Terry)  195 power relations, in communities  533 practices 200–203 pre-data  549–50, 554, 566 predictive risk models  537 preemption 538 pregnancy 126 PREMIS Data Dictionary for Preservation Metadata 566 privacy  247, 276, 451, 605, 611, 630 of adolescents  229 algorithms and  645 breaches 642 breach scandals  482 of children  616 concerns 17 devices and  645, 648 enhancing tools  539 Facebook 229 FCC and  639 health and well-being and  66, 75 networked 229 online 229 policy  334, 613, 645 protection 638 rights  541, 616 risks 615 robots and  18 SNS and  146, 244, 576 social implications of  518 social media and  229 steganography and  483 technology and  644 WiFi and  648 private sector leadership  639, 649 privatization of censorship  480–82 of health delivery  57–58 problematic Internet usage  98, 254 access and  267 attitudes about  276 computer-mediated communication, mental health and  82, 89 connectivity and  269 costs of  275

gratification and  273 health and  268, 270–71 self-regulation and  277–78 procrastination  272, 273 production scales  201 productivity 315 automation and  662f, 677 of employees  313 professionalism, reputation and  581 professions, automation and  677–78, 684–85 propaganda  4, 473, 480 Proquest Periodicals Index Online  9t protest 530 proximate 512 PSR (Potential Support Ratio)  136 psyche 479 psychological boundaries  301 psychological journals  84 psychological theories, communities and identities and  416 psychological well-being (PWB)  81, 84–85, 96, 99, 101 psychology  84, 195 psychopathology (PTH)  81, 84–85, 88, 89, 92, 96, 99, 101 public goods, KSS and  575–76, 590 public health interventions  61 public libraries  116 Publicly Available Specification (PAS)  641 public-private partnership model  647 public safety  643 public space  305 public sphere, as online  17 punishment, government  476 purpose 27 PWB. See psychological well-being Python 47 quality of life, in older people  143, 149–51, 618 racism  471, 479 radicalization  473, 476 rational actor theory  591–92 rational choice  194–98 raw data  550–51, 553, 564, 566 “Raw Data,”  551 readers x reciprocity, KSS  583–84, 590

reemployment  662
regression model, digital roll-outs  397f
regulation  21, 703, 712t
related work  25–26
relationships  v, 19, 23, 222, 276, 711t, 713
  CMC, mental health and  91–92
  communication and  240t
  family  20
  health care  67
  human-resource  207
  human-social  203
  IM and  230–31
  lexical  77n1, 247n1, 341n1, 423n1, 467n1, 524n1, 626n1, 718n1
  management of  240t
  multimodality and  243–44
  social media platforms and  232
  themes about  714–16
  WEF and  188, 193, 204
reminiscing, of older people  148
REPLICATE project  631–32
representation  vi, 642, 646, 647, 649. See also data and representation
Republican Tea Party  459
reputation
  of community  588
  KSS and  580–81
  sexual  617
research. See also challenges; future research
  on CMC, mental health and  81–83
  cross-cutting questions  701–5
  on digital divide  112, 332
  eHealth  65
  inequality  430–35
  information  45
  integration  86
  interdisciplinary  84
  methods  72t
  on older people  137–39, 138t, 140t, 150–54
  online  64–65
  policy  430
resilience  206–8
resources  188–89
  conservation and  198
  consumption and  197, 199, 202
  distribution of  194, 195
  human relations and  201
  sustainability and  206–8
  usage of  192–95
responsibility, for intelligent machines  355–56
revenge porn  274
rewards  581, 586
RFID tags  5, 146
rights
  citizenship  528
  civil  530
  to disconnect  637
  feminist  528
  human  473, 530
  LGBTQ  528
  privacy  541, 616
risks
  of children  616
  of intelligent machines, safety and  354–55
  privacy  615
Robotic Process Automation (RPA)  347, 352
Robotized Stereotactic Assistant (ROSA)  350
robots  143, 149, 344
  autonomy of  351, 619
  child development and  354
  defining  347–48
  food preparation by  153
  goals of  355
  health care and  354
  human relations with  352
  humans and  359
  manufacturing and  15
  mobile  345
  nannies  354
  older people and  350
  physical  348
  privacy and  18
  slave labor and  661
  social  347–48, 354
  trust in  352–53
roles
  boundaries and  302
  of citizenship  530, 532
  conflict  309, 312
  of digital media  718
  of government  vii
  of humans  685–86
  IM  230–31
  work-home conflict and  303

ROSA. See Robotized Stereotactic Assistant
Routledge handbook of Internet politics  26
RPA. See Robotic Process Automation
rule of law  612
rules  314, 528
rumors, social media and  477
rural communities
  digital inclusion and  119
  health in  125–26
  information literacy and  114
  Internet and  117
  women’s health and well-being in  113, 121–22, 125–26, 129
rurality  119
Russia, information warfare of  485–86
safety and risks, of intelligent machines  354–55
Sage handbook of social media  26
Salganik  26
salon events  41, 43, 606
sanitation  187
ScienceDirect  7t
scoping review  80
  CMC, mental health and  85–86, 99
  communication and  86
  about communities  406
  for communities and identities  417–19, 418t
  data and representation and  518
  Delphi process  240t, 241, 337–38, 338t
  ESRC project  39–40, 39t, 42, 222t, 323
  future research and  72–75, 73t
Scopus  114
SCOT. See social construction of technology
screen time  99
screen use  22
search
  algorithms and  15, 18, 21
  keyword  556–57
  older people and  144
  terms, CMC, mental health and  87t
secondary audiences  64
security  vi, 66
  cyber  352–53
  by default  641
  FCC and  639
  older people and  142, 145

self
  availability of  268
  digitization of  18–19, 711t, 713
  identity and  307
  saturated  483
self-awareness  585
self-broadcasting  268, 269–70
self-categorization theory  583
self-censorship  482–85
self-concept  583
self-construction  539
self-control  278
self-diagnosis apps  67
self-disclosure  91, 266, 270
self-efficacy  144–45, 330, 587, 588–89
self-esteem  267, 577
self-expression  276–77
  identity and  483
  Internet and  531
  memes and  479
self-injury  271
self-monitoring  70
self-oriented motivations  576–82
self-presentation  268, 269–70, 585
  expertise and  276–77
  personal information and  274
  on SNS  82
self-preservation, choices and  276
self-regulation  277–78, 637
Seoul  646
service delivery  429
service sector  345
service work, automation of  348
sexting  276, 616–17
sex trafficking  22
sexual behavior  276
sexual intercourse  271
sexual reputation  617
shared behavioral routines  200
shared space  305
sharing platforms  536
Short Warwick-Edinburgh Mental Well-Being Scale (SWEMWBS)  124
SIDE model of online group behavior  223
Silicon Valley  480
simulations  354
Singapore  647

singularity  19
skills, automation and  677–78, 692–94
Skype  137
slave labor, robots and  661
slaves  565
sleep  93, 99
sleep deprivation  272
slut shaming  617
Small and Medium Sized Enterprise (SME)  38, 328–29, 337, 370, 374, 389
Smart and Sustainable Cities and Communities’ Coordination Group (SSCC-CG)  637
Smart Chicago Collaborative  644–45
smart cities  629, 640, 642, 643, 645, 651n22
smart homes  143, 149, 186, 207, 629
smart machines  348
smart metering  204
“Smartphone by default” internet users  123
smart phones/mobile phones  6, 23, 79, 369
  addiction to  267, 269, 270, 272
  awareness cues and  310
  citizenship and  414
  CMC, mental health and  91
  communication and  266, 307
  communities and  414
  costs of  275
  digital inclusion and  122–23
  emotional attachment to  91
  gender and  111
  health care and  66–68
  identities and  414
  in India  117
  Internet and  123
  intimacy and  19–20
  literature  66–68
  location of  482
  market growth of  307
  older people and  141, 143, 148
  overuse of  267
  self-monitoring  70
  social space and  509
  students and  267–68
  use  91
  women and  123
  work and  300, 306, 308–9
smart scales  70

Smart Seoul  646
smart TV  537
SME. See Small and Medium Sized Enterprise
SnapChat  270
Snowden, Edward  529
SNS. See social networking sites
social activity, digital inclusion and  121
social anxiety  91
social attitudes
  toward AI  671, 671t
  toward automation  671, 671t, 681–82
social barriers, to digital inclusion  121
social behavior  98, 222
social capital  22, 91, 237, 268, 270, 331, 427, 431, 441–42, 445
social change  14, 186, 458
social class  419
  digital media and  428
  formation of  426
  groupings  444
  Internet and  438f
  social media and  437f, 442, 443f, 444f
social comparison  79, 584–85
social compensation  91
social competence, of adolescents  233
social construction
  of algorithms  518
  of data  518
  of digital technology  517–18
  of technology  255
  work and  302–3
social construction of technology (SCOT)  300, 315
social context  93, 229, 661–63
Social Credit System  482
social cues  222–23
social displacement  82
social distinctions, digital inequalities and  426, 431–32, 434
social dynamics  513
social environment  445
social era, names for new  14–15
social exchange theory  577–78
social exclusion  331, 426–27, 434
social expectations  269, 309
social facilitation  585
social governance  635

social identity theory  583, 588
social impacts  678–79, 699, 718
social implications, of privacy  518
social inclusion  124, 513
social influence  371
  contradictions and  271
  media comparison and  277
  multitasking and  274
  social media and  271
social interactions  14, 80
  boundaries and  300
  digital media and  232
  habit and  273
  management of  231
  older people and  145–48, 151
  online  69
  social media and  269
social involvement  82
social issues, automation and  678–79, 682–83
sociality, KSS and  586–87
socialization  20, 232–33, 271
social justice  118
social life, digital media and  430
social loafing  575, 582, 584, 585–86
social loneliness  148
social media  473
  adolescents and  229
  class and  442, 443f
  communication  23
  community and  64
  companies  480
  contradictions of  271
  debates  526
  ethics and  706
  health apps and  67
  learning and  271–72
  life satisfaction and  269
  notifications  253
  platforms  228–29, 232
  politics and  478
  privacy and  229
  regulations  480–81
  rumors and  477
  self-esteem and  267
  social class and  437f, 444f
  social influence and  271
  social interactions and  269
  students and  267–68, 270
  traffic  536
  transitions and  270
  usage  251–52
  visualizations  512
social movements  415, 485, 533
social networking sites (SNS)  64, 137, 253, 704
  analysis  226, 227–28, 230, 235–37, 459, 512
  computer-mediated communication, mental health and  82, 90–91
  defining  235, 511
  employment opportunity and  444
  functions  61
  key characteristics of  511
  mental health and  91
  older people and  145–46, 151
  privacy and  146, 244, 576
  self-presentation on  82
  social capital and  441–42
social psychology  195
social relations  266
  communication and  272
  connectivity and  268–69
  gratifying  272–73
  loss and  273
  usage and  276
social resources, Internet and  82
social robots  347–48, 354
social science  v, 45, 196, 200, 208n2, 510
  automation and  680, 696
  theories  706
Social Sciences Citation Index  256
social shaping, of technology  255
social sorting  538, 541, 612
social space, smart phones and  509
social support  82, 91, 268, 271
  adolescents and  233–34
  CMC and  234
  health care and  68–69
  literature  68–69
social theory  viii
social welfare system  430
social wellbeing  270
societal effects  22, 712t
society. See also digital society
  citizenship and  541
  digital media and  vi, 718
  digital technology and  31, 714
  humans and  509, 685–86
  networks  416, 458, 460, 706
  softwarization of  15
Society & the Internet (Graham & Dutton)  26
socio-economic status  442
  of digital divides  513
  of digital media, impact of  334
  of Internet usage  616
socio-emotional context  223
sociological theories, communities and identities and  416
sociology  236, 517, 706
socio-technical networks  202, 203
softwarization, of society  15
Songdo, South Korea, IoT governance in  646–47
source data  552–54, 566
space, place and  305
space shift, work place and  304–5
spatialities  512
speech  46, 223
spoken dialogue interaction, older people and  141–42
spontaneity  577
SSCC-CG. See Smart and Sustainable Cities and Communities’ Coordination Group
stakeholder engagement  38–39
state, digital systems used by  610–13
status updates  268
steganography  483
stigma  143, 145
storytelling  16
strategy, organizational culture and  399–400
stress  309
structural inequality  331
structurational theory  251
structure, agency and  201
structure-agency dualism  201
students  267–68, 270
successful communication  391
suicide  100, 268, 269, 274
superconnected cities  641
super-setting  333
supply chain technology  628
supporting materials  ix
Supreme Court, U.S.  484

surveillance  17, 150, 334, 518
  automation and  612
  big data and  612
  capitalism  537
  centralized  612
  citizenship and  535, 542n1, 612–13
  data and  481–82
  everyday life and  612
  Facebook and  17
  government and  15, 515, 535
  IoT governance and  634
surveys  64–65
sustainability  vi, 323, 337, 341
  consumption and  192, 199, 201, 208n4
  design and  191, 193, 196, 205–6
  HCI and  188–94, 190t, 196, 197, 200, 203–7
  lifestyles and  199
  resources and  206–8
  technology and  325
SWEMWBS. See Short Warwick-Edinburgh Mental Well-Being Scale
symbolic analysts  345
syntopian perspective  713
system design  672t, 679–80, 683–84
tablets  141
technism  vi, 700
technocratic rationality  192
technological determinism  vi, 233, 518, 699
technological limitations, automation and  686–87
technological unemployment  662–63
technology  138
  adoption of  330
  appropriate use of  438
  automation  676–77
  change of  703
  characteristics  16, 710t
  co-designing  244
  culture and  676
  dependence on  150
  development, automation and  696–97
  education and  59–60
  governance and  629
  home and  399
  human and  357
  infrastructures  647
  leveraging on  314
  machines  350
  older people and mainstream  139–50
  older people interaction with mainstream  139–42, 151
  privacy and  644
  rules for  314
  social behavior and  98
  social construction of  255
  social shaping of  255
  standards  333
  sustainability and  325
  types, older people and  143
  value, older people and  144
  venues  15–16, 710t
  work and  399
technology acceptance  370
  digital efficacy  380–83, 380t, 382t
  digital roll-outs and  384
  ESRC-DSTL  672t
  factors of  371–72
  of intelligent machines  358
  key elements of  382
  models  371, 387, 398
  older people and  143–44
  organizational challenges and  372
  organizational culture and  372
  survey and analysis methods  372–74
  in UK workforce  369, 372
Technology and engagement (Rowan-Kenyon and Alemán)  267
techno-resistance  271–72
technostress  93
teenagers. See adolescents
telecare  143, 145
telecommunications  vi
telephone  4, 232
telepressure  93
television  85, 142, 146, 148, 153
teleworkers  304–5
temporal boundaries  301
temporal properties, of communication  311
tensions  23
terrorism  24, 459, 487
Terry, N.  195

Tesler, Larry  347
texting  92, 277, 539
theories  viii, 70–72, 416, 709t
  books, conceptualization and  12t, 14–15
  conspiracy  478
  Delphi process  335–36, 335t
  development of  52, 706, 717
  of empirical work  52
  evaluation of  706, 717
  motivational  577
  social science  706
  testing  706, 717
thought, language and  46
thoughtfulness  24
3D printer  5–6
3G wireless connections  120
Todo Chile Comunicado (All Chile Connected)  120
top-down strategies for management of boundaries  312–13
touchscreens  141
trade  333
training  711t
traits, expertise and  277
transactional interactions  508–9
transactive memory theory  577
transgender  564
transitions, social media and  270
transnationalization  528–29
transparency  646, 647, 649
  about algorithms  67, 480
  data  562
  IoT governance and  634–35, 638, 651
tree map, of concept pairs  49f
troll farms  480
trolling behavior  92
Trump, Donald  452, 482
trust  605
  in AI  352–53
  algorithms and  501
  automation and  672–74, 673t, 685, 687
  governance and security and  622
  IoT governance and  638
  KSS and  589
  in robots  352–53

truth
  data and  559–60
  function  555
  politics and  477
Turing Deceptions  355–56
Twitter  228, 235, 246, 334, 436, 441–42
  citizenship and  458
  communication and  236
  community and  231–32
  connectivity and  231
  freedom of speech and  479
  metrics of interaction on  236
  politics and  458, 467, 516
Uber  480
unavailability  311
unemployment, technological  662–63
unintended consequences, with health  272
unintended effects  23
United Kingdom (UK)  193, 208n1
  Defence Science and Technology Laboratory (DSTL)  43
  Digital Economy program  718
  Economic and Social Research Council  v, 27–28, 30, 36
  Industrial Strategy Challenge Fund  718
  organizations, digital solutions and  374–80, 375f, 376t, 377f
  top-down IoT governance in  641–42, 642t
United Kingdom (UK) workforce
  attitudes  387–88
  digital solutions and  399
  digital technology and  387–88
  organizational barriers and  371
  perception of digital roll-outs success in  397f, 397t
  technology acceptance in  369, 372
United States (U.S.)  484, 639–40, 640t
universities
  freedom of speech and  472
  as pre-criminal space  487
University of Liverpool  30
unprocessed data  564
urban culture  506
urban governance  633
urbanization  187–88
URL references  552

usage awareness  275–78. See also problematic Internet usage
user data  16–17, 710t
user groups  20, 711t
utility companies  193
utopia  660
values  198–200, 207
venture capital investments  329
venue, of KSS  589–90
verification, of identities  413
viral content  478–79
virtual communities  583, 590
virtual identity  413
virtual reality  347
virtual settlement  231
visibility  329–30
visualizations  ix, 194
  data  77n1, 247n1, 341n1, 423n1, 467n1, 509, 510, 524n1, 626n1, 718n1
  tools  205
voicemail  314
voting  459
vulnerability, health and  270–71
Walport, Mark  641
Wards clustering method  381
Waste and Resources Action Program (WRAP)  196
wasting time  24
Water, Energy, and Food (WEF)  187, 202
  production of  205
  relationships and  188, 193, 204
  resources  195
  sustainable HCI and  203–6
  tipping point in  188
  use  199
waterbot  194
water power  661
ways of being  222
“Ways of Being in a Digital Age,”  v, 28, 36
wealth  4, 329
webcam activity  536
web literacy  515
Web of Science  7t, 44, 114
WEF. See Water, Energy, and Food
weight loss  69–70

Weiser, Mark  628
well-being  vi, 29
WhatsApp  308, 310
whistleblowing  530–31
white nationalism  479
whole populations, data and representation and  509
WiFi  641, 648–94
Wikipedia  531, 578, 580, 584
women
  computers and  123
  digital literacy of  121
  empowerment of  121
  health and well-being in rural communities, digital inclusion of  113, 121–22, 125–26, 129
  Internet and  122
  smart phones and  123
Women and the Web  122
WordStat tool
  analysis  47–48, 52f, 326t
  analysis of ESRC project  225t, 325
  citizenship and  456t
  of communities  411t
  data and representation  504t
  governance and security  608t
  topics analysis  62t
work  20–21, 712t
  attitudes at  398
  automation and  336, 344–45, 429, 677–78, 684–85
  awareness of  310–11
  CMC, mental health and  93
  communication and  312
  confidence at  372, 381t, 396, 400
  digital tools at  383
  emails after  313
  end to  660
  environments  93, 299, 353
  expectations and personal availability  309–10
  family and  303–4, 306

  flexibility  304
  life and  21, 23
  norms  302–3
  place, space shift and  304–5
  pressures  309
  smart phones and  300, 306, 308–9
  social-constructionism and  302–3
  spaces  304–5
  technology and  399
work-home boundaries  299–300, 301, 315
  awareness of work and personal availability  310–11
  boundary theory and  301–3
  communication technology and  304–11
  conflict  303–4, 309
  enrichment  304
  ICTs and  300
  mobility and  304–5
  multi-device ecology and space in  305–7
  multi-platform ecologies and  307–9
  policy  313
  productivity and  313
  segmentation of  312–13
  spaces, multi-device ecology and  305–7
  terminology about  300–301
  violations of  314
  work expectations and personal availability in  309–10, 314
working age, older people and  136
workplace
  digital culture  394
  ethics  514
workshops  43–44
WRAP. See Waste and Resources Action Program
writing  223
young people  228, 232–35
YouTube  478, 481
Zuckerberg, Mark  483