Black Communication in the Age of Disinformation: DeepFakes and Synthetic Media

This book explores the consequences of the changing landscape of media communication on Black interactions in the virtual space.

Table of contents:
Foreword
Acknowledgments
Contents
Notes on Contributors
List of Figures
List of Tables
Chapter 1: Of Deepfakes, Misinformation, and Disinformation
The Open Curtain to Digitalization in Africa
How Did We Get Here?
Online Disinformation and Africa
Elections Coverage and the War of Fakes
“Fake News” Palava and E-journalism post-McLuhanism
Media Bubbles, Polarization, and Fake News
Ethics, Fake News, and Real News Tug of War
Rationale for This First Edition on Deepfakes
References
Chapter 2: The African Youth and Social Media at the Crossroads of Information, Misinformation, and Disinformation
Introduction
Youth Bulge and Social Media Usage in Africa
Youth Social Media Usage in Context
“Development” Versus “Envelopment” of Africa in the Digital Age
The Productive Dimensions of the Youth-Social Media Nexus
Overview of Social Media Use in Africa
Social Media, Youth Innovations, and Development
The Counterproductive Dimensions of the Youth-Social Media Nexus
Misinformation and Its Consequences
Disinformation and Its Consequences
Cyber Bullying, Abuse, and Harassment
Conclusion and Recommendations
References
Chapter 3: Artificial Intelligence for Social Evil: Exploring How AI and Beauty Filters Perpetuate Colorism—Lessons Learned from a Colorism Giant, Brazil
Introduction
Artificial Intelligence and Deep Fakes
Deep Fakes and Social Media
Efforts to Address the Issue
Deep Fake, Filters Beauty Standards, and Colorism
Colorism and Societal Issues
Colorism in Other Parts of the World
Colorism in Brazil: Lessons Learned
Conclusion
References
Chapter 4: Deepfakes as Misinformation: The Next Frontier of Sports Fandom
Exploring Deepfakes: An Emerging Communications Crisis
The COVID-19 Pandemic’s Role in Increasing Deepfake Culture
Creating Celebrity
Fueling Fanship and Fandom
Digital Fanship and Fandom
Deepfakes as Fandom
Conclusion
References
Chapter 5: The Influence of Social Media Use in the Wake of Deepfakes on Kenyan Female University Students’ Perceptions on Sexism, Their Body Image and Participation in Politics
Introduction
Methodology
Findings and Discussions
Influence of Social Media Use on Kenyan Female University Students’ Perception on Their Body Image and Identity
The Influence of Social Media Use in Spread of Disinformation and Misinformation and Its Effect on Kenyan Female University Students’ Participation in Politics
The Influence of Social Media in Promoting Sexism
Conclusion
References
Chapter 6: The Ogre and the Griot: Culturally Embedded Communicative Approaches Addressing ‘Deep Fake’ COVID-19 Narratives and Hyperrealities in Kenya
Introduction
Hyperrealities
Methodology
Findings
Re-humanizing the Clinician
Applying a Culturally Competent Communicative Style
Who Is a Griot?
Conclusion
References
Chapter 7: “The Medium is the Massage/Message”: Functions of Synthetic Media in Sense-Making Conditions
Introduction
Production, Spread, and Processing of Deepfakes
Managing Health Information Needs in the Era of Deepfakes
Conclusion
References
Chapter 8: Deepfakes: Future of Sports and Questioning Video Evidence
Introduction
Theoretical Framework
Artificial Intelligence and Enhancing Deepfake
Finding Solutions
Conclusion
References
Chapter 9: Examining the Role of ‘KOT’ in Reinforcing Organizations’ Voices Against Misinformation in Kenya
Introduction
Twitter, Misinformation and Organizations’ Online Engagement
Theoretical Grounding
Methodology
Findings and Discussions
RQ 1. Ease of Use of the Interface and Facilitation of Discourse
RQ2. Impact of Dialogic Loops and Feedback
RQ3. Usefulness of (Mis)information on Twitter to Policy Change
RQ4. KOT and Organizations’ Use of Twitter to Avert Miscommunication and Misrepresentation
Conclusion
References
Chapter 10: Akata Night Masquerade: A Semblance of Online Deepfakes in African Traditional Communication Systems
Introduction
Theoretical Perspectives
Defining African Communication Systems/Indigenous Media
Deepfake as a Modern Concept
Systems of Indigenous Communication in Africa
Origin of Akata Night Masquerade
Akata as a Form of Indigenous Media
The Purposes Akata Media Served in the African Community
Why Deepfake Is Not New in the African Society
The Akata Night Masquerade as a Deepfake
Conclusion and Recommendation
References
Index


Black Communication in the Age of Disinformation: DeepFakes and Synthetic Media. Edited by Kehbuma Langmia.

Black Communication in the Age of Disinformation

"This book is timely in our era of deep fakes and misinformation. It is especially necessary from the context of the Global South as democracy can get squeezed out as extremes and binary oppositions take center stage in the social media age."
—Glenda Daniels, Wits University, Johannesburg, South Africa

"DeepFakes and Synthetic Media: Black Communication in the Age of Disinformation on digital spaces has extended the conversation about misinformation on platforms. Those unaware of how Twitter, Facebook, Instagram, WhatsApp, TikTok, and other digital social media platforms are changing socio-political and cultural interconnections in parts of Africa will glean strategic messages from this collection."
—Emmanuel K. Ngwainmbi, Volume Editor, Dismantling Cultural Borders Through Social Media & Digital Communications

"This book views the 'disinfodemic' of fake news and misinformation through diverse cultural lenses, providing us with nuanced understandings of challenges multifarious consumers face navigating information in a digital world."
—Audrey Gadzekpo, University of Ghana, Accra, Ghana

Kehbuma Langmia Editor

Black Communication in the Age of Disinformation DeepFakes and Synthetic Media

Editor: Kehbuma Langmia, Department of Strategic, Legal & Management Communication, Howard University, Washington, DC, USA

ISBN 978-3-031-27695-8    ISBN 978-3-031-27696-5 (eBook) https://doi.org/10.1007/978-3-031-27696-5 © The Editor(s) (if applicable) and The Author(s), under exclusive licence to Springer Nature Switzerland AG 2023 This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors, and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations. This Palgrave Macmillan imprint is published by the registered company Springer Nature Switzerland AG. The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland

This first edited volume on deepfakes and new media in Black communication is dedicated to all and sundry in the media world fighting to maintain hygiene in all digital communication platforms.

Foreword

The digital media revolution has exponentially multiplied and expanded the simulacrum phenomenon, the making of copies for which there is no original, as Jean Baudrillard observes. The increasing sophistication and diffusion of artificial intelligence software that can generate graphic images in which the natural is difficult to distinguish from the synthetic launches society into uncharted ethical territories. These are new and exciting territories opening heretofore unimagined vistas of human creativity and technological innovation. In this well put together volume, we find that the old is new again, and the new is old. The authors have shown that the narrative tradition of blurring boundaries between the real and the imaginary, bending reality through imaginative storytelling, predates modern technology and digital media communication. From the griot to the Wawan Sarki, or court jester, we have seen objects and characters take on inorganic and unnatural properties. Artificial Intelligence (AI) has stretched the limits and ease of applying these techniques. With every new communication technology, or evolutionary phase in the development of new media, has come the ethical question of balancing can and ought. It is fair to argue that two of the most essential competencies needed to navigate today's media environment are digital and cultural literacy. Most academic discourses tend to examine one or the other, due mainly to the synthetic boundaries that divide academic disciplines and intellectual pursuits. By bridging this gap and helping readers appreciate the intersections between technologies of power and the power of technology in shaping Afro-cultures and Afro-futures, the authors offer a language and a lens through which to analyze how DeepFakes shape identity, interaction and cultural engagement.

There is an invocation of the old debate on whether media technology is deterministic or neutral. Does the proliferation of DeepFakes exemplify the neutrality, pliability and malleability of new media technology, or does it affirm the essentially deterministic nature of technology? The entrance of synthetic images into the African media and culturescape tells a conflicting and nuanced story. Often, the new technology is greeted with excitement and enthusiasm for its novel and glittering possibilities. From the invention of the alphabet through the different modes of communication, media ecologists find that changes in technology and culture have been both externally engineered and self-engineered. The engineers of new modes of communication are obsessed with technique, hoping that the use, application, regulation and impact will be either neutral or limited to the original intent. Technology, as history has proven, is not only physical and material; it is also social and cultural. With Web 4.0, embedded AI, and the Internet of Things, what is artificial and what is human? Long ago, Jacques Ellul argued that with advances in technology, humans acquire the essence of their technological inventions, even as new technologies take on human-like attributes. Through complex algorithms, media are not only becoming "extensions of man," as Marshall McLuhan opined, but humans are being robotized as well. The convergence and interoperability of one with the other is making the virtual and the real more and more indistinguishable. The cultural debate continues to revolve around how these advances empower and/or disempower.

From both technological and cultural literacy perspectives, this work invites us to examine the consequences of the Five A's of DeepFakes (access, acquisition, application, analysis and affect) in the context of pan-African media ecology. Thanks to the digital revolution, African and African Diaspora communities have essentially leapfrogged into the Information Age. As users of digital technology and consumers of social media content, Black Digital Cultures and Digital Humanities have created avenues for access to glocal communication in new spaces. Afro-Black cultural performances are gaining more access and visibility on the world stage. This has allowed for endo-exogenous storytelling from and about pan-African cultures. While the digital divide between technology-rich and technology-poor countries and communities has not been completely eliminated, it has been narrowed in terms of access and acquisition. With respect to innovation and application, the dependency relationship that characterized earlier evolutionary phases remains. It is encouraging that computer coding and program-writing institutes and centers are growing and being encouraged among African and African Diaspora communities. Yet coding remains an emerging, rather than an embedded, skill. That means that the language in which these stories are being told remains largely foreign. From bio-technology to business transactions, culturally appropriate representations require advocacy, persuasion and sensitization. Afro-cultures continue to function as digital non-migrants with respect to language, representation and portrayal. To effectively bridge the gap and engage with relative agency will require more than acquisition and access. It calls for responsible and responsive decision-making and application.

This discourse sheds light on African cultures as users and abusers of new media technology, particularly with the rise and deployment of DeepFakes in social, cultural, political and even religious spaces. Hence the importance of critical and analytical thinking as integral elements of media and cultural literacy. Users and abusers of media content range from innocent, uninformed participants to deliberate malevolent actors. Access and acquisition are the easiest digital literacy skills to acquire, while higher-level skills such as analytical thinking, responsible application and proper affect require more deliberate effort, commitment and dedication. The latter have a steeper learning curve. Not many have the motivation to commit to such self-education, nor do many appreciate the importance of being discerning consumers of digital media content. Those whose intent is to peddle DeepFakes for particular agendas prey on the ignorance, vulnerability and lack of diligence on the part of unsuspecting and unskilled consumers. By capitalizing on consumers' media and technological illiteracy, content creators de-educate, exploit and disempower users. DeepFakes provide classic examples of such dysfunctions. An even more important issue is the ethical crisis of truth-seeking and truth-telling in a post-truth culture. DeepFakes raise the question of whether "seeing is believing." For a long time, "show me and I will believe" has been a taken-for-granted position. This mantra no longer holds. If the new slogan is "Viewer Beware," what will be the standard for determining the credibility and veracity of a message? As DeepFakes become more sleek and sophisticated, it becomes more difficult for the audience to discern between the natural and the synthetic.


Media producers and content creators see numerous values in using avatars as newscasters, weather anchors, or alter egos of dead ancestors and historical figures. There are cost considerations and the ability to replicate experiences that could not be created otherwise. With proper ethical guidelines and transparency, synthetic media can serve useful social functions. The danger, as demonstrated by the authors in this volume, lies in the misuse of these new technological tools. When do media serve their users, and when do users become pawns manipulated by the media? In seeking empowerment through the authentic telling and portrayal of the global African cultural renaissance and experiences in the digital ecosystem, creators and consumers must appreciate that use and abuse are often two sides of the same coin. DeepFakes demonstrate what is possible. The limits of what ought to be are not integrated into the program code, nor is the ethos of responsible use included in the packaging of the tools. That is what makes works like this necessary. Technologies intended to emancipate can instead enslave if they mislead, misinform and de-educate their users. Cultural and media literacy require strengthening communicators' agency, not weakening it. Deception and misinformation in social, political and interpersonal communication undermine the foundations of trust essential for healthy community interactions. The many examples highlighted in this text speak volumes. The work of sensitization requires researchers, industry leaders, policymakers and parents alike to recognize the power of media to shape culture and worldviews. Young people in particular, who are innovative digital natives when it comes to adopting new media technology, will need cultural and media literacy skills to identify and respond properly to misinformation disseminated through DeepFakes and other synthetic messages.

The future suggests that DeepFakes will only proliferate. They cannot be limited by regulation or censorship. The responsibility for checking their negative impact lies more with individuals and communities. By arming communities, particularly those with the fewest tools to control the narrative, efforts can be made to sensitize, if not inoculate, them against the virus of disinformation and misinformation. The scope of this research, cutting across different Black and African cultures, offers readers a broad understanding of diverse issues, challenges and opportunities in the new digital media environment. The scholars' well-considered conceptual, theoretical and methodological approaches offer insightful analytical lenses through which to interrogate the role and impact of DeepFakes in the changing media ecology of different diasporic cultures and communities. The contributors invite consumers and critics of popular culture to awaken their senses and become savvy consumers and observers of media and culture. Tools and resources of this nature are much needed if society is to harness the maximum benefits and derive the most positive influences from the digital media revolution. Intentionality and willingness to challenge prevailing notions, paradigms and beliefs are needed to equip and empower users and consumers of social and new media. This volume, hopefully, sets the agenda and tone for holistic and rigorous thinking about the functions and dysfunctions of media habits and practices. It will serve pop culture consumers, media analysts, culture critics, policymakers and industry leaders concerned with best cultural practices and outcomes.

Azusa, USA

Bala A. Musa

Acknowledgments

I am pleased to offer olive branches to all the contributors to this volume, who allowed raindrops to dirty their garments as they danced to the tune of my constant, often unrepentant requests for their submissions. All these established and up-and-coming Black digital media scholars on the continent of Africa and in the USA deserve my sincere appreciation for yielding to the silent voices in their cognitive universes telling them that the time is now to populate the academic world with the role and contributions of Black communication scholarship. I am forever indebted to the Vice Chancellor of Daystar University, Professor Ayiro Laban, Provost Anthony Wutoh of Howard University, Dean Lawson Borders of Howard University, and Dean Levi Obonyo of Daystar University for allowing me to spend the 2022 Summer Semester at Daystar University, where I had the time to edit this volume. Lastly, without the oceanic love from my wife and my two boys, I would only pine away like a bee caught in the harmattan wind. Shalom K. Langmia


Contents

1. Of Deepfakes, Misinformation, and Disinformation (Lydia Radoli and Kehbuma Langmia)
2. The African Youth and Social Media at the Crossroads of Information, Misinformation, and Disinformation (Mohamed Saliou Camara, Hyeladzirra Banu, and Jean Claude Abeck)
3. Artificial Intelligence for Social Evil: Exploring How AI and Beauty Filters Perpetuate Colorism—Lessons Learned from a Colorism Giant, Brazil (Juliana Maria Trammel)
4. Deepfakes as Misinformation: The Next Frontier of Sports Fandom (Monica L. Ponder, Trayce Leak, and Kalema E. Meggs)
5. The Influence of Social Media Use in the Wake of Deepfakes on Kenyan Female University Students' Perceptions on Sexism, Their Body Image and Participation in Politics (Nancy Mayoyo)
6. The Ogre and the Griot: Culturally Embedded Communicative Approaches Addressing 'Deep Fake' COVID-19 Narratives and Hyperrealities in Kenya (Oby Obyerodhyambo and Wambui Wamunyu)
7. "The Medium is the Massage/Message": Functions of Synthetic Media in Sense-Making Conditions (Jean Claude Kwitonda and Symone Campbell)
8. Deepfakes: Future of Sports and Questioning Video Evidence (Unwana Samuel Akpan and Chuka Onwumechili)
9. Examining the Role of 'KOT' in Reinforcing Organizations' Voices Against Misinformation in Kenya (Masaki Enock Magack)
10. Akata Night Masquerade: A Semblance of Online Deepfakes in African Traditional Communication Systems (Unwana Samuel Akpan)
Index

Notes on Contributors

Jean Claude Abeck is an African Affairs Consultant in the Department of Defense, Office of the Under Secretary of Defense for African Affairs at the Pentagon. He is a Policy Fellow in the Africa Policy Accelerator (APA) program at the Center for Strategic and International Studies (CSIS) in Washington, D.C. Jean Claude holds a Master of Science in Terrorism and Security Studies from American University in Washington, DC and is currently working on a PhD in African Studies at Howard University, where his research interests include African youth, Pan-African security studies and U.S.-African security relations. His research analyzes growing security issues in national and international security and the spread of the private military industry in sub-Saharan Africa.

Unwana Samuel Akpan is a media scholar-practitioner with over two decades of broadcast experience. He is a lecturer in the Department of Mass Communication, University of Lagos, Akoka, Nigeria. He is the editor of The University of Lagos Communication Review. He has been a visiting scholar in the Department of Communication, Culture and Media Studies, Howard University, Washington DC, USA, where he completed his postdoctoral research. He authored and edited Nigerian Media Industries in the Era of Globalization. Akpan's articles have appeared in international academic journals, and he has also contributed book chapters.

Hyeladzirra Banu's research contributes to the study of Africa pertaining to its political economy and the intersection of conflict, state formation and statecraft in West Africa. Her expertise includes policy analysis, crisis and conflict management, and public and interagency diplomacy.
She has a Master's in International Economics and Relations from Johns Hopkins University, SAIS; a Bachelor's in International Relations, Political Science and Media Studies from Wartburg College; and is currently pursuing a PhD in African Studies at Howard University. She has a passion for global development, conflict mitigation and resolution, and public knowledge and information accessibility, and has obtained awards from the Economic Community of West African States, the Non-Profit Leadership Development Initiative at Johns Hopkins University, and other institutions.

Mohamed Saliou Camara is Professor of History, Philosophy, and Mass Communication in the Department of African Studies at Howard University. He received his PhD in History from Northwestern University on a Fulbright Scholarship. He also holds a Diplôme d'Études Supérieures in History and Philosophy from the University of Conakry and an Advanced Professional Degree in Journalism from the University of Dakar. Professor Camara taught at Embry-Riddle Aeronautical University in Florida, where he also served as Speaker of the Faculty Senate and Associate Vice President for Academics. In 2009, he won the Outstanding Teacher of the Year Award. In Guinea, Camara worked as a lecturer and associate chair of the Department of Philosophy at the University of Conakry, a journalist for the National Radio Television Network, a correspondent for Radio France International, president of the University of Conakry Press and a speechwriter for the Press Bureau of the Presidency of the Republic. He is the author of seven books and numerous chapters and articles on African political history, political communication, religion and philosophy, civil-military relations, human security and intra-African foreign relations. His current research centers on Religion, Spirituality, and Africa's Struggle with Meta-imperialism from the Cold War to the Global War on Terror.

Symone Campbell is a PhD candidate at Howard University in the Department of Communication, Culture and Media Studies. Symone's research takes a critical and cultural approach to analyzing media, particularly new and emerging digital media. Symone is currently completing her dissertation, which focuses on the ways mainstream K-12 educational technology perpetuates dominant ideologies of race, and on how Black-oriented and Black woman-owned educational technology platforms engage in counter-discourses to reinforce culturally relevant learning experiences for Black students. At Howard University, Symone is a Ronald E. McNair Scholars research assistant and the president of the Cathy Hughes School of Communication Scholars Forum.
Symone holds a BA in Sociology from SUNY Buffalo State College and an MA in Africana Studies from the University at Albany.

Jean Claude Kwitonda is an assistant professor in the Cathy Hughes School of Communications, Department of Communication Studies. He holds a PhD in Health Communication from Ohio University, an MA in International Development and Communication Studies and an MSc in Social and Scientific Research Methods. His research interests are at the intersection of health communication, social identity and health disparities.

Kehbuma Langmia is a Fulbright Scholar, professor and chair in the Department of Strategic, Legal and Management Communication, School of Communications, Howard University. A graduate of the Communication and Media Studies Program at Howard University, Langmia has extensive knowledge and expertise in Information Communication Technology (ICT), Intercultural, Cross-Cultural and International Communication, Black Diaspora Communication Theory, Decolonial Media Studies, Social Media and Afrocentricity. Since obtaining his PhD in Communications and Media Studies from Howard University, he has published 16 books, 17 book chapters and 10 peer-reviewed journal articles nationally and internationally. In 2020, he received the prestigious NCA Orlando Taylor distinguished research scholar award as a top scholar in African and African American communication publications. He is a visiting scholar at Daystar University, Kenya, and MUBS, Makerere University, Uganda. In November 2017, Langmia was awarded the prestigious Toyin Falola Africa Book Award in Marrakesh, Morocco, by the Association of Global South Studies for his book titled Globalization and Cyberculture: An Afrocentric Perspective. For the last four years, he has been selected by Howard University to act as a scholar coach for the Howard University Summer Writing Academy. In 2019, he was selected among the 35 USA professors chosen from a competitive pool of over 100 applicants to serve in the Visiting Professor Program at Fordham University in New York organized by ANA. In addition, he regularly gives keynote speeches on Information Communication Technology, Black Diaspora Mediated Communication, and Social Media at prominent national and international universities and institutions, including the Library of Congress; the National Intelligence University (Department of Defense, USA); the National Defense University (Department of Defense, USA); Morgan State University (Maryland, USA); Bowie State University (Maryland, USA); Melbourne University (Australia); Buea University
(Cameroon); Madras Institute of Technology (India); ICT University (Cameroon); Covenant University (Nigeria); Makerere University Business School, MUBS (Uganda); and Temple University, Pennsylvania. He was the keynote speaker at the 2017 Maryland Communication Association conference held at the College of Southern Maryland, Waldorf, MD, and at the Communication Educators' Association Conference at Winneba, Ghana, in 2019. Some of his books are Black/Africana Communication Theory, published in 2018 by Palgrave Macmillan; Globalization and Cyberculture: An Afrocentric Perspective, published in 2016; and Social Media: Culture and Identity, published in 2017, the latter co-edited with Tia Tyree and published by Lexington Books. He has recently published Digital Communication at Crossroads in Africa: A Decolonial Approach (2020) with Agnes Lucy Lando of Daystar University, Kenya. His most recent publications are Black Lives and Digiculturalism (2021), Paradise of Love and Pain (2021), and Decolonizing Communication Studies (2022). Website: drlangmia.net

Trayce Leak currently serves as a health communication specialist for the Centers for Disease Control and Prevention's Division for Heart Disease and Stroke Prevention. She also serves as a science writer in Howard University's Cathy Hughes School of Communications. Leak is an internationally accredited public relations professional (APR) with nearly 20 years of professional industry experience. Prior to joining CDC and Howard University, Leak served as a public relations professor for more than 12 years, with tenures at Georgia State University and Clark Atlanta University. She holds a Doctor of Philosophy in Communication Studies from Georgia State University, a Master of Science in Public Relations from Syracuse University, and a Bachelor of Science in Sociology, with a minor in marketing, from Florida State University.

Masaki Enock Magack is currently the Marketing Communications Manager for Lyle Kenya and the CEO and Founder of Digital Marketing Solutions. He is a PhD candidate at Daystar University. He obtained a Master's in Communication Studies focusing on Digital Communication from the United States University of Africa (2020) and a Bachelor's in Mass Communication from St. Paul's University (2016), both in Nairobi, Kenya. He has served as a bank executive customer consultant in one of the leading banks in Kenya, Stanbic Bank, where he was involved in the acquisition of customers and the provision of solutions to them on the bank's digital platforms (internet and mobile banking) and other bank-related services.
He was also privileged to serve as the deputy head of Radio and head of Productions at USIU Radio 99.9 FM for two years. At the same time, he hosted the Morning Madness Show every day from 8 to 10 a.m. and Sports Zone on Monday and Saturday from 3 to 5 p.m. He also volunteered to do vox pops for the leading radio station in Kenya, Citizen Radio, for a show aired from 1:00 p.m. to 4:00 p.m., where he led several discussions centering on youth, SMEs and their digital experiences. He has also been involved in Citizen TV (the leading national television station in Kenya) shows on Friday mornings as a panelist on digital trends. He has presented papers at two conferences: one on USA-China relations in Africa at the Third ASAA Biennial International Conference held at USIU-A in 2019, and another on the uses and gratifications of LinkedIn by its users in Nairobi County at the 2nd International Symposium on Social Media at USIU-Africa in November 2020. His work has been published in the Journal of Communication & Media Research, released in October 2021. His research interests lie in the areas of digital communication, corporate and strategic communication, journalism studies, mass communication and new communication technology. He has participated in discussions on social media usage by the youth and startup businesses in different shows on leading television stations such as KTN, CITIZEN and NTV.

Nancy Mayoyo holds a PhD from Kenyatta University and is a Lecturer in Sociology of Education at Kenyatta University, School of Education, Educational Foundations Department. Her research interests include the sociology of education, higher education and the impact of computer technologies on education. She is interested in promoting the responsible use of social media to improve teaching and learning.

Kalema E. Meggs, who holds a Master of Arts in Public Relations from the University of Miami and a Master of Science from Lynn University, is a second-year doctoral student in the Department of Communication, Culture, and Media Studies at Howard University. Her research interests are in communication and social and cultural matters among African American/Black and Brown athletes and how they are framed within media channels and platforms.

Oby Obyerodhyambo is a Communications researcher and Strategic/Health Communication practitioner. His research interest revolves around Health Communication theory with a focus on Constructivist Grounded Theory.
He combines interest in research and academia with active work in the field of Social and Behaviour Change Communication, developing communication and advocacy strategies, training curricula and tools, and communication materials. Oby is passionate about, and has published on, the exploration of post-colonial representations and decoloniality in Communication praxis. Oby is affiliated with Daystar University, where he is a PhD candidate in Communication writing on meaning-making in Sexual and Reproductive Health messages by Adolescent Girls and Young Women.

Chuka Onwumechili is Professor of Strategic, Legal, and Management Communication at Howard University in Washington, DC (USA). He is the author of more than ten books focusing largely on communication issues that pertain to Africa. These books touch on varied subjects including sports, telecommunications, culture and development. His most recent book is a college-level text titled Sport Communication: An International Approach. His other works on similar subjects have been published in peer-reviewed academic journals. Presently, he serves as editor-in-chief of the Howard Journal of Communications and serves on the Editorial Board of the Communication & Sport journal. It is important to note that before joining Howard University, Onwumechili served as vice president for the Digital Bridge Institute (DBI) in Nigeria, where his brief included executive visioning and managing and overseeing training programs at different levels for employees in both the Nigerian telecommunications industry and other interested markets.

Monica L. Ponder is an Epidemiologist and Assistant Professor of Health Communication and Culture at Howard University. Ponder's research offers crisis communication recommendations for public health and healthcare organizations seeking to understand, reach and engage historically marginalized and underrepresented groups during public health emergencies. She is the creator of The Henrietta Hypothesis, an interdisciplinary, 16-construct model for crisis communication aptly named in honor of Henrietta Lacks' iconic healthcare case. The goal of her research is to advance traditional health and crisis communication practice to center cultural perspectives and environmental context in its design. As a scholar-activist, Ponder has led many successful community initiatives, including advocating for the establishment of lactation rooms (pods) at Hartsfield-Jackson Atlanta International Airport, as well as leading plans for lactation support services for the 2017 National Women's March (Washington, DC). She leverages, in her teaching, her ten-plus-year career in health communication at the Centers for Disease Control and Prevention (CDC).
Ponder holds BS and MS degrees in Chemistry from Clark Atlanta University, an MSPH in Epidemiology from Emory University and a PhD in Communication from Georgia State University.

Lydia Radoli is a journalist, social researcher and media lecturer at Daystar University's School of Communication, where she heads the Media and Film Studies (MFS) Department. She is a research fellow and visiting scholar at the Brandenburg University of Technology (BTU) in Cottbus, Germany. Her current research looks at Suppressed Narratives Through the Life Cycle of Visual Rhetors (Journalists Covering Violent Crimes from East Africa). This research will expand to address the Trauma Phenomenon of Visual Rhetors through the Analysis of Assorted Visual Images from East Africa, a project under the Leuphana Institute of Advanced Studies in Culture and Society (LIAS-CAS) at Leuphana University-Lüneburg. She has worked as a senior broadcast journalist and producer in Kenya, Uganda and Rwanda, and has contributed articles to Mkenya Ujerumani, a Diasporic media outlet for Kenyans in Germany, and to the We Will Lead Africa journal for African innovators based in the USA. She also established Diaspora Radio, an online platform to highlight Transnational African Narratives. She has served on the Media Council of Kenya (MCK) Taskforce and as vice chair of the Kenya Diaspora Youth Council (KDYC). She is also a fellow of the African Good Governance Network (AGGN) and has served as its vice chair. A member of the Association of the Media Women in Kenya (AMWIK), she serves on its Media and Publicity committee. Her research interests are around media, migration, development, global cultures, political economy and human rights. She has presented papers at international and local conferences and chaired sessions at the International Communication Association (ICA) Regional Hub, Nairobi, and virtually at the Research Association for Interdisciplinary Studies (RAIS), UN Center, New York. Her recent publications include Radoli, O. L. (2022), "Switching to SIDE Mode: COVID-19 and the Adaptation of Computer-Mediated Communication (CMC) Learning in Kenya," in Online Learning Instruction and Research in Post-Pandemic Higher Education in Africa (Lexington Publishers, ISBN 928-1-66691-606-5), and a co-authored book chapter, Lando, A. L. and Radoli, O. L. (2021), "To Cover or Not to Cover? A Critical Discourse Analysis of Mainstream Media News Framing of Children in Kenyan Care Homes," in Routledge Companion for Media and Poverty.

Juliana Maria Trammel is a Tenured Associate Professor of Strategic Communication and program assessment coordinator in the Department of Journalism and Mass Communications at Savannah State University. Her research interests include the intersection of gender, media, race and human communication, with a special focus on social media, women and early childhood communication in Brazil. Her most recent publications include "Breastfeeding Campaigns and Ethnic Disparity in Brazil: The Representation of a Hegemonic Society and Quasiperfect Experience," published by the Journal of Black Studies, and a co-authored book chapter titled "Social Media, Women, and Empowerment: The Issues of Social Media Platforms by WNGOs in Jamaica and Brazil," published in Studies in Media & Communications, among other publications. In 2016, she also served as a contributing writer for PR News. In addition to her scholarship, she has over 15 years of communication-related experience, including social advocacy on Capitol Hill, higher education administration, teaching and consulting.

Wambui Wamunyu, PhD, is a Senior Lecturer in Media Studies and a researcher based at Aga Khan University in Nairobi, Kenya. She has conducted qualitative and mixed methods research studies in areas including disinformation, gender and media, and the news media's use of digital technologies.

List of Figures

Fig. 2.1  Internet penetration rates in 12 select African countries from 2000 to 2019. (Source: ICTworks, "Internet Freedom in Africa." Retrieved May 22, 2022, from https://www.ictworks.org/wp-content/uploads/2022/01/Internet-Freedom-in-Africa.pdf)
Fig. 6.1  Health insurer AAR provides a health cover for possible side effects, which the doctor views as a callous profit-driven effort to play on patient fears

List of Tables

Table 2.1  Orange Guinea's money transfer tariffs
Table 2.2  How the Internet is viewed in Sub-Saharan Africa
Table 2.3  Social media statistics in Africa between June 2021 and May 2021
Table 2.4  Social media penetration and usage across Africa by region
Table 8.1  Proposed football video rating system
Table 10.1  Modes of indigenous Nigerian communication media

CHAPTER 1

Of Deepfakes, Misinformation, and Disinformation
Lydia Radoli and Kehbuma Langmia

The Open Curtain to Digitalization in Africa

The landscape of media communication is transforming human interaction in the virtual space. Deepfakes, misinformation, and disinformation, the tendency to falsify information through texts, images, and visual physical appearances, as well as audio communicative renditions, on various digital spaces on the Internet, have further complicated message encoding and decoding. This has corrupted humankind's sense of perceptual reality in multiple given contexts. For example, the African digital communicative landscape witnessed a tsunami of online disinformation and misinformation due in part to the effects of the COVID-19 pandemic, which increased message traffic on the Internet. It spiraled into downright confusion about the nature of the new disease, making governments wrestle with the truth and struggle to mitigate the negative effects on the people. In fact, Radu (2020) has aptly referred to it as an "infodemic." While the broadcast and print media (television, newspapers, and radio) were putting up data from the World Health Organization (WHO), Donald Trump, the former president of the United States, the democratic role model country for the rest of the world, the home of the brave and the land of the free, was accusing WHO of falsifying data to scare the public. As if that was not enough, the social media platforms were being populated by unverified and exaggerated news about the pandemic, whose purported source, Wuhan, China, was still being debated. This debate also unearthed unfounded claims, especially about people of color. Africans especially were perniciously attacked for causing the disease, alongside other Blacks living and studying in China in 2020. They were branded as having imported the disease from the African continent to China (Adebayo, 2021). Such viral misinformation circulated on social media and was acted upon by some in the Chinese government, creating one of the most painful waves of hate crimes committed against Black people in China. This act reminded people of African descent, and those on the continent of Africa, of the actions of the rest of the world during the Ebola crisis. In December, when the Omicron variant broke out in South Africa, countries in the West were quick to issue travel bans, and the media framed the virus as the new, dangerous, and "heavily mutated" COVID-19 variant that had been identified in South Africa (Mbabazi, 2021). The Black color became the enemy to most people in the Western hemisphere. On the continent of Africa, the news about COVID-19 on electronic and digital media platforms was following divergent paths, further putting the populace under a perpetual state of confusion and distrust. Had this information been disseminated through word of mouth from trusted elders and seniors within the African community who, by dint of their age and wisdom, have been entrusted with the mantle of truth, vision, and wisdom, the damage from misinformation could have been lessened. Worse still, hyperreality (the tendency to hype or exaggerate through images or memes), orchestrated by new virtual forms of communication unprecedented in any mode of human communication, has gone a long way to complicate the in-person communication that Africans and people of African descent have been used to for centuries.

Communicatively speaking, mankind now lives in a perpetual state of two-ness: two selves within one body. We are simultaneously online and offline, interacting on issues of great importance whereby a little mistake could create a topsy-turvy effect in both worlds. Anything posted online will outlast our generation because of its viral nature. This is the new way of sending messages to most digital immigrants in cyberspace, and since a vast majority of them live on the continent of Africa, where Western languages and digital technology are still a challenge, faking news or misinforming them has become the norm. In fact, it has been normalized. These digital immigrants (village elders and uneducated youths in the cities and villages) have joined the social media public sphere. They must belong in this new world with this new form of communication; otherwise, they will continue to live on the margins of life waiting for Big Brother, the West, to provide solutions. But the challenges are enormous. The internal dynamics of digitization come with infinite complications that could be difficult to unravel in the not-too-distant future. If a cyber guru armed with artificial intelligence tools can alter the universe of perception so that one person on the Internet appears to be another person, and Internet consumers are unable to decipher the falsification that has taken place, not only does the public pay a serious price, but the targeted individual or individuals face an insurmountable obstacle in debunking the falsity and restoring "truthful" reality. Already, facial recognition technology has disproportionately affected people of color, especially people of African descent (Langmia, 2021). With the onslaught of deepfakes and synthetic media (DFSM), Black interhuman communication (among Africans on the continent of Africa and those of African descent in the Diaspora) is certainly affected, because of historical African ancestral psycho-cultural influences, in a myriad of ways. This could impact a paradigm shift in communication that is constantly in crisis because of constant changes from in-person to virtual interactions on WhatsApp, Facebook, TikTok, Twitter, Instagram, and so on. Statistics show that Africans on the continent of Africa and those in the Diaspora are heavy contributors and consumers of social media content (Langmia, 2021). But they do not control the Internet's structural dynamics because, according to Kress (2009), Gray (2012), and Daniels (2013), the Internet "has a White male omnipresence" (Langmia, 2021, 6).

How Did We Get Here?

The twenty-first century is probably going to go down in the annals of history as one of those epochs that have unearthed a barrage of technological innovations which have helped to improve human socio-cultural and economic conditions as well as contributed to futuristic demise and uncertainty. This uncertainty has by and large been triggered by the plethora of blunders that some of the start-ups, beginning with the dotcom bonanza, experienced in the early part of the twenty-first century. With the onslaught of private initiatives and industry-led technological innovations in the last two decades (in biomedical sciences, telescopic imaging devices, laser beams that can scan human anatomy in seconds, facial recognition devices, and artificial intelligence paired with geo-stationary devices like GPS to blur boundaries and squeeze time and space into one predictable entity), mankind is on a roll. But as with other innovations that have championed the progress of human endeavors (telescopes, television, facsimiles, and so on), the future of human communicative interaction from the standpoint of the Internet remains, unfortunately, uncertain and chaotic. This chaos can be partly attributed to the regulation, or the lack thereof, of Internet forms of communication, which seems to be murky from region to region and country to country. The Western countries seem to set the pace for the economically developing countries in the subaltern, but at the same time, the rules from the International Telecommunication Union (ITU), like the deadline it set of June 17, 2015, for the transformation from analog to digital signals for all telecommunication devices in the world, did not seem to jibe with the insurmountable technical difficulties that some of these countries in Africa, Latin America, the Caribbean, and South East Asia were confronting daily. For instance, the spam solicitation of money from anonymous Internet users to other users, especially in the Western world, has not been abated by any regulations from any government. Granted, the Internet should be free, as in the free speech articles enshrined in most independent nations' constitutions all over the world, but that freedom remains undefined because of its slippery slope nature. Not all freedom is "free," and freedom does not mean infringing on citizens' privacy. A lot of private spaces for most individuals on the Internet have been invaded by scams and spam messages, and this has threatened the future of this unprecedented platform for human communication because people now lack trust in some of the sources of messages in their email inboxes and social media accounts.

Online Disinformation and Africa

Deepfakes and other misinformation messages planted on the pathways of platforms like Twitter, Facebook, Instagram, WhatsApp, and TikTok have complicated manifold socio-political and cultural relationships on the continent of Africa. These social media platforms are omnipresent on Africa's social mediascape. These platforms, none of which was founded, created, or is managed by an African or a Black person in the diaspora, have seen the intensive communicative presence of people of color. Like all traffic jams on major non-virtual highways in Africa, messages have mushroomed in the cyberspaces of these platforms in Africa, and so looking for an opening for meaningful traffic flow from all genders, ages, and classes of people is daunting. Consequently, messages posted on these sites in non-Western languages may not receive adequate translatory or interpretive attention comparable to the Western languages of English, French, Portuguese, or Spanish. In that case, messages intended to convey meanings rooted in the socio-political and cultural contexts of Africa or within the Black communities abroad are subject to misinterpretations, which contribute in no small way to misinformation when they are transmitted from person to person online and offline. Additionally, since a platform like Twitter is used frequently in Africa with little control by ISPs from Africa, messages posted to disinform, disparage, and cause panic during electioneering periods can stir uncontrollable trouble. Take the case of the Twitter post in South Africa during the Zuma era as opined by Wasserman (2020). He states in his article on fake news in South Africa that "a series of badly photoshopped, sexist images of Ferial Haffajee, editor-at-large of the South African Huffington Post was tweeted…" (7). The intentions of the tweet were obvious: to damage the public reputation of a journalist, Ferial Haffajee. Whatever the case, when photoshopped images that do not in any way represent the actual person go viral on the Internet, Internet users are prone to react quickly to such messages without fact-checking them for accuracy.

Elections Coverage and the War of Fakes

An election process throws the media into the spotlight, and susceptibility to bias, any influence by political parties, any unfair or selective coverage, or any suppression of a minority voice will be clear for all to see (Berg & Betcheva, 2014). Fake news and exaggerations often appear in narratives about opponents during the last days of political campaigns to sow seeds of doubt in the minds of the electorate. This is what has come to represent human communication, and so the truth becomes a victim; since half of our lives are buried in the smartphones in our palms, we are prone to react immediately when such images are posted on our sites. This is the fight of our times, and politicians have lost elections in this Internet era because of such practices. In the recently concluded elections in Kenya, the electoral war and conflicts that had previously been witnessed in sparks of violence and physical confrontation were taken online and pushed by the so-called keyboard warriors. There was a constant spewing of stories wound in allegations, mockery, and trolling of one political divide against the other, from the camps of the two hugely competitive presidential candidates. Contrary to what the Western media may have looked forward to, there were no buildings flaring up in fire, no tear gas and rubber bullets dispersing rioters after the contested results, but there was a real and vicious war online. Different tallies from the main television stations (KTN and Citizen) did not help the situation, as online users used the results to compare, belittle, and ridicule the opponents. The raging social media wars spewed propaganda regarding the two key political parties and their presidential candidates.

In another scenario, African Diasporic digital media spaces provide a unique and creative way to utilize the new media to address challenges such as integration, economization, marginalization, and participation in the political spheres. African Diasporas use Diasporic media platforms to overcome marginalization, voice perceptions, and transfer knowledge and skills to migration-sending countries. The Diasporic media spaces have also been used to cause divides through reifying classism and Europeanness online. The digital age can be traced to the migration of media technologies from analog to digital that came into effect in July 2015, heralded by the International Telecommunication Union (ITU). The objective was to produce high-quality, uniform digital content. This shift has provided a marketplace for ideas and broadcast production. While the move presents an avenue for growth in the communication sector, there are visible challenges to users from emerging and developing economies. With significant traffic toward internet broadcasting, social media, video blogs, and citizen journalism, concerns about misinformation and misrepresentation have risen. Diasporic media producers sometimes replicated dominant discourses in the mainstream media, thereby underutilizing the possibilities of using the platform for alternative discourse (Radoli, 2019).

"Fake News" Palava and E-journalism post-McLuhanism

Marshall McLuhan never foresaw the conundrum that would pit media messages against platform multiplicity in human communicative growth after his 1960s dictum that the medium is the message. His cognitive echo chamber of human-machine interaction in the not-too-distant future failed to factor in manipulative, politically motivated messages with media oligarchs holding the steering wheel of fortune over the consumer's mind and behavior. His vision was solely anchored on the impartial role of the media in message dissemination regardless of their effects. Today, years after his prediction, information overload has become the desert for credible knowledge dissemination because those at the helm of power are trafficking both information and knowledge to maintain power for its own sake. Fortunately or unfortunately, those in the subaltern nation-states of Africa in general and Kenya in particular are drinking and drowning in the dregs of fake news mediated and disseminated by global transnational media houses in Western capitals through various social media platforms. Anything that rings the bell of objectivity has been termed "fake news" to dissuade consumers and passive bystanders from following through. What makes it even worse is the fact that the "extensions of man" metaphor insinuated by this Canadian media guru has become one of absorption and control, where humans involved in digital communication in cyberspace now live double lives: one in the cybernetic metaverse and the other in the in-person universe, where time and space are seemingly yoking themselves with virtualization. This has made e-journalism the paradox of all our lives, and what counts as real, surreal, and hyperreal has become anyone's guess. But those who control the media control the mind, according to the American songwriter and poet Jim Morrison (Deshpande & Wahab, 2019, 1), and they seem to be getting the upper hand at controlling the discourse. Deepfakes and synthetic media, especially on the continent of Africa, coupled with conscious media disinformation, have complicated journalism, e-journalism, and citizen journalism, thereby plunging the paradigm shift in communication into deeper crisis, especially with respect to Black communication tenets. Granted, the daily ritual of scrolling through tons of messages from TikTok, Instagram, LinkedIn, Facebook, WhatsApp, email, and text messages on multiple platforms early in the morning as human beings wake up has become a chore that defines the day. Messages from the traditional media of radio, newspaper, and television have become secondary or have been abandoned to Gen X, who still cannot decipher fake news from "real news." But the advent of the COVID-19 pandemic has reversed that trend. When it comes to social media and news consumption, African news junkies on the continent face another dilemma. They do not know what to believe anymore, and that is a problem.

Media Bubbles, Polarization, and Fake News

The political consciousness of a country depends to a large extent on social media. Alter (2019) posits that media bubbles recur owing to polarization and the isolation of individual beliefs from alternative ideological viewpoints. Insofar as this holds, fake news on social media takes on the outfit of trending stories, masking the attributes of truth and objectivity that once informed the practice of media production and dissemination. Such platforms spew stories thriving on clickbait: half-truth content quickly brewed in a potent pot, devoid of any semblance of ethical ingenuity, that goes viral with no editorial control or fact-checking in place. In this day and age, the number of clicks, views, and followers determines what sets the news agenda, unlike in the era of McLuhan (1975) and those after him, such as McCombs (2004), who diligently propagated the ideas amassed in agenda-setting theory, prioritization, and the salience of news items. Viewers are guided by what they watch and by the frequency of engagement with the content, mostly accessed online, and opinions are rarely influenced by what they do not see. This argument supposes that fake news can spread like wildfire through its continuous clickability, viewability, and shareability. Social media users therefore fashion their own media filters by watching shows that match their views and then filtering out “non-biased” material, owing to confirmation bias, which is the rejection of information that does not support one's preconceptions (Anderson & Tompkins, 2015). Over time, algorithms and filters on social media sites like Facebook and Google have exponentially amplified media bubbles. Information on user search histories and personal preferences presents users with a palette of content to consume that corresponds with expected interests (Bakshy et al., 2015). Fake news thus feeds on “selective exposure behavior” and “confirmation bias,” and progressively results in a culture of “fake news” which facilitates an individual's predisposition to blame those who are different from them (McChesney, 2014). There is also a corresponding linkage between accessing “false news” and feelings of powerlessness, loneliness, and cynicism toward others. This has created a public that distrusts the mainstream media and drives it toward political polarization, leaving users believing just what they want to be true (Johnson & Kelling, 2018). This is seen as a major threat to democratic ideals and to processes rendered illegitimate in the presence of fake news (Pressman, 2018). The environment within which fake news exists is itself destabilized, thus heightening citizens' mistrust of the legacy media, policies, and state functions. Spohr (2017) and Williams and Stroud (2020) have intimated that such uncertainty could cause tensions among residents and strain citizen-media relations.
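The filtering mechanism described above can be made concrete with a small sketch. The code below is a hypothetical, deliberately simplified illustration, not Facebook's or Google's actual ranking system: it scores content by its overlap with a user's accumulated interests and then feeds the selections back into the profile, which is how repeated exposure can narrow into a bubble. All names and data in it are invented for illustration.

```python
# Minimal illustration (not any platform's real algorithm) of preference-based filtering.

def recommend(posts, interests, k=3):
    """Rank posts by overlap with the user's interest profile and keep the top k."""
    scored = []
    for post in posts:
        overlap = len(interests & post["topics"])  # crude relevance score
        scored.append((overlap, post))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [post for score, post in scored[:k] if score > 0]

posts = [
    {"id": 1, "topics": {"party_a", "economy"}},
    {"id": 2, "topics": {"party_b", "economy"}},
    {"id": 3, "topics": {"party_a", "youth"}},
    {"id": 4, "topics": {"health", "party_b"}},
]

interests = {"party_a"}            # what this user has clicked on so far
feed = recommend(posts, interests)

# Each engagement reinforces the profile, so the next feed is even narrower.
for post in feed:
    interests |= post["topics"]

print([p["id"] for p in feed])     # [1, 3]: only one political divide is shown
print(interests)                   # profile grows only around party_a topics
```

The point of the sketch is simply that nothing here checks whether a story is true; relevance to prior behavior is the only criterion, which is why selective exposure and confirmation bias compound over time.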

Ethics, Fake News, and Real News Tug of War

The late Steve Biko of South Africa once said that it may become a herculean task, synonymous with the myth of Sisyphus, to grapple with the ultimate question: how do we empty and drain the mind of the colonized citizen of all the deodorized epistemological contents heaped on him or her from the West? The African media consumer at this point in the twenty-first century drinks from a gourd of digitized social media content laced with a plethora of fake news within which s/he cannot decipher right from wrong. The mind is under control, and media literacy becomes an arduous undertaking required to separate the grain from the chaff in what s/he consumes. Ethical consideration of what is dished out by ISPs in African nation-states becomes another issue worthy of concern. When the glove is pulled inside out, truth and falsehood in media dissemination will be exposed to the media consumer, who at that moment will be able to “fact check” sources; but how easily that can be done during the early morning ritual of news consumption on humankind's handheld gadget is another million-man-march question with little or no immediate answer. But keep on with the onward march to the future.


Rationale for This First Edition on Deepfakes

As a result of the above-mentioned digital communication landscape, this first edition has assembled a plethora of seasoned and budding Black communication scholars who have examined this paradigm shift in the landscape of digital communication from Africa, North America, and Latin America. They have examined the tapestry of this vexing twist in digital communication in their various regional, provincial, and national contexts to ascertain its impact on the communication discipline. This introductory chapter is followed by Chap. 2, titled “The African Youth and Social Media at the Crossroads of Information, Misinformation, and Disinformation,” by Mohamed Saliou Camara, Hyeladzirra Banu, and Jean Claude Abeck. These scholars have portrayed the extent to which misinformation and intentional disinformation on selected social media platforms that are popular in Africa, like Facebook, WhatsApp, and Twitter, have affected the youth population. Given that the youth population constitutes the majority of social media consumers on the continent of Africa, these authors have been able to document, through interviews and archival data, how consumption of misleading information can have a deleterious effect on them in the not-too-distant future. Chapter 3 takes the reader to Latin America, where Juliana Trammel has taken up the discourse of colorism within the context of AI and how deepfakes on social media have perpetuated the notion of inherent racist bias, especially in Brazil's cyberspace public sphere. On the other hand, Monica L. Ponder, Trayce Leak, and Kalema E. Meggs introduce the reader to a new concept in Chap. 4 called fanship. They want readers to look at this growing phenomenon in various digital spaces where digital fanship and fandom are gaining traction. That traction is further complicated by the presence of deepfakes which, according to the authors, pose threats to veracity and reality. It is that same theme that is gleaned throughout Chap. 5, where Nancy Mayoyo has examined the vexing issue of body image perceptions among Kenyan female university students in cyberspace. She has unearthed data showing how these female students seem to be conflicted by online body image perceptions that run contrary to their original, pre-social media self-image. Similarly, in Kenya, the concept of altered reality (in Chap. 6), as seen in the African Griot, has been beautifully analyzed in the context of digital media communication in the age of hyperreality triggered by online spaces of communication. According to Oby Obyerodhyambo and Wambui Wamunyu, during the COVID-19 pandemic, when most people in Kenya took to cyberspace communication, this phenomenon created tensions because people were no longer giving accurate representations of who they said they were, especially within the context of health communication. In Chap. 7, Jean Claude Kwitonda and Symone Campbell have revisited McLuhan's dictum of the “medium is the message” by stretching the expression to become “the medium is the message/massage” because of the plethora of media that now exist, as opposed to six decades ago when he came up with that expression. The presence of deepfakes and synthetic media in digitally mediated contexts, according to Kwitonda and Campbell, is the reason why the postmodern media consumer finds herself or himself in a perpetual state of doubt as to what is real and what is unreal. Moving from health communication to sports communication in Chap. 8, Unwana Samuel Akpan and Chuka Onwumechili have pinpointed that video manipulation and video tampering can have serious repercussions, especially in the sports world. To them, this vexing issue can affect fair play and objectivity in any context where competitive sports are present. Masaki Magack in Chap. 9 takes the reader to Twitter to show how Kenyans on Twitter (KOT) used the platform, especially during the COVID-19 pandemic, to spread misinformation, leading to several repercussions for some companies, as some of their hashtags were used by infiltrators to spread news about them. The last chapter, Chap. 10, by Unwana Akpan, takes the reader back to the African cultural symbol known as the “Akata night masquerade.” The author seeks to show that reality manipulation did not originate with the emergence of digital modes of communication in the twenty-first century: the offline world of deepfakes preceded online deepfakes, thereby challenging all of us to be extra careful in discerning truth from falsehood in any given context.

References

Adebayo, K. O. (2021). Racial Discrimination in Uncertain Times: Covid-19, Positionality and Africans in China Studies. Journal of African Cultural Studies, 33(2), 174–183.
Alter, I. (2019). Populist Times and the Perils of ‘Neutral’ Journalism. Center for Journalism Ethics, 1(1), 1–22.


Anderson, K. E., & Tompkins, P. S. (2015). Practicing Communication Ethics: Development, Discernment, and Decision-Making. Routledge.
Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to Ideologically Diverse News and Opinion on Facebook. Science Magazine, 348(6239), 1130–1132.
Berg, A. C., & Betcheva, R. (2014). Legal Focus for EBU Principles for Election Coverage in New and Developing Democracies. The European Broadcasting Union.
Daniels, J. (2013). Race and Racism in Internet Studies: A Review and Critique. New Media & Society, 15(5), 695–719.
Deshpande, A., & Wahab, A. (2019). Alternative Media: Choice of Unheard Voices in India. In International Conference on Media Ethics.
Gray, K. L. (2012). Intersecting Oppressions and Online Communities. Information, Communication & Society, 15(3), 411–428.
Johnson, B. G., & Kelling, K. (2018). Placing Facebook: ‘Trending,’ ‘Napalm Girl,’ ‘Fake News,’ and Journalistic Boundary Work. Journalism Practice, 12(7), 817–833.
Kress, T. M. (2009). In the Shadow of Whiteness: (Re)exploring Connections Between History, Enacted Culture, and Identity in Digital Divide Initiatives. Cultural Studies of Science Education, 4(1), 41–49.
Langmia, K. (2021). Black Lives and Digi-Culturalism. Lexington Press.
Mbabazi, D. (2021). The Media Coverage of the Omicron Variant is Biased Against Africa. Daily Nebraskan. https://www.dailynebraskan.com/opinion/opinion-the-media-coverage-of-the-omicron-variant-is-biased-against-africa/article_c687d9b6-53df-11ec-b22b-0f3ad600f849.html
McChesney, R. W. (2014). Blowing the Roof off the Twenty-First Century: Media, Politics, and the Struggle for Post-Capitalist Democracy. Monthly Review Press.
McCombs, M. (2004). Setting the Agenda and Public Opinion. Cambridge: Polity Press.
McLuhan, M. (1975). Understanding the Media: Extensions of Man. London and New York.
Pressman, M. (2018, November 5). Journalistic Objectivity Evolved the Way It Did for a Reason. Time. Retrieved July 12, 2021, from https://time.com/5443351/journalism-objectivity-history/
Radoli, O. L. (2019). Narratives of Migration and Development as Discourses in Transnational Digital Migrant Media: The Case of Kenyan Migration to Europe. Retrieved September 2022, from https://opus4.kobv.de/opus4-btu/
Radu, R. (2020). Fighting the ‘Infodemic’: Legal Responses to COVID-19 Disinformation. Social Media and Society, 6(3), 1–4.
Spohr, D. (2017). Fake News and Ideological Polarization: Filter Bubbles and Selective Exposure on Social Media. Business Information Review, 34(3), 150–160.


Wasserman, H. (2020). Fake News from Africa: Panics, Politics, and Paradigms. Journalism, 21(1), 3–16.
Williams, K., & Stroud, S. R. (2020, July 24). Objectivity in Journalism. Center for Media Engagement. Retrieved July 12, 2021, from https://mediaengagement.org/research/objectivity-in-journalism/

CHAPTER 2

The African Youth and Social Media at the Crossroads of Information, Misinformation, and Disinformation

Mohamed Saliou Camara, Hyeladzirra Banu, and Jean Claude Abeck

Introduction

The far-reaching influence of social media in interpersonal, intra-, and inter-group communication has become a fully accepted fact and a way of life. This new way of life evolved from Six Degrees, the first identifiable platform, created in 1997, which enabled users to create a profile page and connect with and befriend other users with common interests. The growth of the social media sector coincided with the emergence of the generations known as Millennials (people born between 1981 and 1996) and Gen Z (short for Generation Z, or people born between 1997 and 2012). This coincidence (assuming that it is one) has a particular significance in and for Africa, because of the exponential growth of the continent's youth population. In fact, according to the Fact Sheet prepared in support of the 2010–2011 International Year of Youth, Africa is the world's youngest continent, with 70 percent of its population being under the age of 30 in 2010. This means that nearly 70 percent of Africa's population were Millennials or Gen Z (United Nations Programme on Youth (UNPY), 2011). Since then, the youth bulge has continued unabated, to the point that as of September 2020, African Union data published in conjunction with the Africa Youth Month 2020 estimated that 75 percent of Africa's 1.2 billion inhabitants were under the age of 35 and that 453 million Africans were aged 15–35 years (African Union, 2020). With such a large and increasing youth demographic in the world's second largest and second most populous continent, which also happens to contain one of the fastest growing numbers of cell phone and social media users, social media is poised to influence the continent in no small ways. Such influence has positive as well as negative dimensions, potential and actual alike. This is so especially in view of the proven existence and proliferation of fake news and similar forms of misinformation and disinformation on the social media platforms to which young Africans are attracted. Hence, the authors of this chapter study the increasing popularity of social media in Africa, particularly among the continent's youth population, with a focus on the productive and counterproductive impacts of social media usage. They do this by presenting and analyzing pertinent perspectives of young African activists, entrepreneurs, and other social media influencers with a view to properly contextualizing the information-misinformation-disinformation imbroglio that young Africans face daily while on Facebook, Twitter, WhatsApp, and similar platforms. The working hypothesis of this study is threefold. First, just like most tools of human communication, social media possesses a complex potential to yield constructive outcomes if used properly and purposefully. Second, just like all means of group and mass communication generated and controlled by a privileged elite that garners income, power, and influence from the manipulation of those means with little to no regulatory oversight, social media has the power to harm its users in a variety of ways. Third, Africa is particularly poised to benefit from social media just as it is extremely vulnerable to its manipulative tentacles, because of its unique appeal to young people and the continent's massive and growing youth population.


Correspondingly, the following central questions are addressed throughout the chapter: What are the constructive aspects and outcomes of social media usage by young Africans? What are the destructive aspects and outcomes of the said usage? How vulnerable are young Africans to social media-generated fake news and its corollary of misinformation and disinformation, and why? What, if anything, are African leaders doing, and what can they do, to further augment the constructiveness of social media usage and mitigate its destructiveness? This study is anchored in a dual theoretical framework: situational extrospection and intentional rectification, two theories that Camara has been experimenting with and developing in his scholarship not only on media communication, but also on transformational leadership and integrative citizenship.1 According to the situational extrospection theory, external predatory forces create attractive situational trends that they leverage in an attempt to rob individuals of their agency and turn them into powerless consumers by manipulating their vulnerability to the insidious facets of the trends. Such predatory forces include persons, groups, and organizations that generate and propagate fake news and other types of false information; hackers and hoaxsters; cyberbullies; and so on. Scholarly research and studies2 by reputable media organizations such as CNN, BBC, AFP, Africa News, and The Times of Israel3 have demonstrated that although these predatory forces fail to turn their targets into passive receptacles of disinformation and misinformation, they do influence sociopolitical processes. These processes include, but are not limited to, high-stakes elections, the proper management of the COVID pandemic, youth engagement in national and regional programs, and conflict prevention in the Sahel region. Accordingly, a judicious study of such trends and their full impacts on the target population ought to be extrospective in essence. In other words, the researcher must prioritize a deconstructionist study of the messaging system employed by the external forces to influence the target population and, as much as possible, a critical analysis of the motive behind the deployment of the system in question. As for the intentional rectification theory, it argues that individuals and groups that are at the receiving end of the manipulative schemes of social media generators still possess the power to rectify the destructive effects of those schemes by being intentional in their usage of the platforms, with a view to turning them into tools of constructive behavior that can produce positive outcomes (Camara, 2008, 210–229). The authors researched the chapter using a content study of materials extracted from social media platforms and Internet outlets such as YouTube, including interviews and documentaries generated by mainstream mass media, and a critical analysis of the relevant scholarship. This secondary-source analysis is complemented with questionnaire-based interviews and less formal personal communications with stakeholders on the ground. The statistical data and qualitative materials thus collected are analyzed with due consideration of perspectives expressed by young African social media influencers, entrepreneurs, and activists.

1  On these theories, see also Mohamed Saliou Camara, “Media, Civil Society and Political Culture in West Africa,” in Ecquid Novi: African Journalism Studies, Volume 29(2), pp. 210–229; “No Effective Leadership without Responsible Citizenship: Exploring the Need for Accountability in Twenty-First Century African Governance,” and “Integrative Citizenship, Collective Self-Reliance, and Optimal Strategic Independence for a Second Reclamation of Africa,” in the upcoming book Politics Matters: African Leadership for the Twenty First Century, edited by Akwasi P. Osei, Peter Lang Publisher.

2  On this see “‘Fake News’ and Cyber-Propaganda in Sub-Saharan Africa: Recentering the Research Agenda” by Admire Mare, Hayes Mawindi Mabweazara & Dumisani Moyo, in African Journalism Studies, 2019, 40(4), 1–12; “Mapping Fake News in Africa” by the Africa Center for Strategic Studies, April 26, 2022; “An Exploratory Study of ‘Fake News’ and Media Trust in Kenya, Nigeria and South Africa” by Herman Wasserman & Dani Madrid-Morales, in African Journalism Studies, 2019, 40(1), 107–123. Also see research reports on the uneasy social media-mass media relations in Nigeria, South Africa, Kenya, Egypt, and Somalia, followed by case studies of Swaziland, Mozambique, and Zimbabwe, in Journalism and Social Media in Africa: Studies in Innovation and Transformation. Edited by Chris Paterson. London and New York: Routledge, 2015.

3  See “Russia’s ‘troll factory’ is alive and well in Africa” by Mary Ilyushina of CNN, November 1, 2019. Accessed November 17, 2022, at https://www.cnn.com/2019/10/31/europe/russia-africa-propaganda-intl/index.html; “A Year in Fake News in Africa” by Ashley Lime, BBC Africa, Nairobi, 12 November 2018. Accessed November 17, 2022, at https://www.bbc.com/news/world-africa-46127868; “Fake News Floods Sahel as Disinformation Wars Escalates” by Africanews with AFP, February 15, 2022. Accessed November 16, 2022, at www.africanews.com/2022/02/15/fake-news-floods-sahel-as-disinformation-wars-escalate/; and “Facebook Says Israeli Company Used Fake Accounts to Target African Elections” by Donie O’Sullivan and Hadas Gold, CNN Business, May 18, 2019. Accessed November 16, 2022, at https://edition.cnn.com/2019/05/16/tech/facebook-takedown-israeli-company/index.html.


The chapter is structured in three sections. Section “Youth Bulge and Social Media Usage in Africa” discusses the conceptual and empirical meaning of “youth bulge” in the African context and examines meaningful ways in which youth social media usage brings to bear authentic longstanding African approaches to the discovery and implementation of knowledge. This conceptual and epistemological contextualization is followed by two sections in which the authors delve into the real-life, real-­ time impacts of social media in the personal, communal, academic, and professional lives of young Africans and, by extension, the public sphere of the society in which they exist and function. Thus, section “The Productive Dimensions of the Youth-Social Media Nexus” discusses the notion of engagement versus information that many young Africans have espoused by way of minimizing the use of mainstream mass media and maximizing engagement through social media. From that standpoint, the section explores how social media has been instrumental in furthering innovation, entrepreneurship, and contribution to development among the African youth. That is followed by an exploration of African youth political engagement and human capacity building through social media. Section “The Counterproductive Dimensions of the Youth-Social Media Nexus” presents and analyzes the negative impacts of misinformation, disinformation, and cyber bullying. Lastly, within the frame of the conclusion, the authors offer recommendations on ways in which African stakeholders can and should optimize the productive dimensions of the youth-social media nexus and abate the counterproductive dimensions of it.

Youth Bulge and Social Media Usage in Africa

The concept of youth bulge seems to convey a neutral characterization of a somewhat disproportionate or unprecedented growth of the youth demographic in a historical period of a given region of the world. However, a mere statistical interpretation of the concept can easily obscure the loaded narrative in which youth bulge in the so-called developing world is packaged. This is particularly true when one considers the economic, social, and even cultural connotations that references to “youth bulge” carry in contemporary foreign discourses aimed at Africa, the continent that much of the world depicts, rightly or wrongly, as “the world's poorest continent” and the “weakest link” in the global chain. Indeed, as mentioned earlier, Africa has been recognized as the world's youngest continent, with 70 percent of its population being under the age of 30 as of 2010 (UNPY, 2011), and an estimated 75 percent of its 1.2 billion inhabitants being under the age of 35 as of 2020 (UNPY, 2011). This sustained explosion of the youth demographic, coupled with increasing unemployment rates across the continent and its corollary of massive youth migrations to Europe, North America, and beyond, has prompted pundits and observers alike to equate Africa's bulge with youth crisis. One must look beyond this superficial interpretation if one is to gain a properly contextualized understanding of the dynamics of the youth-social media nexus in Africa. It would be disingenuous to downplay the challenges that the interfacing of youth demographic explosion, widespread unemployment, mass migration, and the vulnerability of African youths to the predatory activities of rebellious groups, terrorist organizations, and human traffickers poses for Africa. The multilayered challenges are real and getting more and more pronounced as the trend persists. Nevertheless, reducing Africa's youth bulge to this doom-and-gloom facet would be missing a crucial component of the dialectical essence of the phenomenon; namely, the fact that youth bulge has two interdependent dimensions: bulge burden and bulge dividend.

Youth Social Media Usage in Context

One critical aspect of bulge dividend related to social media usage in Africa is that, by and large, the engagement of young Africans on the most popular platforms adheres to a longstanding African epistemological philosophy and practice: the utmost value of knowledge—whether emanating from endogenous or exogenous sources, whether theoretical or technological/practical—resides in the ability of its human carriers and implementers to apply its powers toward solving real problems and advancing human well-being. This is what Camara attempts to elucidate in a study of knowledge and theory of knowledge in the African experience when he writes: “African knowledge systems place a strong emphasis on life experience, practicality and practicability because they are geared toward practical problem solving” (Camara, 2014, 76). In other words, it is true that, like most youths in other parts of the world, young Africans engage in activities that individuals of older generations may view as mere gaming, gossiping, and the like. However, and quite significantly, they also purposefully use social media as a resource for knowledge discovery, knowledge creation, knowledge dispensation, and knowledge implementation in accordance with problems to be solved, visions to be pursued, projects to be planned, and goals to be attained that are germane to African needs, challenges, and prospects. With this deliberate and utilitarian approach, sizeable and still growing numbers of young Africans have embarked on innovative journeys in business, education, social empowerment, political engagement, and so on, as amply demonstrated in the next section.

From this standpoint, it is fair to aver that young Africans seem to heed the alarm that analysts like Roberto Bissio have raised concerning what they consider Global North multinational corporations' abusive manipulation of intellectual property rights in the digital age. Bissio makes the point by noting that traditionally the knowledge and technological innovations embedded in seeds and medicines were exchanged between communities of healers in a process of mutual benefit. He contrasts that with multinational corporations crying foul because, by detaching the information component from modern industrial products where this information is embedded, ICTs challenge the privatization of information that the industrial system has used as a key motivator for innovation (Bissio, n.d.).

All 54 Africans on the upper end of youth (20–35 years of age) consulted for this study showed keen awareness of the knowledge economy doctrine, and about 60 percent of them possess academic and/or professional understanding of its pillars and ramifications for Africans of their generation. The commodification of higher education and vocational training, academic knowledge production, and scientific innovation by corporations, institutions of higher education, and related stakeholders in the developed Global North does not surprise or even bother many of these youths. What bothers them is the fact that, in the middle of this multifaceted commodification, the same developed Global North further intensifies its exploitative penetration of African resources, a pattern that the knowledge economy is, in many regards, designed to perpetuate. Our young African research interlocutors explained their frustration with what they consider double talk and a double standard by contrasting the World Bank's definition of knowledge economy with the actual commercialization of academic research and basic science which, according to researchers, is rooted in governments seeking military advantage (Irzik, 2013).

To be sure, the World Bank articulates the knowledge economy through the following four pillars: (1) Education and Training, which stipulates that an “educated and skilled population is needed to create, share and use knowledge”; (2) Information Infrastructure, meaning that “a dynamic information infrastructure—ranging from radio to internet—is required to facilitate the effective communication, dissemination and processing of information”; (3) Economic Incentive and Institutional Regime, according to which a “regulatory and economic environment that enables the free flow of knowledge, supports investment in Information and Communications Technology (ICT), and encourages entrepreneurship is central to the knowledge economy”; and (4) Innovation Systems, which implies that a “network of research centers, universities, think tanks, private enterprises and community groups is necessary to tap into the growing stock of global knowledge, assimilate and adapt it to local needs, and create new knowledge” (World Bank, n.d.).

Young Africans have been reclaiming and defending their “rights” to fair access to ICT resources that foreign corporations established on the continent manipulate to make higher than reasonable profits while depriving African users of the freedom to fully enjoy the services for which they are charged exorbitant fees. One such instance of assertive reclamation and defense occurred in March 2022 in Guinea and Niger. In Guinea, the movement was launched under the name Mouvement Boycott Orange Guinée when a young activist by the name of Aboubacar Camara circulated a petition denouncing the excessive subscription fees and related unfair tariffs that the French telecom company Orange Guinée had imposed on Guinean customers for years, and calling for nothing short of bringing the powerful corporation to its senses or to its knees. The petition underscored, among other abuses, the fact that Orange was charging Guinean subscribers 18 euros for a 15-gigabyte connection that does not even last a week. Some of the petition signatories expressed their reasons for signing. Abdoulaye Djigué Barry said simply and uncompromisingly, “Down with Orange Guinée's excessive fees!” Mohamed Lamine Camara stated, “Orange Guinée scams us with their fees.” Paul Sampil elaborated a little more, “I am signing this petition to express my anger over the thefts and deceptions that Orange Guinée inflicted on us, its clients.” Affane Diomandé declared vehemently that “Orange Guinée scams the Guinean people.” Hadjiratou Baldé said, “I am signing because I am suffering,” and Daniel Philippe de Sainte Marie castigated Orange Guinée's “faulty network, its credit and password thieving, and its spreading of deceptive ads.” Gilbert Tounkara followed the early actions of the Boycott Movement and the quick reactions of Orange Guinée in a short article published on March 6, 2022, noting that under the mounting pressure, Orange Guinée made swift double-talk concessions by lowering some of its subscription fees while maintaining or even raising others in the critical area of e-banking. Tounkara offers Table 2.1 showing the fees for receiving money through the company's electronic money transfer service known as Orange Money, enacted on March 6, compared to the previous rates.4

Table 2.1  Orange Guinée's money transfer tariffs

Amount received in Guinean francs    Previous rates    New rates
100,000                                        5,000        2,000
300,000                                        8,000        6,000
500,000                                       12,000       10,000
1,000,000                                     20,000       20,000
2,000,000                                     34,000       30,000
4,000,000                                     46,000       40,000
6,000,000                                     55,000       60,000
8,000,000                                     68,000       80,000
10,000,000                                    90,000      100,000
Total                                        338,000      348,000

Source: Gilbert Tounkara, “Opération de charme d'Orange Guinée: le mouvement de boycott pas satisfait et s'en prend à certains influenceurs pour…” in Guinée Fun Show, March 6, 2022. Retrieved June 5, 2022, from guinéefunshow.com

It is easily noticeable in this table that Orange Guinée manipulated the numbers to appease the majority of Orange Money users, whose transactions generally fall within the lower tiers of 100,000–400,000 GNF, while recuperating its “losses” from those of the upper tiers of 6,000,000–10,000,000 GNF. No wonder, then, as Tounkara and many other observers pointed out, Orange Guinée's fake concessions fooled no one inside the Mouvement Boycott Orange Guinée, which continued to pressure the powerful company on various fronts. Ultimately, the president of Guinea's junta transition government, Colonel Mamady Doumbouya, addressed the issue on June 8, 2022, and that was followed by an executive decision by which his government reversed Orange's newly enacted high money transfer tariffs (ARPT-RG, 2022). Meanwhile, Orange has been facing similar unwavering challenges from its subscribers in Senegal and Niger who have vehemently denounced
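The redistribution described above is plain arithmetic on the Table 2.1 figures. The short sketch below is an illustrative calculation, not part of the original study; it simply computes the per-tier change in fees to show that the cheaper tiers fell while the upper tiers, and the schedule as a whole, became more expensive.

```python
# Per-tier change in Orange Money receiving fees (GNF), using the figures in Table 2.1.

tariffs = {              # amount received: (previous rate, new rate)
    100_000:    (5_000,   2_000),
    300_000:    (8_000,   6_000),
    500_000:    (12_000,  10_000),
    1_000_000:  (20_000,  20_000),
    2_000_000:  (34_000,  30_000),
    4_000_000:  (46_000,  40_000),
    6_000_000:  (55_000,  60_000),
    8_000_000:  (68_000,  80_000),
    10_000_000: (90_000, 100_000),
}

for amount, (old, new) in tariffs.items():
    change = new - old
    trend = "cheaper" if change < 0 else ("unchanged" if change == 0 else "more expensive")
    print(f"{amount:>10,} GNF: {old:>7,} -> {new:>7,} ({change:+,}, {trend})")

# Totals across the listed tiers: 338,000 GNF before vs 348,000 GNF after,
# i.e. the schedule as a whole became 10,000 GNF more expensive.
print(sum(o for o, _ in tariffs.values()), sum(n for _, n in tariffs.values()))
```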

24 

M. S. CAMARA ET AL.

what many depict as digital neocolonialism on the part of the French company through its local African subsidiaries. In both countries, as in Guinea, the youth are taking the lead in the struggle for the decolonization of digital information and communication that has become vital for people of their generation. In fact, the author of this section heard in Dakar (Senegal) what a young female activist told him earlier in Guinea. She stated in our conversation: With all due respect to you our elders, you must understand that the youth of my generation are categorically opposed to all forms of neocolonialism, including those being sinuously pushed through science and technology, not to forget literature, cinema, and the like. We in Francophone Africa in particular, are fed up with the shenanigans that older generations would not or could not nip in the bud and tell France: “enough is enough”, our national flags are not mere ornament in front of our embassies around the world; they represent the freedom and sovereignty that our forebearers fought for.5

With contagious and inspiring determination and vision in her voice and body language, the young lady proceeded to underscore the vision of today’s young African women within the struggles of the African youth in general. Let me tell you, we the young African women of today understand that women are the power behind our domestic economic life in the families and the communities, even if we have yet to be fully present at the national and international levels. We further understand that the internet and social media can be potent instruments for economic growth and liberation for our families and communities if we can empower ourselves by becoming innovative contributors to a transformative application these technologies in the continent. We can and we will do that, and no one will stop us; the cat is out of the bag. We will fight with our African brothers, and not against them; for, there is too much at stake for our continent in this struggle and no one group can do it alone.6

5  Hadjiratou Baldé is a Guinean college student and a social media activist focused on the empowerment of youth and women. Personal communication with Mohamed Saliou Camara. Conakry, June 4, 2022. 6  Ibid.

2  THE AFRICAN YOUTH AND SOCIAL MEDIA AT THE CROSSROADS… 

25

In the meantime, Orange Money is meeting its match in the form of an African built electronic platform called Wave. The stated ambition of Wave is “to make Africa the first cashless continent.” Its founders explain their motivation as follows: “When mobile money succeeded in Kenya, it lifted about a million people out of poverty. And yet, over 10 years later, most Africans still lack access to affordable ways to save, transfer or borrow the money they need to build businesses or provide for their families. Wave is solving this problem by using technology to build a radically inclusive and extremely affordable financial network” (Wave, n.d.). The young visionaries further comment that in “a continent where over half the population has no bank account, Wave is building the first modern financial network—no account fees, instantly available, and accepted everywhere!” (Wave, n.d.). A fundamental challenge facing Africa today and in the future is whether Africans will move the continent beyond the age-old handicap of Western envelopment perpetuated through imitation and toward true development that is hinged on utilitarian innovation and collective self-reliance, or will we continue to become what Glenn Sankatsing has aptly dubbed “trailer society”. This question is scrutinized in the next pages that correspond to the second and last part of the present section. “Development” Versus “Envelopment” of Africa in the Digital Age The thought-provoking concepts of “development versus envelopment,” “trailer society,” and a few correlated ones used in this chapter were borrowed from an inspiring lecture by Glenn Sankatsing titled “The Caribbean Between Envelopment and Development” (Sankatsing, 2003). Referring to the genesis of the Caribbean societies whose very existence is rooted in the enslavement of African people, Sankatsing argues that the “basic principle of continuity and internal dynamism, underlying all processes of evolution and development, in nature as well as history, was absent in the genesis of our societies that was the product of structural discontinuity rather than self-realisation” (Sankatsing, 2003). He goes on to underscore the perversity with which the West has cannibalized the history, civilization, self-realization, and self-valuation of other peoples, beginning with Africans and their descendants: The history of the last five hundred years of humanity can be summarised in one single phrase as the globalisation of the local experience in the West that turned all other human settings into ‘trailer societies’, towed not toward

26 

M. S. CAMARA ET AL.

their own destiny but toward the destiny and teleology of the West, whose global mission was not to impart, but to collect. Colonialism, therefore, was not a regrettable accident, but a requirement. (Sankatsing, 2003)

This perversion entailed a cynical and self-serving logic of globalization of Western interests, ideologies, values, and domination through what can be termed global envelopment of the Rest by the West which in turn necessitated a fivefold abolition. First, abolition of context to envelop and ultimately vitiate the collective selves of the Rest vis-à-vis the West. Second, abolition of culture that was to posit and impose the culture of the West as the only viable civilized way forward for the Rest. Third, abolition of evolution that was to inculcate into the Rest that because Western civilization is the spear point of human evolution, it stood alone as an ultimate achievement that the Rest must aspire to reach and is destined to reach some day, either by its own efforts or by the blessing of imitation and mimicry. Fourth, abolition of internal social dynamism to deprive the indigenous Rest of the endogenous command over the civilizational engine of creation, self-propulsion, and development. Fifth, abolition of history based on the axiom that universal history coincides with the genealogy of the West and that all experiences not directly connected to or derived from that genealogy is ahistorical (Sankatsing, 2003). In this chapter, we argue that the perversity of the thus-summarized fivefold abolitionist globalization agenda is as present in today’s Africa as it was in the height of colonialism, if for no other reason because meta-­ colonialism (i.e., the framework of the current relations between Africa and the West, and increasingly with non-Western powers such as China) utilizes far more subtle, attractive, and pervasive tools, not least of which are the new information and communication technologies (NICTs). Because meta-colonialism, the third phase of foreign domination after classical colonialism and traditional neocolonialism, involves deceptive subtleties that meta-colonialists manipulate to enlist the “voluntary” and “eager” participation of the meta-colonized in the implementation of the abolitionist globalization agenda, every time Africans use and adopt Western technology and the like, we are walking on one of the sharp edges of a double-edged sword. The only way that we can appropriate such technologies and not get cut by the sword is if we manage to flip it on its side, turn the edges toward its makers, and apply an innovative and inwardly utilitarian usage of the technologies. In other words, we should use the

2  THE AFRICAN YOUTH AND SOCIAL MEDIA AT THE CROSSROADS… 

27

NICTs without losing ourselves to the meta-colonial envelopment behind them; we should appropriate the NICTs as problem-solving tools that contribute to enhancing contemporary African inventiveness and innovativeness. Such seems to be at the core of the exhibit called Singulier Plurielles— dans les Afriques contemporaines that Beninese architect Franck Houndégla organized in June-July 2022  in France to showcase authentic African innovations. The innovations included fashion designs by Nigerian designer Bubu Ogisi that tell genuine stories of African civilization often muted in Western literature; automobiles designed and built by Malagasy innovator Karenjy who builds cars in Madagascar that can safely run on all kinds of roads in his country known for its heterogenous landscape; the agricultural/environmental project called Centre Songhaï spearheaded by the Dominican priest Reverend Godfrey Nzamujo whose objective is to device innovative agricultural means and methods to solve problems related to overpopulation, food shortages, and environmental challenges (Carlo, 2022). In its reporting on Houndégla’s exhibit, the French Magazine Jeune Afrique goes beyond the event and covers further innovations by Africans across the continent that deserve to be acknowledged as major steps in the right direction toward countering meta-colonial envelopment and advancing African development and optimal strategic sovereignty in the twenty-­ first century. One is that of 33-year-old Nigerian sculptor John Amanam, a special effect specialist for the Nigerian movie industry Nollywood, who, in response to a tragedy in which his brother lost a hand, invented an artificial hand whose finger movements are controlled by the natural muscles of the arm to which it is connected. In addition to being afflicted by his brother’s struggles after the accident, Amanam was also worried that he could not find an artificial hand that matches his skin color. Hence, he applied his existing skills to solve the dual problem: make a working artificial hand that matches his brother’s skin color. According to Jeune Afrique, his invention is now in high demand not only in Africa but abroad as well, which has prompted him to create Immortal Cosmetic Art, a company based in Uyo, Nigeria, and specialized in building prosthetics (Michel, 2022). Many more African inventions have been showcased in this exhibition and similar international forums by African and/or international organizations focused on debunking the false narratives presenting Africa as the “dark continent” that European colonialists painted in their dehumanizing rhetoric for decades if not centuries.

28 

M. S. CAMARA ET AL.

It is worth stressing, however, that the underlying objective in advancing African inventiveness and innovativeness is not limited to refuting false narratives about Africa. Instead, it is primarily about improving Africans’ well-being, promoting internally generated long-term development and collective self-reliance that will liberate future generations from the shackles of whatever is meant by poverty and underdevelopment. Therefore, although this chapter is about digital information and communication, which implies messaging and messengers, reframing the “‘Development’ Versus ‘Envelopment’ of Africa” paradigm in the digital age must be understood first and foremost as a vision and a mission aimed at imparting concrete changes and producing tangible outcomes in the durable economic, scientific, technological, social, and political development of Africa.

The Productive Dimensions of the Youth-Social Media Nexus Debating the impact of social media on information access, innovation, development, government service delivery, and public attitudes is an ongoing task. Because technology is constantly changing amidst a growing and dynamic youth demographic, these discussions must keep pace and evolve. Within this context of the ever-changing social media landscape, it is almost impossible to fully understand the productive dimensions of social media on any population (in this case, the African youth) outside the framework of a specific place and time. In other words, we can best measure social media productivity among youths on a case-by-case basis. For example, when one examines the youth-social media nexus in correlation with the type of government (democratic or authoritarian) under which the nexus evolves, one can better appreciate the level of freedom that social media users enjoy and, in turn, the extent of social media influence and productivity. Thus, what may be true for Kenya and Senegal may not necessarily apply to Nigeria and Cameroon, and vice versa. Also, when we consider the fact that politics and governance in Africa are not static, we realize that what may have been true for a given country in 2019 may not be the same in 2022. The difference is a function of the dynamic and dissimilar sociopolitical, economic, and cultural contexts in which we find ourselves at any given point in time and space.



Consequently, this section relies on experiences based on individual countries and scenarios to assess the productive dimensions of the African youth engagement with social media. In doing so, we consider every social media assessment as a snapshot in time and space, wherein every effort toward generalization is carefully investigated and scrutinized for relevant variations. More precisely, to understand African youth productivity on/ with social media, we must ask some pertinent questions: what is the nature of social media usage among the world’s youth? How can we measure productivity on social media? Are the youth simply more engaged or informed on these platforms? How have innovation and development changed with the usage of social media? Are these changes consistent across similar and or dissimilar socio-economic and political environments? We will provide some relevant descriptive statistical context about social media use among Africans to answer these questions. Overview of Social Media Use in Africa Generally, Sub-Saharan African youths and the broader public see Internet connectivity and social media usage as productive and interrelated activities. Internet connectivity directly impacts the degree to which people subscribe to social media platforms. According to the Pew Research Center, Sub-Saharan Africa has experienced dramatic gains in Internet usage in recent years. Table 2.2 is based on a Pew Center research article that indicates Africans’ views toward Internet productivity based on a six-­ country median total sample in Ghana, Kenya, Nigeria, Senegal, South Africa, and Tanzania (Silver & Johnson, 2018). Table 2.2  How the Internet is viewed in Sub-Saharan Africa Areas of focus Education The economy Personal relationships Politics Morality

Good influence

No influence

Bad influence

79% 63% 62% 52% 45%

5% 10% 8% 12% 11%

13% 16% 22% 27% 39%

Source: Table created by the author (Jean Claude Abeck) with data from the Pew Research Center. Retrieved May 24, 2022, from https://www.pewresearch.org/global/2018/10/09/ internet-­connectivity-­seen-­as-­having-­positive-­impact-­on-­life-­in-­sub-­saharan-­africa



As of May 2022, 81.8 percent of Africans on social media were Facebook subscribers, compared to 9.6 percent on YouTube, 3.93 percent on Twitter, 2.34 percent on Pinterest, 1.94 percent on Instagram, and 0.16 percent on Reddit (Statcounter GlobalStats, 2022). These data are best understood when considered within the experiences in individual countries. According to Victor Oluwole on Business Insider Africa, “Nigeria has the most addicted internet users in Africa, with the average user spending 3 hours and 42 minutes on social networks every day” (Oluwole, 2022). South Africa comes second with users spending on average 3 hours and 37 minutes scrolling through social media sites—followed by Ghana where people spend an average of 3 hours and 20 minutes on the Internet and/or social media (Table 2.3). Generally, Northern and Southern African countries had the largest share of social media users in Africa compared to their Central, Eastern, and Western African counterparts. In Northern and Southern Africa, 56 percent and 45 percent of the population, respectively, used social media between 2021 and May 2022, whereas in Central Africa only 8 percent of the people did, making social media usage in Central Africa the lowest regional share in the continent and worldwide (Statcounter GlobalStats, 2022). As revealed in the mapping in Fig. 2.1, however, youth engagement on social media, whether in Africa or elsewhere, is a function of Internet penetration in a particular place and time. The implication is that social media usage depends on the availability of Internet connectivity. Therefore, the youth-social media nexus boils down to the degree to which a given country is connected to the Internet. Figure 2.1 illustrates the Internet penetration of select African countries: Senegal, the Democratic Republic of Congo, Uganda, Ethiopia, Nigeria, Rwanda, Burundi, Botswana, Malawi, Kenya, Zimbabwe, and Tanzania. Increased citizen activity and civic engagement on the Internet and in social media is by and large good for a nation, but authoritarian governments often see it as an obstacle to their pursuit of bad governance, because civic engagement imposes some degree of transparency and sheds light on the actions of such governments. In fact, the political culture of citizen oversight that we now see online during elections in places like Nigeria, Egypt, and South Africa emanates not just from political elites but also from citizen activists who are keen to document human rights abuses with their mobile phones, share spreadsheets to track state expenditures, and pool information about institutional corruption.

Table 2.3  Social media statistics in Africa between June 2021 and May 2022 (percentage share)

Date      Facebook  YouTube  Twitter  Pinterest  Instagram  Tumblr  LinkedIn  Reddit  VKontakte
2021-06   66.18     11.71    18.32    2.37       1.03       0.20    0.09      0.07    0.03
2021-07   63.21     10.83    21.16    2.54       1.61       0.39    0.12      0.10    0.03
2021-08   71.65     10.58    13.34    2.37       1.34       0.50    0.11      0.07    0.03
2021-09   76.61     9.50     8.61     3.03       1.60       0.35    0.13      0.09    0.06
2021-10   74.88     10.96    7.12     4.13       2.20       0.30    0.16      0.15    0.09
2021-11   81.02     9.21     3.50     3.29       2.42       0.14    0.15      0.19    0.06
2021-12   83.46     9.11     3.06     2.34       1.60       0.12    0.13      0.12    0.04
2022-01   82.66     8.88     4.09     2.28       1.61       0.10    0.16      0.15    0.04
2022-02   70.38     22.72    3.67     1.56       1.39       0.04    0.09      0.11    0.03
2022-03   82.06     9.91     3.78     2.04       1.70       0.05    0.25      0.15    0.04
2022-04   75.65     15.29    4.23     2.41       2.01       0.06    0.16      0.14    0.04
2022-05   82.16     9.43     3.86     2.30       1.90       0.04    0.12      0.15    0.04

Source: Table created by the author (Jean Claude Abeck) with data from statcounter.com. Retrieved May 23, 2022, from https://gs.statcounter.com/social-media-stats/all/africa





Fig. 2.1  Internet penetration rates in 12 select African countries from 2000 to 2019. (Source: ICTworks, “Internet Freedom in Africa.” Retrieved May 22, 2022, from https://www.ictworks.org/wp-content/uploads/2022/01/Internet-Freedom-in-Africa.pdf)

Among the many tactics by which states interfere with social media usage at various severity levels is online and offline censorship. Censorship tactics range from shutting down political websites or portals (including by arresting journalists, bloggers, activists, and citizens) to exercising proxy control over Internet service providers in the name of countering objectionable content and, in the most extreme cases, shutting down access to entire online and mobile networks. Surprisingly, although authoritarian regimes, more than democratic ones, tend to impinge on entire networks, subnetworks, and nodes, democratic regimes are the most likely to target civil society actors by proxy by manipulating Internet



service providers. At any rate, governments exercise control by targeting entire networks (i.e., shutting down the Internet), subnetworks (i.e., blocking websites), network nodes (i.e., targeting individuals), and by proxy (i.e., pressuring Internet service providers). Ironically, this level of state disruption of the Internet (and to some extent, social media as well) is a reflection not only of the intensity of youth civic engagement, but also of African states’ desperate fixation with controlling that engagement. More ironically, still, experience shows that the more aggressive state interference becomes, the more resilient and purposeful youth activism gets in Africa. It is hoped and anticipated that as the youth of today grow and mature into the leaders of tomorrow, they will be fully prepared to transform their current resilience into actionable policy programs geared toward the furtherance of civic engagement, transparency, and good governance. Social Media, Youth Innovations, and Development Given the degree of Internet penetration and connectivity in Africa, as documented in Table 2.4, and the nature of social media usage, there are notable developments that are indeed positive and that support youth creativity, innovation, and development. The two significant developments include (1) increased and robust civic engagement and advocacy and (2) the development of e-Governance, innovation, and advanced service delivery. The discussion in this section is centered on these developments.

Table 2.4  Social media penetration and usage across Africa by region

Region             Population (est.)    Social media penetration    Est. no. of social media users
Central Africa         179,000,000               8%                         14,320,000
Eastern Africa         445,000,000              10%                         44,500,000
Northern Africa        246,000,000              45%                        110,700,000
Southern Africa         68,000,000              41%                         27,880,000
Western Africa         402,000,000              16%                         64,320,000
Africa, Total        1,340,000,000             ~15%                        197,400,000

Source: Saifaddin Galal, "Social Media Penetration in Africa by region 2022." Statista, June 1, 2022. Retrieved June 13, 2022, from https://www.statista.com/statistics/1190628/social-media-in-africa-by-region/#:~:text=As%20of%20February%202022%2C%20the,figure%20stood%20at%2045%20percent
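To make the arithmetic behind Table 2.4 explicit, the following minimal sketch (using only the figures reported in the table above) reproduces the regional estimates: estimated users are simply population multiplied by the reported penetration rate, and the table's continental total of roughly 197.4 million users corresponds to about 15 percent of Africa's estimated 1.34 billion people.

```python
# Regional population estimates and social media penetration rates from Table 2.4.
regions = {
    "Central Africa":  (179_000_000, 0.08),
    "Eastern Africa":  (445_000_000, 0.10),
    "Northern Africa": (246_000_000, 0.45),
    "Southern Africa": (68_000_000,  0.41),
    "Western Africa":  (402_000_000, 0.16),
}

# Estimated users per region: population multiplied by penetration rate.
for region, (population, rate) in regions.items():
    print(f"{region}: {population * rate:,.0f} estimated social media users")

# Continental share implied by the table's reported totals (about 15 percent).
print(f"Africa, Total: {197_400_000 / 1_340_000_000:.1%} of the estimated population")
```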


The proliferation and advancement of social media platforms and information technology have boosted civic engagement among young people in Africa. Facebook, Twitter, and YouTube in particular provide new opportunities through which youths acquire information that shapes their opinions about their sociopolitical environment. Above all, though, the presence of these and comparable social media platforms has created opportunities for making information available instantly and, in many cases, at little cost; as William Robert Avis puts it, it has been used to spread and share knowledge and ideas, fostering a robust civic engagement attitude among youth activists (Avis, 2017). This is exemplified in Nigeria, where youth activists harnessed Twitter to fuel significant protest movements, including the most recent #EndSARS, which played a major role in the ultimate curtailing of police brutality in that country (Apir, n.d.).

In South Africa and Zimbabwe, Admire Mare found that marginalized groups use Facebook as an alternative arena for discussions. Even politically active youths in those countries have political lives outside the dominant mediated public sphere. In contrast, these platforms often are viewed as "irrational" and "non-political" in mainstream Western literature. The findings reveal that youth activists in Zimbabwe and South Africa used Facebook to engage in traditional and alternative forms of political participation. In both countries, Facebook transmits civic and political information as a conduit for "online donations and fundraising, for contacting political decision-makers, as a political activism platform, as an advertising platform for social and political events, and as a platform for everyday political talk" (Mare, 2015). In other words, youth engagement on Facebook is beginning to shape the dynamics of citizen participation in the public sphere broadly considered.

One of the most striking examples in recent memory of youth social media and civic engagement in South Africa is the 2015 student-led campaign known as Rhodes Must Fall, commonly referred to as #RMF. The RMF campaign took place at the University of Cape Town and consisted of student-led protests. The students campaigned to remove the statue of British colonialist Cecil John Rhodes, arguing that it represented institutionalized racism and promoted a culture of exclusion, particularly for black students. In this case, South African youths explored activism and counter-memory via the social networking site Twitter. In a qualitative content analysis of tweets and a network analysis, Tanja Bosch argued that despite the digital divide


in South Africa and limited access to the Internet by most citizens, Twitter was central to youth participation during the RMF campaign, reflecting the politics and practices of counter-memory but also setting mainstream news agendas and shaping the public debate (Bosch, 2017). Research has shown the declining involvement of young people in political processes worldwide, with a general global rise of political apathy among youth and decreased youth consumption of news generated by traditional mainstream mass media. However, Facebook and other new media applications widely used by young people have been seen as potential vehicles to re-engage youth in political debate in South Africa (Bosch, 2013).

One adverse indicator of increased civic engagement on social media is the rising rate of government interference, or what some analysts call digital authoritarianism (on this, see also Admire Mare, "State-Ordered Internet Shutdowns and Digital Authoritarianism in Zimbabwe," International Journal of Communication, 14 (2020), 4244–4263). Digital authoritarianism has been in existence for decades. Nonetheless, its use by authoritarian regimes to repress and manipulate domestic social media users as a tool of state control over their rights is on the rise in Africa. The increased civic engagement of young people poses an existential threat to authoritarian and closed regimes. Over time, the number of incidents involving state interference with Internet infrastructure has risen dramatically. The Center for Technology Innovation at the Brookings Institution tracked four categories of states (fragile states, democratic states, emerging democracies, and authoritarian states) between 1995 and 2011 and found that until about 2002, most states interfering with their domestic information infrastructure were democracies. After 2002, however, authoritarian governments began using such interference as a governing tool. Recently, even fragile states have interfered with domestic information infrastructure, usually as a last effort to maintain social control. It is estimated that Africans and, more specifically, African youths have suffered the second most significant number of Internet disruptions globally, surpassed only by Asia (Okunoye, 2020).

A critical question to consider at this juncture is why digital bullying by governments is on the rise and how that translates into a productive civic engagement environment. One primary reason is that freedom on social media means freedom to check government excesses using


alternative information narratives outside state-controlled mass media and related sources of top-down information flow. Let us consider some preliminary data from one of Africa's largest social media consumer markets: Nigeria. Data collected from Twitter regarding Twitter users in Nigeria reveal that political activists are the most influential users compared to others like government officials or institutional actors, mass media, regular citizens, and organizations. Political activists have the most dominant voice on Twitter in Nigeria, while institutional actors such as the Head of State (the president of the Nigerian Federal Republic) and non-profit, non-partisan, and advocacy organizations have some of the smallest. Over 49 percent of the top users are political activists like Nnamdi Kanu, who has the most retweeted account within the collection. While citizens and activists continue to use the platform for civic engagement and information sharing, the Nigerian government has effectively shut itself out from the conversation.

Social media and its accessibility through phones have continuously increased and enhanced participative governance among young people vis-à-vis their governments. In a study published in 2019, Adrees et al. define e-Government as "the use of information and communication technologies (ICT) in order to improve efficiency and effectiveness, transparency, and comparability of financial and information exchanges within the government, between the government and its subordinate organizations, between government and citizen, between government and the private sector" (Adrees et al., 2019). Through social media platforms, individual citizens acquire some degree of influence and give their opinions, and these exchanges can lead to consensus on different matters. As a result, governments in Africa are increasingly taking advantage of these social media platforms to communicate in real time with their constituents. For example, it was through President Muhammadu Buhari's official Twitter page that Nigerians learned about their government's policy of suspending Twitter. Governments have, over time, embraced the integration of ICT, including Internet-powered social media platforms, applications, and services, in government functions and operations. Social media platforms increasingly serve as a marketplace for government services that traditionally were the sole domain of government-controlled media outlets. Social media serve as a tool for governments to facilitate communication with young people, gain their attention, and introduce and deliver e-Government services. This has revolutionized service delivery by partly promoting government


efficiency. Several governments are rapidly introducing digitalization, e-Government, and digital identity programs requiring citizens to provide detailed personal information, including biometrics for voters' cards and identity cards. African youth innovators believe that e-Governance and service delivery are experiencing a seismic shift. Back in 2010, Erik Hersman, an African social media blogger and one of the entrepreneurs behind the successful platform Ushahidi, predicted that "with mobile phone penetration already high across the continent, and as we get to critical mass with Internet usage in some of Africa's leading countries (Kenya, South Africa, Ghana, Nigeria, Egypt) … a seismic shift will happen with services, products, and information" (Essoungou, 2010). Simplice Asongu and Nicholas Odhiambo assessed linkages between social media and governance dynamics in 49 African countries for the year 2012. The study used four governance dynamics: political governance, economic governance, institutional governance, and general governance. The findings show that Facebook penetration is positively associated with governance dynamics. These positive nexuses differ in significance and magnitude throughout the conditional distribution of governance dynamics (Asongu & Odhiambo, 2018).

In this section we have seen that two key trends stand out in Africa's use of social media. First, there is sustained civic action to counter government-induced Internet control measures. Second, social media and Internet freedom in Africa have declined since 1999. Several states continually adopt extreme and punitive measures such as suspensions and data taxation that curtail Internet freedoms, turning social media and Internet shutdowns into political control and stability tools. Nigerian President Buhari's indefinite suspension of Twitter in June 2021 was the latest in a series of blows to restrict social media freedom (CIPESA, 2019). On the one hand, young activists and civil society leaders in Africa increasingly celebrate the creative use of digital media. They advocate for open access to the Internet as a key element in information sharing, innovation, development, and even foreign policy. Since its inception, the Collaboration on International ICT Policy in East and Southern Africa (CIPESA) has positioned itself as a leading center for research and analysis of information to enable policymakers in the region to understand the positive and negative sides of social media and related ICT issues. For example, it is one of two centers established under the Catalyzing Access to Information and Communications Technologies in Africa (CATIA). This initiative focuses on decision-making that facilitates the use of


Information and Communication Technologies (ICT) to support development and poverty reduction. CIPESA facilitates access to the Africa Digital Rights Fund (ADRF), offering flexible and rapid response grants to select initiatives in Africa to implement activities that advance digital rights, including advocacy, litigation, research, policy analysis, digital literacy, and digital security skills building.

The Counterproductive Dimensions of the Youth-Social Media Nexus

Social media usage, just like other forms of media consumption, requires an active process of inquiry, interpretation, and analysis on the part of the consumer. However, in the age of the Internet, with the rise of the 24-hour news cycle, and in the era of opinion-based news reporting and citizen journalism, the hitherto clearly defined lines between fact, fiction, and opinion have blurred, thereby begetting an epoch of misinformation, disinformation, and "alternative facts," the latter having been coined by Kellyanne Conway, former counselor to President Donald Trump. While misinformation, disinformation, and deceitful propaganda campaigns long predate the widespread use of social media, the reach or extent to which misinformation and disinformation campaigns can spread and influence individual decision-making is, arguably, unprecedented.

Based on 2021 data, social media usage across Africa ranges from 45 percent on the highest end to 8 percent on the lowest end (Galal, 2022). Specifically, and as explained in section "The Productive Dimensions of the Youth-Social Media Nexus," this accounts for 45 percent of users in Northern Africa, 41 percent in Southern Africa, and 16 percent in Western Africa, the top three regions with the highest population of social media users. On the lower end, Central Africa and Eastern Africa account for 8 percent and 10 percent, respectively. Based on these statistics, approximately 200 million Africans are considered active on social media. This accounts for about 15 percent of the total population and, by comparison, just over 44 percent of Africa's youth population, as well as almost 45 percent of Africa's labor force (Rocca & Schultes, 2020; and Fox, 2022). Table 2.4 presents a snapshot of social media usage based on regional population across the continent. Just like with any prerogative, greater access to information comes with greater responsibilities. The concurrent challenge of the unbridled and


unlimited expanse of social media, coupled with the spreading of misinformation, or the unintentional proliferation of incorrect and misleading information, produces myriad negative consequences. Thus, we will begin the analysis of the counterproductive dimensions of the youth-social media nexus by examining this phenomenon.

Misinformation and Its Consequences

From politics to economics and from education to entertainment, a deluge of misinformation is widely circulated among young people, primarily through social media. Misinformation, when spread via social media, has the capacity to reach people directly and engage them personally across various audiences, a feat not easily achieved by classical mass media such as television and radio. Social media users can comment, like, dislike, share, and react instantaneously to headlines, images, and tantalizing news content, sparking "viral" and trending moments on various platforms, traceable through hashtags. Recent examples include the spread of rumors and incorrect information on COVID-19, not only in Africa, but around the world, leading the World Health Organization (WHO) to declare an "infodemic" on the issue. The WHO defines an infodemic as "too much information including false or misleading information in digital and physical environments during a disease outbreak" (WHO, 2020a). On the continental level, African leaders responded to the infodemic phenomenon by establishing a specialized alliance of 13 regional organizations across the continent. Thus, the Africa Infodemic Response Alliance (AIRA) was set up in December 2020 to strategically combat the spread of misinformation on COVID-19.

The African youth, although generally more digitally adept than older demographics, are often more exposed to misinformation techniques that perpetuate fallacies which can be harmful and even deadly. Until the first half of 2021, the death-to-survival ratio of Africans who contracted COVID-19 was low compared to the rest of the world. Overall, Africa reportedly accounted then for only 4 percent of the total number of deaths from the virus globally (Thomsen, 2021). However, soon after an infodemic challenge was identified, data from the Africa Centers for Disease Control and Prevention revealed an increase of nearly 0.5 percentage points within a year (Mwai, 2021). The spike in deaths may be attributed in part to misinformation about the virus, and in part to disinformation which, as explained ahead, is the deliberate concoction and


circulation of false information with the intent to mislead and, potentially, harm people. In the case of misinformation, the deadly consequences are reflective of what theorists have termed situational extrospection, whereby victims unwittingly misled by misinformation lose the agency to take the appropriate and adequate action they would otherwise have been apt to take had they had accurate information at hand. Commenting on the deadly consequences of viral misinformation, Dr. Matshidiso Moeti, the WHO Regional Director for Africa, stated that "in health emergencies, misinformation can kill and ensure diseases continue to spread. People need proven, science-based facts to make informed decisions about their health and wellbeing, and a glut of information—an infodemic—with misinformation in the mix makes it hard to know what is right and real" (WHO, 2020b). Inaccuracies regarding the origination of the virus and the efficacy of existing treatments competed for reach and influence on social media, particularly among the youth. As information on the nature of the virus and necessary anti-COVID precautions began to circulate from authoritative medical and scientific sources, misinformation began to spread as well, including from leading public figures such as journalists, political pundits, and government officials.

A telling example of the latter is the actions of Tanzania's former President John Magufuli who, in 2020, vehemently dismissed options offered by national health authorities for the control of the spread of the virus in Tanzania. Online and offline, President Magufuli refuted the guidelines of health experts, going as far as to relieve the country's deputy health minister of his duties for insisting on following scientific facts. The former president called vaccinations "dangerous" and claimed that his own son survived the virus after he "quarantined and used steam inhalation, took lemons and ginger" (on the case of Magufuli's son's battle with COVID, see "Tanzanian president's son recovers from covid-19," Vanguard News, May 23, 2020, retrieved May 20, 2022, from https://www.vanguardngr.com/2020/05/tanzanian-presidents-son-recovers-from-covid-19/). Official statistics on COVID-19 deaths in Tanzania have been elusive, and that limits our ability to compare the death tolls before and after President Magufuli's blatant denial of the existence and impact of the virus and to ascertain its deadly consequences. Nonetheless, WHO data estimated the death toll from the virus in Tanzania at 840 as of mid-2022, which gives us an idea of the severity of the pandemic in that country (WHO, 2022). Moreover, the sudden death of Magufuli in March 2021 at the height of his COVID denial campaign


brought to bear the dangerous effects of that campaign. Indeed, although his government cited heart complications as the official cause of the president's death, concurrent reports indicated that he had caught the virus (on this, see also Dickens Olewe, "John Magufuli: The cautionary tale of the president who denied coronavirus," BBC News, 18 March 2021, retrieved May 20, 2022, from https://www.bbc.com/news/world-africa-56412912). With over a million followers on Twitter, the Tanzanian president was able to spread his dangerous rhetoric across demographic lines, and that put younger Tanzanians at greater risk, given the fact that they are more actively engaged on social media.

Disinformation and Its Consequences

A swathe of deep fake content in the form of photos, videos, and other multimedia content doctored to strategically spread false and often harmful information has besieged the Internet and social media across Africa and around the world, making young Africans particularly vulnerable to the twin effects of misinformation and disinformation. Unlike misinformation, which is non-intentional false information, the whole purpose behind disinformation is to propagate falsehoods to strategically mislead, indoctrinate, and debase a target audience. Disinformation entrepreneurs are often skilled communicators who have acquired the dubious expertise of conceiving, formulating, packaging, and transmitting falsehood in ways that make it credible even to some experienced consumers. Coming from such disinformation gurus, propaganda, smear campaigns, distractive news, and intentional plots to condition perceptions and experiences can be attractive to organizations, individuals, and groups working to achieve a particular goal. The proliferation of disinformation limits trust, spurs chaos, and stokes conflicts. When disinformation comes from otherwise trustworthy entities, such as foreign governments in quest of nefarious influence on a rival government, corporations that have established large regional or global markets, celebrities and comparable influencers, and the like, it causes millions of regular citizens to distrust the institutions and leaderships of their own governments and, in extreme cases, the cultural foundation of their societies. In recent decades, shady characters based in countries like China, Israel, Russia, and Saudi Arabia (ACSS, 2021), but also in Western nations less pinpointed for the fakery of some propaganda activities, have been


inundating the Internet and social media platforms on which the African youth have a substantial presence with all sorts of fake news. Africa-focused disinformation does not exclusively originate from abroad, however; for, as Tessa Knight, a South Africa-based researcher with the Atlantic Council's Digital Forensic Research Lab (DFRLab), explained in an interview with the Africa Center for Strategic Studies, "A lot of people are not aware of the scale of disinformation that is happening in Africa and how much it is distorting information networks" (ACSS, 2021). In this interview, Knight delves into a fake civil society digital campaign in support of incumbent Yoweri Museveni ahead of Uganda's January 2021 presidential elections; many of the Twitter accounts generating the campaign and attributed to fake citizens were traced to President Museveni's son, Muhoozi Kainerugaba, and to the Government Citizen Interaction Center (GCIC) at the Ugandan Ministry of Information and Communications Technology and National Guidance. Knight also exposed such disinformation campaigns in the Democratic Republic of Congo, mounted by University of Kinshasa students to support a politician called Honoré Mvula under false pretenses, as well as in Eritrea, connected to the war in Tigray, and in Sudan (ACSS, 2021). Evidently, such deliberate activities aimed at pushing a negative narrative, ideology, opinion, or strategy interfere with the decision-making of people, organizations, and institutions exposed to the propaganda, to the point that the activities can negatively affect elections and related legitimate political and governance programs, health and wellbeing, and the integrity of the values and beliefs of victims of disinformation.

A clear illustration of the pervasive impact of political disinformation on electoral and governance processes in Africa is found in Nigeria, Africa's most populous country with a substantial number of young Internet and social media users. According to a report from the Atlantic Council's Digital Forensic Research Lab, cited by Isabel Debre of the Associated Press in an article also published by ABC News, the Lab found sample posts removed from Facebook that appeared to praise incumbent President Muhammadu Buhari and smear his leading opponent, Atiku Abubakar, ahead of Nigeria's February 2019 presidential elections. The report stated:

Many of the pages and accounts were discovered to be linked to a Tel Aviv-based political consulting and lobbying firm named Archimedes. On its sparse website of African stock images, the company advertises its deliberate


efforts to conduct disinformation campaigns, boasting that it takes “every advantage available in order to change reality according to our client’s wishes” through “unlimited online accounts operation”. (Debre, 2019)

By the time Facebook banned Archimedes for its "coordinated and deceptive behavior" and conducted a sweeping takedown of dozens of accounts and hundreds of pages primarily aimed at disrupting elections in African countries, as Debre reported in her article, over a million dollars had been spent, and millions of people had been exposed to disinformation from the organization. According to Christopher Ajulo, following this and similar incidents, the Abuja-based NGO International Centre for Investigative Reporting launched initiatives like CrossCheck Nigeria and its FactCheckHub to counter the effects of disinformation (Ajulo, 2019).

As challenging as misinformation and disinformation are in the digital age, however, the risks to which young social media users are exposed go beyond mediating between fact and fiction in digital spaces. Young people worldwide, and African youth in particular, are also increasingly exposed to cyber-based attacks, abuse, bullying, and hacking, leaving notable negative scars on their physical, mental, and emotional health and well-being. The remainder of this section is devoted to an examination of these latter issues.

Cyber Bullying, Abuse, and Harassment

Beyond cyberspace, abuse, bullying, and harassment have lasting negative effects, but it bears wondering, as well, what happens when online harassment and violence have even more lasting offline effects. The dangers of negativity from these cyber-initiated phenomena can be so pronounced in part because the abusers shelter behind technology where their true identities remain unknown, even to their victims. The volume and consistency of the attacks can also be overwhelming for individual victims struggling to overcome the deluge of negativity. According to a report from Ipsos, a market research firm, South Africa has the highest prevalence of cyberbullying globally, with 54 percent of South African parents admitting to knowing of a child in their community being bullied online (Newall, 2018). In April 2021, a video of a 15-year-old girl being harassed online and physically abused and bullied was believed to have driven her to suicide on the same day of the attack (Mas, 2021). The video spread across Twitter and on other social media platforms, thus


attesting to both cyber and physical bullying. It showed classmates filming the attack on their phones, and one can only guess how many such clips ended up on the Internet and social media platforms. According to news reports, the girl was physically assaulted, jeered, and attacked after she blocked her cyber bullies on Facebook and WhatsApp. As cases like this multiply exponentially in South Africa and across the continent, more research is needed in earnest to inform and guide policy makers toward responsible actions against cyber bullying and its corollaries. For one thing, it bears scrutinizing the situation in South Africa with a view to finding out the causal and correlational factors behind the intensification of cyber violence in all its forms in that country and determining the best ways to address the problem.

Conclusion and Recommendations

Our purpose in this chapter has been to document, analyze, and contextualize the interfacing of the African youth and social media in conjunction with the intensive creation and extensive spreading of information, misinformation, and disinformation in Africa and worldwide. The chapter has demonstrated, based on statistical data and qualitative information from reliable sources, that the African youth are increasingly becoming major stakeholders in Internet and social media information and communication, by virtue of the unmatched growth of the youth population across this vast continent and the increasing availability of, and access to, cell phones, Internet connections, and social media literacy. The authors of the chapter have endeavored to pinpoint several critical aspects of the complex youth-social media nexus in Africa. Although scholars, including the authors of the present chapter, have meticulously reflected on them, these aspects call for further study that follows the dynamism of the nexus under consideration.

One of the critical aspects is that the coincidental meeting of Africa's youth bulge and the proverbial "fast-and-furious" rise of the social media phenomenon presents a unique opportunity for this energetic and engaged demographic to master, appropriate, and transform social media into a potent instrument for the self-empowerment of young Africans through the discovery, creation, and utilitarian curation and implementation of knowledge. The second aspect is that just as they have the critical mass necessary to become a powerful player in the social media arena through constructive information, the African youth are exposed to misinformation and


disinformation and, therefore, vulnerable to all sorts of nefarious manipulations that can lead to counterproductive mindsets and self-destructive behaviors. The third critical aspect is that Africans in general, and young Africans in particular, can and must take full advantage of the empowering Internet and social media resources available to them, endeavor in earnest to free Africa from the age-old envelopment policies of the West and its corollary of abolitionist globalization, and promote an Africa-centered development paradigm that will bring about optimal strategic independence in the twenty-first century and beyond. The fourth aspect is the rise in state curtailing of freedom of expression on the Internet and social media through censorship in the name of curbing mis/disinformation and promoting national unity. Arguably, the most difficult part of dealing with this latter aspect of the youth-social media nexus is determining where the legitimate countering of mis/disinformation and promotion of national unity ends and where political and ideological state censorship begins.

Although less pointedly underscored in the chapter, the fifth aspect of the youth-social media nexus is as critical as the others and concerns the imperative need for Africans to own the story of Africa. We argue that the malevolent concoction of debasing narratives toward and about Africans principally, but also about people of African descent, is no longer limited to the precolonial (fifteenth to nineteenth centuries), colonial (nineteenth and twentieth centuries), and immediate postcolonial (second half of the twentieth century) racialism aimed at legitimizing the exploitation and "exploitability" of Black people. In the twenty-first century, the narratives are designed to intellectualize and mediatize an ideology of Afrophobia which, in addition to the classical racialism, also and quite significantly encompasses a fear of witnessing Africans rise and attain true and self-sustaining liberation, self-empowerment, and optimal strategic independence in the era of abolitionist globalization. With Africa's youth growing demographically, becoming more meaningfully connected with the rest of the world, and learning how to counter racialistic narratives while the populations of many other parts of the world are aging, old and newer world hegemons find themselves increasingly faced with the prospect of lacking the critical mass required to sustain their current momentum. Some world leaders have begun to refer to Africa's youth as the prospective world labor force that will drive much of global economic development in the second half of the twenty-first century. As implicitly demeaning to the African world as such a worldview can be (scrutinized from a particular angle, the view recalls the status of


enslaved and/or colonized Africans whose sweat and blood built much of the West from the fifteenth to the twentieth century), it denotes a deep-rooted fear, the same fear that underlies twenty-first-century global Afrophobia responsible for most of the false narratives that Africans must demystify. When Africans tell the story of Africa with objectivity and constructive criticism, the meta-colonial myths and outright lies will be deconstructed and exposed for what they are. The African youth have a unique opportunity to contribute substantially to this endeavor through the Internet and social media and, in the process, shield themselves and future generations from the corrosive effects of the meta-colonial and Afrophobic trend of disinformation that has caused generations of Africans to lose some of their self-valuation.

To this effect, we recommend that African governments and intergovernmental organizations (IGOs), such as the African Union and the African Regional Economic Communities, support net neutrality by enforcing and promoting policies for equal access and non-discrimination within the public and private sectors at the supranational and national levels of governance. African IGOs should hold governments and their allies accountable regarding the importation and use of censorship software by requiring the design and implementation of fair Internet and social media user policies that will effectively protect freedom of expression while helping to curb misinformation, disinformation, digital bullying, and comparable misuse and abuse of digital resources. Moreover, African governments and IGOs should protect African civil society organizations (CSOs) from national government censorship and from restrictions on the flow of information imposed by foreign governments and IT corporations. We further recommend that African governments and IGOs promote public-private partnerships across the continent that can facilitate investment in Internet and social media infrastructure and in digital literacy. Digital competency should be promoted as a public good that includes the building of a continental anti-misinformation and anti-disinformation framework that interconnects African countries with access to shared information on verified hacking networks and cybercrimes. Africa's digital competency framework should also encompass social media accountability mechanisms to verify and validate accurate information for public use, while rooting out messages that convey misinformation and disinformation as well as other kinds of harmful activities. In the final analysis, it behooves the African youth, with the support and enlightened guidance of African leaders (political, social, academic,


and business leaders alike), to grab the proverbial bull by the horns by appropriating the current revolution in information and communication technology for the furtherance of their creativity geared toward solving Africa’s problems. They must move from an aspirational perspective that makes them receptacles of foreign innovations to an actionable mission whereby they are the captains of their African ship. The foregoing recommendations seem to be consistent with UNESCO’s view that in the age of knowledge economy, “Knowledge societies must be built on four pillars: freedom of expression, universal access to information and knowledge, respect for cultural and linguistic diversity and quality education for all” (UNESCO, 2002). However, to paraphrase a warning that Bruno Amable and Philippe Askenazy voiced in their introduction to the thus-cited UNESCO publication, it would be unrealistic to assume that the dissemination of ICTs can help Global South countries catch up with their Global North counterparts. The dissemination can at best complement organizational, cultural, and behavioral changes, and the accompanying improvement in individuals’ skills which, in the present case, must take place in Africa (Amable & Askenazy, 2005).

References ACSS. (2021). Domestic Disinformation on the Rise in Africa. Africa Center for Strategic Studies, October 6. Retrieved June 8, 2022, from http://africacenter.org/spotlight/domestic-­disinformation-­on-­the-­rise-­in-­africa/ Adrees, M.  S., et  al. (2019). A Framework of Promoting Government Services Using Social Media: Sudan E-Government Case Study. Journal of Physics: Conference Series, 1167 012062, Open Access. African Union. (2020). Africa Youth Month 2020: Building a Better Africa. Retrieved May 22, 2022, from https://1millionby2021.au.int/news/ africa-­youth-­month-­2020-­building-­better-­africa Ajulo, C. (2019). Africa’s Misinformation Struggles. The Republic, October 2. Retrieved May 23, 2022, from https://republic.com.ng/vol3-­no3/africa-­misi nformation-­struggles/ Amable B., & Askenazy, P. (2005). Introduction à l’économie de la connaissance. Contribution pour le rapport UNESCO Construire des sociétés du savoir. Retrieved May 14, 2022, from https://www.unesco.org/fr Apir, L. (n.d.). The #EndSARS Protest Crackdown: The Government’s Response and Matters Arising. Africa Center for Strategy and Policy. Retrieved June 2, 2022, from https://acstrap.org/the-­endsars-­protest-­crackdown-­the-­governments-­ response-­and-­matters-­arising/


ARPT-RG (Autorité de Régulation des Postes et Télécommunications de la République de Guinée). (2022). Lettre Administrative No. 807/ARPT/DG/ DTE/2022 du Directeur, Conakry, 09 Juin 2022. Asongu, S., & Odhiambo, N. (2018). Governance and Social Media in African countries: An Empirical Investigation. African Governance and Development Institute (AGDI) Working Paper, WP/18/039. Avis, W.  R. (2017). Digital Tools and Improving Women’s Safety and Access to Support Services. In Applied Knowledge Services, GSDRC Helpdesk Report 1415, University of Birmingham Press, Birmingham, UK. Bissio, R. (n.d.). Knowledge for Development: Public Good or Good for the Bank? The Globalisation of Development Knowledge, 19–22. Retrieved June 8, 2022, from http://www.norrag.org/en/publications/norrag-­news/online-­ version/the-­globalisation-­of-­development-­knowledge/detail/knowledge-­for-­ development-­public-­good-­or-­good-­for-­the-­bank.html Bosch, T. (2013). Youth, Facebook, and Politics in South Africa. Journal of African Media Studies, 5(2), 119–130. Bosch, T. (2017). Twitter Activism and Youth in South Africa: The Case of #RhodesMustFall. Information, Communication & Society, 20(2), 221–232. Camara, M. S. (2008). Media, Civil Society and Political Culture in West Africa. Ecquid Novi: African Journalism Studies, 29(2), 210–229. Camara, M. S. (2014). Is There a Distinctively African Way of Knowing (A Study of African Blacksmiths, Hunters, Healers, Griots, Elders, and Artists) Knowledge and Theory of Knowledge in the African Experience. Edwin Mellen Press. Carlo, A.-L. (2022). A la biennale de Saint-Etienne, un design africain de situation. Le Monde, April 17. CIPESA. (2019). State of Internet Freedom in Africa 2019: Mapping Trends in Government Controls, 1999–2019. Collaboration on International ICT Policy for East and Southern Africa, September 2019. Debre, I. (2019). Israeli Disinformation Campaign Targeted Nigerian Election. ABC News, May 17. Retrieved June 8, 2022, from https:// abcnews.go.com/Inter national/wireStor y/israeli-­d isinfor mationcampaign-­targeted-­nigerian-­election-­63104347 Essoungou, A.-M. (2010). A social media boom begins in Africa. Africa Renewal, December. Retrieved on May 20, 2022, from https://www.un.org/africarenewal/magazine/december-­2010/social-­media-­boom-­begins-­africa Fox, L. (2022). It’s Easy to Exaggerate the Scope of the Jobs Problem in Africa. The Real Story is Nuanced. Brookings Institution, March. Retrieved May 20, 2022, from https://www.brookings.edu/blog/africa-­in-­focus/2021/04/19/its-­ easy-­to-­exaggerate-­the-­scope-­of-­the-­jobs-­problem-­in-­africa-­the-­real-­story-­is-­ nuanced/#:~:text=For%20a%20labor%20force%20currentlysame%20as%20 in%20North%20America


Galal, S. (2022). Social Media Penetration in Africa by Region 2022. Statista, June 16. Retrieved June 13, 2022, from https://www.statista.com/statistics/1190628/social-­media-­penetration-­in-­africa-­by-­region/#:~:text=As%20 of%20February%202022%2C%20thefigure%20stood%20at%2045%20percent Irzik, G. (2013). Introduction: Commercialization of Academic Science and a New Agenda for Science Education. Science & Education, 22(10). https://doi. org/10.1007/s11191-­013-­9583-­8 Mare, A. (2015). Facebook, Youth and Political Action: A Comparative Study of Zimbabwe and South Africa. PhD Dissertation, School of Journalism and Media Studies, Rhodes University, September 2015. Mas, L. (2021). Videos of South African Girl Driven to Suicide by Bullies Shock the Nation. France 24, April 19. Retrieved May 27, 2022, from https:// observers.france24.com/en/africa/ Michel, N. (2022). Santé, agriculture, architecture… Quand l’Afrique des solutions s’expose. Jeune Afrique, June 11. Retrieved June 12, 2022, from https://www.jeuneafrique.com/1342896/culture/sante-­a griculture-­ architecture-­quand-­lafrique-­des-­solutions-­sexpose/ Mwai, P. (2021). Coronavirus in Africa: Concern Grows Over Third Wave of Infections. BBC News, 16 July. Retrieved June 6, 2022, from https://www. bbc.com/news/world-­africa-­53181555 Newall, M. (2018). Global Views on Cyberbullying. Ipsos, June 27. Retrieved May 23, 2022, from https://www.ipsos.com/en/global-­views-­cyberbullying Okunoye, B. (2020). Censored Continent: Understanding the Use of Tools During Internet Censorship in Africa: Cameroon, Nigeria, Uganda and Zimbabwe as Case Studies. A Research Report for the Open Technology Fund (OTF) Information Controls Fellowship, July 31. Oluwole, V. (2022). Nigerians are the Most Addicted Social Media Users in Africa—See How Other Countries Rank. Nigeria Abroad Magazine, 11 February. Rocca, C., & Schultes, I. (2020). International Youth Day Research Brief. Mo Ibrahim Foundation. Retrieved June 12, 2022, from https://mo.ibrahim. foundation/sites/default/files/2020-­0 8/international-­y outh-­d ay-­ research-­brief.pdf Sankatsing, G. (2003). The Caribbean Between Envelopment and Development. Magistral Lecture Presented at the International Seminar “The Caribbean Pluricultural Mosaic”, Organized on the Occasion of the 2003 Awards for Caribbean Thought by the University of Quintana Roo, the Government of Quintana Roo and the UNESCO Regional Office in Mexico. Thursday 15 May 2003. Republished Online by Biblioteca Nacional Aruba with the Author’s Permission. Silver, L., & Johnson, C. (2018). Internet Connectivity Seen as Having Positive Impact on Life in Sub-Saharan Africa. Pew Research Center,


October 9. Retrieved May 22, 2022, from https://www.pewresearch.org/ global/2018/10/09/internet-­connectivity-­seen-­as-­having-­positive-­impact-­ on-­life-­in-­sub-­saharan-­africa/ Statcounter GlobalStats. (2022). Social Media Stats Africa—Nov 2021–Nov 2022. Retrieved May 23, 2022, from https://gs.statcounter.com/social-­media-­ stats/all/africa Thomsen, I. (2021). Africa Has Suffered Fewer COVID-19 Deaths than Predicted: Richard Wamai Knows Why. News@ Northeastern, June 11. Retrieved June 13, 2022, from https://news.northeastern.edu/2021/06/11/ why-­has-­africa-­suffered-­fewer-­covid-­19-­deaths-­than-­predicted/ UNESCO. (2002). Construire des sociétés du savoir. Organisation des Nations Unies pour l’Education, la Science et la Culture, Conseil Exécutif. Cent soixante-­quatrième session, 164EX/INF.6 Paris, le 23 avril 2002. United Nations Programme on Youth (UNPY). (2011). Regional Overview: Youth in Africa. International Year of Youth August 2020–2011. Retrieved May 22, 2022, from https://social.un.org/youthyear/docs/Regional%20 Overview%20Youth%20in%20Africa.pdf Wave. (n.d.). About Wave. Retrieved June 8, 2022, from http://www.wave.com WHO. (2020a). Infodemic. Retrieved June 7, 2022, from https://www.who.int/ health-­topics/infodemic#tab=tab_1 WHO. (2020b). Landmark Alliance Launches in Africa to Fight COVID-19 Misinformation. WHO Africa, December 3. Retrieved June 6, 2022, from http://www.afro.who.int/news/landmark-­a lliance-­l aunches-­a frica-­f ight-­ covid-­19-­misinformation WHO. (2022). United Republic of Tanzania. WHO Health Emergency Dashboard. Retrieved June 13, 2022, from https://covid19.who.int/region/ afro/country/tz World Bank. (n.d.). The Four Pillars of Knowledge Economy. Retrieved June 6, 2022, from https://web.worldbank.org/archive/website01503/ WEB/0__CO-­10.HTM

CHAPTER 3

Artificial Intelligence for Social Evil: Exploring How AI and Beauty Filters Perpetuate Colorism—Lessons Learned from a Colorism Giant, Brazil

Juliana Maria Trammel

Introduction

Colorism is the differential treatment of people of the same race based on their skin color. In Brazil, colorism has contributed to a myth that miscegenation would foster a lighter complexion among the Brazilian population and, thus, progress (Hanchard, 1999). Likewise, in the United States, European beauty standards have associated lighter-skinned individuals with purity and wealth and darker skin tones with sin and poverty. In both countries, the prevalence of skin color variation has led to colorism issues that complement the discussions of racism and ethnic disparities. In both countries, the prevalence of social media selfies and photo filters has accentuated the implications of colorism.


While most regard deep fakes as artificially generated or modified faces, a deep fake can be as simple as using deep learning to modify or generate digital content (Fuentes, 2021). On a micro-scale, through Snapchat, Instagram, TikTok, and Facebook, people's everyday lives are seen more than ever before. These photographs are frequently altered, and color biases have been built into these systems. Snapchat reported that over 200 million individuals used its filter product daily. Some use filters to lighten their skin tone, while others use automatic boosting functions that change the feel and tone of their images. On a macro scale, on the other hand, deep fake videos have been created to simulate the likeness of high-profile politicians, charged with misinformation and triggering political instability (Maddocks, 2020). Nevertheless, despite the focus on macro-scale and high-profile deep fakes, artificial intelligence technology has allowed for the further replication of racism and ethnic marginalization on a micro-scale through video filters and photo alteration, which, contrary to deep fakes that are primarily targeted at high-profile individuals, impact everyday computer technology users.

This chapter examines the impact of artificial intelligence and digital algorithms on people of color. Specifically, this chapter explores how beauty filters perpetuate colorism, further victimizing people of darker complexion. First, the chapter provides an overview of artificial intelligence and deep fakes, followed by a discussion of colorism, race, and the implications for people of color.
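To make the micro-scale mechanism described above concrete, the following minimal sketch is a hypothetical illustration (not the actual algorithm of Snapchat, Instagram, or any other platform) of how even a naive "beautify" adjustment can lighten skin tone: scaling pixel brightness upward and muting saturation nudges darker complexions toward lighter ones and flattens undertones. The function and file names are illustrative only.

```python
# A minimal, hypothetical "beautify" filter sketch (illustrative only; not any
# platform's actual implementation). Brightness scaling lightens every pixel,
# including skin, which is one simple way colorist bias can be baked into a filter.
from PIL import Image, ImageEnhance

def naive_beauty_filter(path, brightness=1.3, saturation=0.85):
    """Return a brightened, slightly desaturated copy of the input photo."""
    img = Image.open(path).convert("RGB")
    img = ImageEnhance.Brightness(img).enhance(brightness)  # global lightening, skin included
    img = ImageEnhance.Color(img).enhance(saturation)       # muted color flattens undertones
    return img

# Hypothetical usage:
# naive_beauty_filter("selfie.jpg").save("selfie_filtered.jpg")
```

Real beauty filters are far more elaborate (face detection, smoothing, reshaping), but the point of the sketch is that a single global parameter choice is enough to shift skin tone systematically in one direction.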

Artificial Intelligence and Deep Fakes

The International Business Machines Corporation (IBM) defines artificial intelligence (AI) as the mechanism of leveraging "computers and machines to mimic the problem-solving and decision-making capabilities of the human mind" (IBM Cloud Education, 2020, Para. 1). AI technology has enabled the mass creation of "deep fakes," synthetic videos that closely resemble real videos (Vaccari & Chadwick, 2020). Photo fakery is not new, but artificial intelligence has completely changed the game. What used to be a unique practice of big-budget movie productions, video face-swapping is now available to anyone with a decent computer or phone and a few hours to spare (Knight, 2018). AI is more than just a better version of Photoshop or iMovie; it allows a computer to learn how the world looks and sounds "so it can conjure up convincing simulacra" (Knight, 2018, p. 38). Consequently, computer technologies have enabled and made


fakery production easier and readily available with the pervasiveness and, to some extent, cost-effectiveness of smartphones (Knight, 2018). In a fundamental definition, deep fakes are the fabrication or manipulation of content. "The term deep fake was first used in a Reddit post in 2017 to refer to videos manipulated using artificial intelligence techniques" (DaSilva et al., 2021, p. 301). In the post, the user known as "Deep Fake" used machine learning "to swap famous actresses' faces into scenes featuring adult movie stars, and then posted the results to a subreddit dedicated to leaked celebrity porn" (Knight, 2018, p. 38). The term then went viral along with the pornographic content, and "within a month, tens of thousands of people had followed suit and shared their own 'deep fake' porn on the site" (Maddocks, 2020, p. 415). "In January 2018, another Redditor created the FakeAPP to make this technology easily accessible for everyone" (Meskys et al., 2020, p. 25). Whereas these applications only make changes to facial movements (e.g., Face2Face) or change the visual texture and visual features of the face (e.g., deep fakes) (Meskys et al., 2020), the use of computer technology to manipulate facial features has furthered the layers of discussions regarding color, race, ethnicity, and gender as well.

While still evolving, Meskys et al. (2020) discussed four main categories of deep fakes and provided an in-depth model that looks at deep fakes in terms of advantages, significant concerns, unintended consequences, and the legal response. According to the scholars, there are four types of deep fakes: porn, political campaigns, commercials, and creative use of deep fakes. Deep fake pornography, also referred to as revenge porn, is considered one of the first forms of deep fakes, and it refers to sexually explicit material often created to humiliate, threaten, or harm, typically, a celebrity or a person who has broken off a relationship, leading to many negative ramifications. The scholars cite sexual privacy, autonomy, humiliation, and abuse as concerns. An example is when a face is transferred to a porn actor's naked body. Additionally, these can lead to several unintended consequences, including financial, mental, or emotional abuse of individuals. From a legal perspective, porn deep fakes can be challenged in criminal law, administrative action, and private law (torts) (Meskys et al., 2020). Wilkerson (2021) also argued that deep fakes, including pornographic images, could be covered under obscenity, child porn, or revenge porn law. Political deep fakes, also referred to as political campaigns, are reproductions that often target the reputation of politicians, political


candidates, elected officials, or public figures associated with the government. These portrayals can also present false fabrications of events that work even when they are not moral or ethical (Meskys et al., 2020). The scholars argued that this type of deep fake "can have profound negative consequences on the functioning of liberal democracies" (Meskys et al., 2020, p. 28). Similarly, Wilkerson (2021) claimed that "while deep fakes originated in the pornography industry, there are growing concerns about how they could interfere with politics and elections and threaten individual privacy" (p. 410). Examples of deep fakes used for political advantage have been observed globally, as in the cases of Gabon and Malaysia (DaSilva et al., 2021), and the phenomenon of deep fakes has become a concern for governments because of its potential to pose a threat to politics and beyond (DaSilva et al., 2021). Examples of political deep fakes include distorted speeches of politicians, news reports, and socially significant events. While some argue that this could promote some form of freedom of speech, it can cause damage to reputation, distortion of the democratic discourse, impact on election results, and hostile governments (Meskys et al., 2020). Similarly, Wilkerson (2021) argued that deep fakes could fall under several branches of the First Amendment with the claim that the deep fake video is a political parody. Meskys et al. (2020) claimed that unintended consequences could include corroded institutional trust, damage to international relations and social security, divisions, and polarization. Legal ramifications of political deep fakes can include constitutional law, criminal law, defamation, libel, slander, torts, and copyright law.

Another category of deep fakes is commercial deep fakes, often called deep fake ads, which is the use of deep fakes in sponsored content such as advertisements. Contrary to pornographic and political deep fakes, commercial deep fakes can be beneficial. Examples of commercial deep fakes include translating a video recording into multiple languages (Meskys et al., 2020). For example, David Beckham recorded an English video to raise awareness of malaria, and deep fake technology enabled him to speak in nine languages (Zero Malaria Britain, 2019). The scholars argue that some benefits include new business model creation and simplifying the human and production processes. They also argue that "commercial uses of deep fakes are socially beneficial as long as deep fake technologies are used for lawful purposes" (Meskys et al., 2020, p. 29). In fact, deep fakes in advertising have gained ground in recent years, with celebrities such as Bruce Willis, for example, licensing their images for commercials


(McQuarrie, 2021). Campbell et al. (2021) provided insights into the impact of ad manipulation on consumers and concluded that advanced manipulation could lead to "perceptions of verisimilitude and creativity [that] can affect awareness of ad falsity and an ad's persuasiveness" (p. 13). Finally, creative or original deep fakes, such as face-swapping technology, have opened opportunities for creative expression, generating satire, often regarded as a medium that facilitates political debates and free speech. Parody memes are an example. Such uses can contribute to creativity and free speech. On the other hand, they could also lead to invasion of privacy and bullying among children. Legal ramifications could include copyright law, contract law, constitutional law, and fair use (Meskys et al., 2020). In conclusion, scholars argued that as deep fake technology evolves and becomes more sophisticated, detection technologies should also advance, or it will be impossible to detect fake content. In addition to the four-item framework to define deep fakes, DaSilva et al. (2021) pointed to the danger of deep fakes leading to fraud and cyberbullying. DaSilva et al. (2021) supported what has been pointed out by other scholars (see Maddocks, 2020): that "although most of the deep fakes that are spread over the internet are pornographic in nature, public attention is focused above all on political deep fakes because of their ability to generate political instability" (DaSilva et al., 2021, p. 308).

Deep Fakes and Social Media

Social media has played an enormous role in the production and spread of deep fakes, and many individuals have first encountered deep fakes through social media sites and applications. Ahmed (2021) examined deep fake sharing behavior and found that social media news consumption and the "fear of missing out" were positively correlated with the intentional sharing of deep fakes. The scholar also found that "those with lower cognitive ability exhibited higher levels of fear of missing out and increased sharing behavior" (Ahmed, 2021, p. 1). "Fear of missing out was found to be a positive predictor of deep fakes sharing" (Ahmed, 2021, p. 13). On the other hand, "cognitive ability was negatively related to both fear of missing out and intentional deep fakes sharing. Those with low cognitive ability were more likely to exhibit higher levels of fear of missing out and increased deep fakes sharing" (Ahmed, 2021, p. 14). "Individuals with lower cognitive levels do not consider the fallacious nature of the content

The scholar also drew further attention to the association between education and disinformation sharing. A likely explanation is that "those with higher levels of education would be sharing disinformation to raise literacy or awareness regarding the fabrication" (Ahmed, 2021). The scholar also argued that "the use of social media for information gain can lead to the spread of disinformation" (Ahmed, 2021, p. 15). Deep fakes have shown no signs of slowing down. In 2019, an investigation by Deeptrace, a cybersecurity company, concluded that the number of fake videos had doubled and that most "were pornographic videos used as revenge to harm women" (DaSilva et al., 2021, p. 301). Knight (2018) argued, "These advances threaten to further blur the line between truth and fiction in politics. Fake news stories, aside from their possible influence on the last U.S. presidential election, have sparked ethnic violence in Myanmar and Sri Lanka" (Knight, 2018, p. 37). Similarly, Vaccari and Chadwick (2020) argue that political deep fakes have the potential to foster uncertainty, which can undermine trust in news on social media.

Efforts to Address the Issue

In 2020, Twitter was under fire for not quickly removing a slowed-down video of House Speaker Nancy Pelosi that portrayed her as intoxicated and incoherent (Kantrowitz, 2020). Moreover, in anticipation of the 2020 election, Twitter implemented a new policy to address synthetic media. Twitter began to remove altered videos and other media that "it believed to threaten people's safety, risks mass violence, or could cause people not to vote" (Kantrowitz, 2020). If content is suspected to be altered, Twitter may label it "as misleading, reduce its visibility by removing it from algorithmic recommendation engines, provide additional context, and show a warning before people retweet it" (Kantrowitz, 2020, Para. 8). Pfefferkorn (2021) argues that "the fight against fake videos will complicate the fight for black lives" (Pfefferkorn, 2021, Para. 16). "Currently, most of the content distribution platforms such as YouTube, Facebook or Twitch, have numerous legal obligations to control the content" (Meskys et al., 2020, p. 31).
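
Stripped of platform specifics, the policy just summarized amounts to a small decision procedure: remove altered media that threatens safety, and otherwise label it, downrank it, add context, and warn before sharing. The sketch below is a hypothetical, simplified rendering of those publicly described policy steps, not Twitter's actual code or API; the field names and rules are invented for illustration.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Assessment:
    is_significantly_altered: bool   # e.g., a slowed-down or synthetically altered video
    threatens_safety: bool           # risk of harm, mass violence, or voter suppression

@dataclass
class ModerationDecision:
    remove: bool = False
    actions: List[str] = field(default_factory=list)

def moderate(assessment: Assessment) -> ModerationDecision:
    # Toy rule-based rendering of the policy steps described in the text above.
    decision = ModerationDecision()
    if not assessment.is_significantly_altered:
        return decision                # authentic media is left alone
    if assessment.threatens_safety:
        decision.remove = True         # altered and harmful: take it down
        return decision
    # Altered but not judged harmful: label, downrank, add context, warn before sharing.
    decision.actions = ["label_misleading", "exclude_from_recommendations",
                        "attach_context", "warn_before_retweet"]
    return decision

if __name__ == "__main__":
    print(moderate(Assessment(is_significantly_altered=True, threatens_safety=False)))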

States have started to address deep fakes by targeting the dissemination of sexually explicit deep fakes and deep fake election tampering. Virginia and California, for example, amended their revenge porn statutes to penalize the dissemination of deep fakes when used for revenge porn purposes (Wilkerson, 2021). Exacerbating the issue, "With technological advances, deep fakes are increasingly easy to create and difficult to identify" (Kobriger et al., 2021, p. 204). While much of the deep fake discourse has centered on pornographic, political, commercial, and creative content and their consequences, including political instability, emotional impact, and legal implications, very little has focused on the threat to people of color and on how AI reproduces the same ethnic disparities observed outside computer and phone screens.

Deep Fake, Filters Beauty Standards, and Colorism

While these advancements sound promising at first look, AI and deep fakes have posed threats and further accentuated larger societal issues. A central idea that the present chapter explores is how deep fakes further escalate the problems of a society already plagued by socially constructed cultures of racism and colorism. This also adds to the complexity of how racism, or pre-existing racial bias, permeates the current state of digital technology and social media. While most conceptualizations of AI and deep fakes focus on content creators producing material for mass distribution, it is imperative to examine how deep fakes have served to support a more acceptable and "democratic" virtual and synthetic society from the perspective of creators engaged in interpersonal and reciprocal sharing, as on Instagram. This refers to internet users' dependency on filters to beautify themselves. A person's face plays an essential role in any visual content or communication, and quickly and widely accessible tools are used to enhance it (Kohli & Gupta, 2021). Tharps (2016) contended that skin color is critical because humans are a visual species who respond to one another based on physical and visual presentation. In this sense, "skin color will continue to serve as the most obvious criterion in determining how a person will be evaluated and judged" (Tharps, 2016, Para. 4). The COVID-19 pandemic also intensified the search for a better look. With teleworking, telechurch, telemedical visits, and so on, internet users spent more time looking at themselves through the lenses of Zoom, Webex, and other technology-mediated communication tools.

As communication transitioned to screens, users engaging in technology-mediated communication have become more self-conscious of their faces (Haines, 2021). "Both Instagram and Snapchat, platforms that hit record high levels of engagement during the pandemic, have beauty and augmented reality-related facial filters" (Haines, 2021, Para. 2). "Social media apps that offer augmented reality filters, such as Instagram and Snapchat, have seen usage increase during the pandemic" (Haines, 2021, Para. 3). "In this airbrushed online environment, everyone has access to their own virtual plastic surgeon" (Haines, 2021, Para. 8). For content creators on social media, whether for entertainment or for personal journaling, filters have become a standard tool for beautifying their pictures. "With just one click, Instagram, TikTok, and Snapchat users can shoot rainbows out of their mouths, sprout furry ears, and even turn back the clock to infanthood" (Wang, 2020, Para. 2). "Filters, which began as cute puppy dog ears and love heart eyes, have now evolved into face, eye and skin color-transforming toolkits for the every-day user" (Lee, 2020, Para. 4). The issue with these filters is that, in many cases, users cannot manipulate or customize the options available to them. On Instagram, for example, "users can choose from over 20 filters, but as subjects, users don't have a choice in how their images are processed once a filter is in place" (Jerkins, 2015, Para. 2). These filters sometimes reinforce social, ethnic, and racial stereotypes. Referencing filters that attempt to mimic Asian features, Lee (2020) maintained that Instagram filters perpetuate offensive stereotypes. "The distortions are designed to perpetuate a Eurocentric standard of beauty" (Wang, 2020, Para. 3). "Ignoring these nuances to only focus on white skin represents a modern version of flesh tone imperialism, a term first coined by cinematographer Thierry e Brun" (Wang, 2020, Para. 8). Weiner (2017) reported that users claimed the FaceApp augmented-reality app lightened their skin and changed their features when they used it. The app's "hot" feature beautifies faces by lightening the skin and altering its features (Weiner, 2017). A South Korean app, popular among K-pop celebrities, analyzes users' faces and suggests improvements involving sharper chins, thinner nose bridges, and whiter skin; its number of active users jumped from 40 million to 200 million between 2017 and 2018 (Wang, 2020). "The way that racism operates aesthetically is to neglect or, in extreme cases, erase whoever is not white" (Jerkins, 2015, Para. 5).

"Technologies are narrowing beauty standards" (Ryan-Mosley, 2021, Para. 15). "Photos reflecting small nose, big eyes, and fuller lips attract more comments and likes, leading recommendation algorithm to prioritize them" (Ryan-Mosley, 2021, Para. 15). "At a basic level, the imaging chips found in most personal cameras have pre-set ranges for skin tones, making it technically impossible to accurately capture the real variety of complexions" (Ryan-Mosley, 2021, Para. 12). "The harm can involve bleaching or other risky body treatments: the skin-lightening industry has grown rapidly and is now worth more than $8 billion worldwide each year" (Ryan-Mosley, 2021, Para. 17). "Women have been fighting the same battle in the magazine and fashion industries for years, but over the use of photo-editing programs like Photoshop. Over the years celebrities have complained about being photoshopped without permission, saying that removing blemishes and slimming down body parts creates unrealistic expectations of perfection" (Mulaudzi & Mnyanda, 2017, Para. 8). "For black celebrities, the retouching often takes the form of skin lightening" (Mulaudzi & Mnyanda, 2017, Para. 11). "People often think of technology as inherently unbiased, but photography has a history of racism" (Jerkins, 2015, Para. 4). Ryan-Mosley (2021) observed that "an ancient form of prejudice about skin color is flourishing in the modern internet" age (Ryan-Mosley, 2021, Para. 1). Wang (2020) goes on to argue that "this doesn't just create racial divisions in photography—it also separates, and consequently stratifies, members within the same ethnic community" (Wang, 2020, Para. 9). When capturing dark-skinned subjects, photographers often compensate with lighting; however, this becomes more complicated when filters are catered to whiter subjects (Wang, 2020).

Colorism and Societal Issues

Skin-tone bias is prevalent around the globe (Balalini, 2020). Rooted deeply in European beauty ideals, colorism bolsters the idea that lighter skin signifies purity and wealth, while darker tones (emphasis on the plural) signify sin and poverty. "Though related to racism, it is distinct in that it can affect people regardless of their race and have different effects on people of the same background" (Ryan-Mosley, 2021, Para. 7). strmic-pawl et al. (2021) even went on to explore the notion of transracialism, a phenomenon attributed to people who change their appearances through, for example, hair texture and tanning.

"The act of 'passing' as a different racial group has historical roots during Jim Crow when BIPOC, particularly Black people, passed as white to obtain benefits otherwise withheld from them" (Cowart, 2021, Para. 9). Furthermore, in addition to the modification of physical features, it has been noted that social media algorithms favor persons with lighter skin to the detriment of those with darker skin (Ryan-Mosley, 2021). Beyond concerns about the use of filters to maintain white imperialism, the intersection of content creation, deep fakes, and people of African descent poses further implications, including for policing, mental health, socio-economic standing, and health, to name a few. In the United States, for example, "recent police killings of Black Americans caught on camera have inspired massive protests that filled U.S. streets" in 2020 (Pfefferkorn, 2021, Para. 3). The proliferation of video recording, by bystanders and body cameras, has changed how narratives of police violence in the United States are constructed. When faced with a video recording of an event, such as an incident of police brutality, viewers can generally trust that the event happened as shown in the video (Pfefferkorn, 2021). However, the author contends that this could change soon, thanks to the emergence and prevalence of deep fake videos. In a world of pervasive, "compelling deep fakes, the burden of proof to verify the authenticity of videos may shift onto the videographers, a development that could further undermine attempts to seek justice for police violence" (Pfefferkorn, 2021, Para. 2). Studies have also found a correlation between skin color and outcomes in the criminal justice system. Viglione et al. (2011) found that women of darker complexion received more severe sentences than their lighter-skinned counterparts. Similarly, strmic-pawl et al. (2021) provided a synthesis of various studies since 2006 documenting that darker-skinned individuals are associated with longer prison sentences for the same crime, lower marital rates for women, decreased mental and physical health, lower wages (Balalini, 2020), and lower perceived intelligence (strmic-pawl et al., 2021). However, while academics and journalists have scrutinized racial discrimination at work, skin-tone bias or colorism has not received as much attention. It is also interesting to note that "unlike racial bias, which is usually perpetrated by individuals of one race against those of another, colorism is also frequently observed among members of the same ethnic or racial group" (Balalini, 2020, Para. 3).
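
The algorithmic favoritism described above can be understood as a feedback loop: content that already attracts engagement is ranked higher, which earns it still more engagement. The simulation below is a deliberately crude illustration of that loop under invented numbers; it does not model any real platform, and the small initial "engagement bias" is an assumption standing in for the audience preferences and pre-set defaults discussed in this chapter.

import random

def simulate_feed(rounds: int = 50, bias: float = 0.05, seed: int = 1):
    # Two otherwise identical posts; post A starts with a small engagement edge.
    # Each round the feed shows the current leader to more users, so the edge compounds.
    rng = random.Random(seed)
    likes = {"A": 0, "B": 0}
    for _ in range(rounds):
        leader = max(likes, key=likes.get)
        for post in ("A", "B"):
            reach = 100 if post == leader else 60           # ranking rewards the leader
            p_like = 0.10 + (bias if post == "A" else 0.0)  # invented preference bias
            likes[post] += sum(rng.random() < p_like for _ in range(reach))
    return likes["A"], likes["B"]

if __name__ == "__main__":
    print(simulate_feed())   # post A typically ends far ahead of post B

Even a modest starting advantage, compounded by engagement-based ranking, produces a large final gap, which is the structural point behind the observation that such systems can amplify colorism without anyone explicitly encoding it.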

AI, filters, and deep fakes have also enabled the duplication of the whitening processes already observed and fueled by the beauty and cosmetic industry. The segment of the global beauty industry that perpetuates colorism is projected to reach $8.9 billion by 2024 (Balalini, 2020). In fact, some go on to argue that "the multi-billion-dollar market for skin whitening products is an enduring sign of commodity racism" (Balalini, 2020, Para. 5). "According to the World Health Organization (WHO), 77% of women in Nigeria use skin lightening products, the world's highest percentage" (Brown, 2019, Para. 4). This pattern cannot be examined in isolation, as in many parts of the African continent light-skinned women are perceived to be more beautiful and are therefore more likely to succeed in industries such as modeling and film (Brown, 2019). In fact, "several African countries, including Rwanda and Ghana, recently have banned the use of skin bleaching products because they are dangerous" (Brown, 2019, Para. 1). With the growth of diversity in the United States, white imperialism could also inspire dangerous trends in pop culture (Wang, 2020). In addition to the impact on individuals of darker complexion, concerns have been raised about health. For instance, Haines (2021) argued, "For particularly insecure individuals, the gap between expectations and reality can lead to body dysmorphic disorder (BDD), a mental illness that affects one in 50 people in the United States" (Para. 17). "Skin whitening has proven to be damaging, physically and mentally" (Balalini, 2020, Para. 4). Instagram came under fire in India when one of its filters was accused of promoting blackface, mirroring the country's own issues with colorism. "This is not the first time that Instagram has been accused of allowing discriminatory filters on the app" (News 18, 2021, Para. 4). In Europe, Instagram has banned filters that depict or promote cosmetic surgery due to concerns that they are harmful to users' mental health; despite these claims, the BBC found dozens of face- and lip-altering filters on the site (Lee, 2020). Defined as a practice of discrimination holding that those with lighter complexions receive better treatment than those with darker skin, colorism is often overlooked in discussions about racism, and even more so in discussions about deep fakes (Cowart, 2021). However, some response has started to emerge. The head of YMCA's Be Real Campaign, Liam Preston, said that while he supports creative freedom, he believes social media platforms could do more to protect younger users. He also argued that filters that alter users' race could reinforce racial stereotypes and promote cultural appropriation (Lee, 2020).

Corporations like L'Oréal and Johnson & Johnson have announced they "will no longer sell products that mention 'skin whitening'" (Mire, 2020, Para. 1). However, "they will continue to promote creams and serums formulated for skin-whitening effects" (Balalini, 2020, Para. 2).

Colorism in Other Parts of the World

"Colorism, as an internalized idea of the presumed superiority of whiteness, can be witnessed in other parts of the world" (Gonzales, 2009, p. 41). "Discrimination based on skin color has deep historical roots in Latin America and the Caribbean" (Noe-Bustamante et al., 2021, Para. 16; Castro, 2022). Consistent with what was mentioned earlier in this chapter, individuals with darker skin have experienced more discrimination than those of lighter complexion (Noe-Bustamante et al., 2021). While Latinos typically do not identify with a race, they tend to use labels tied to facial features, skin color, hair type, and so on (Castro, 2022), and "diversity within Latino/as has received increasing attention in the literature over the past two decades" (Chavez-Dueñas et al., 2014, p. 4). Hence, in Latin America, skin color remains an important determinant of outcomes. In Latin America in general, similarly to what was observed in Brazil, a hierarchical or caste system was established connecting privilege and individual worth to skin color, race, and birthplace, exacerbated by the trade of enslaved Africans and miscegenation among indigenous, European, and African populations (Noe-Bustamante et al., 2021). Mestizaje in Spanish, miscigenação in Portuguese, and miscegenation in English had as a goal the assimilation of indigenous and African peoples into a culturally homogeneous society (Chavez-Dueñas et al., 2014). Looking at the Latino population, "connections between today's color-blind racial attitudes and 'mestizaje,' or the mixing of races, is underscored to demonstrate how these strategies have been used, historically and today, to deny and minimize skin-color privilege" (Chavez-Dueñas et al., 2014, p. 3). "The legacy of 'mestizaje' is also outlined as a strategy to minimize and deny the racial privilege of lighter-skinned and European featured Latino/as. Furthermore, mestizaje is compared with more recent color-blind racial attitudes in the United States" (Chavez-Dueñas et al., 2014, p. 4). Hernández (2015) observed that Brazil, Cuba, Colombia, Panama, Venezuela, and Nicaragua, countries with sizable communities of people of African descent, favored the mulatto, a light-skinned class, as a group distinct from the dark-skinned population.

Gonzales (2009) even went on to posit that colorism, "rather than race categories, allowed for finer distinctions in the Latin American context, where there are a large number of people of mixed races" (p. iii). It is important to note that while the scholar used the construct "finer distinctions" to present colorism as a better way to describe individuals with demographic differences, one cannot deny the negative consequences of a racial democracy mindset. In the United States, the Pew Research Center examined skin color's role in the life experiences of Latino adults and their attitudes toward race and discrimination. While "Latino" and "Hispanic" were used interchangeably, 62% of the adults surveyed said that having darker skin impedes their ability to get ahead in the United States (Noe-Bustamante et al., 2021). The researchers go on to say that "Hispanics in the U.S. may face discrimination because they are Hispanic (a form of racism), but the degree of discrimination may vary based on skin color, with those of darker shades experiencing more incidents (a form of colorism)" (Noe-Bustamante et al., 2021, Para. 14). Colorism can also affect how Latinos relate to one another (Hatzipanagos, 2022). "The other deeper issue is that of colorism within U.S. Latino and Latin American culture. As part of the region's colonial legacy, light-skinned or white-passing Latinos and Latin Americans have earned a social privilege often denied to dark-skinned Afro-Latinos or indigenous people. It is why Latin American media so often only featured blond hair-blue-eyed crooners, telenovela stars or news anchors" (Castillo, 2021, Para. 10). The concept of colorism is not exclusive to people of African descent, people of European descent, and the diaspora in the American Hemisphere. "In many parts of Africa and Asia, lighter-skinned women are considered more beautiful, are believed to be more successful and more likely to find marriage" (Fihlani, 2013, Para. 14). Referred to by many scholars as a "colonial legacy," colorism is also evident on the African continent. In Congo and Rwanda, for example, Europeans used differences in African ethnicity to apply Western social constructs of hierarchy (Kerrison, 2020). In South Africa, which has a long history of colorism, skin color is still used to foster discrimination (Nwamusi, 2022). A recent study by the University of Cape Town suggested that one in three women in South Africa bleaches her skin; while the reasons vary, wanting "white skin" was raised as one of the goals (Fihlani, 2013). "White colonists began employing colorism to create a divide between enslaved South Africans, giving those with lighter skin preferential domestic tasks while those with dark skin had more demanding work in the fields" (Para. 4).

"In apartheid, the government issued the Population Registration Act of 1950, assigning citizens into three groups (Black, White, and Colored) based on racial characteristics. Those categories then determined what social, political, and economic rights and opportunities [one] could access" (Nwamusi, 2022, Para. 4). Signifiers of race, caste, ethnicity, and colorism still mark South Asian cultures, as exemplified by the social inequalities and marginalization experienced by people of African descent there (Jayawardene, 2016). "In South Asia, caste categories form a complex system of social organization between caste, race, and colorism" (Jayawardene, 2016, p. 333). "The Hindu caste system, for instance, is based on this premise, whereby lighter skin complexions are viewed as superior to darker [tones]" (Gonzales, 2009, p. 41). In the United States, "school discipline procedures are also directly connected to hierarchies of race and color" (Hunter, 2016, p. 54), creating a school-to-prison pipeline (Hunter, 2016). "The color-based discrimination that children face in school discipline procedures is directly connected to the growth of the school-to-prison pipeline where African American and Latina/o youth are significantly more likely than others to be routed to prison directly from school" (Hunter, 2016, p. 58). In conclusion, colorism can be regarded as a global phenomenon, from the Western hemisphere to the East, placing darker complexions at the bottom of the hierarchy and manifesting in all sorts of inequalities and disparities.

Colorism in Brazil: Lessons Learned

The issues of colorism and deep fakes perpetuate a long legacy of whitening ideologies, and Brazil is a fitting example of this model of institutional racism. Social media has provided a parallel world where illusions and synthetic discourse can be created. In a similar way, Brazil for years sustained the belief that all races could co-exist harmoniously, a racial democracy myth fueled by colorism. Of all countries in the American Hemisphere, Brazil was the last nation to abolish slavery. Nearly 40% of all enslaved Africans brought to the "new world" landed in Brazil (Bouncier, 2017). "Not different from any other territory colonized by Europeans…, the post-colonial mentality in Brazil insisted upon the relationship between whiteness and superiority" (Trammel, 2018, p. 302). By 1700, enslaved Africans outnumbered people of European descent, and miscegenation became a means to gradually merge Blacks into a whiter stock (Hanchard, 1999).

This miscegenation led to colorism and to the myth that Brazil is a country where differences are accepted and valued. However, this myth only covers up a major discrimination problem, one that keeps people of African descent in precarious social and economic situations (Ferreira, 2002). Systematic racism in Brazil, which has had serious consequences for people of color, has been mistaken for a harmless and subtle racism that is often difficult to address; Rodrigues (1995) referred to it as "cordial racism." Campus (2015) argued that the affirmation of blackness competes with this idea of racial hegemony. This complex dynamic of colorism and skin tone is not exclusive to Brazil (Trammel, 2018, p. 303). Trammel (2018) explored how colorism impacts the negotiation of self-identity. The scholar proposed the Racial Democracy Effect Theory to explain "how a group that has been traditionally marginalized, particularly in countries with a covert system of racism, may dialogically sustain their own state of marginalization" (p. 31). Rooted in Ting-Toomey's (1999, 2005) and Orbe's (1998) approaches to the intersection of interpersonal communication and marginalization, Trammel's (2018) Racial Democracy Effect Theory expanded this work to explore how unique frameworks of marginalization surface in systems where colorism is pervasive. Ting-Toomey (1999, 2005) explored how groups that have been traditionally marginalized use techniques such as assimilation and accommodation to blend with dominant group members. Orbe (1998), in turn, explored the interactions between dominant and nondominant groups, which are often characterized as cautious, guarded, and uncomfortable. Trammel's (2018) Racial Democracy Effect Theory offers six premises that can help explain how racism is perpetuated in AI and filters. While Trammel's model was applied to a country with a covert system of racism and widespread colorism stemming from the racial democracy myth, it can also be applied to any communicative ecosystem where narratives are fueled by color-based bias. The first premise states, "In cultures with a covert system of racism, there is a collective denial and rejection of cultural pluralism and a pursuit of a unified and national culture" (Trammel, 2018, p. 317). A notion of hegemony and pluralism has characterized the post-modern world, and AI and deep fakes have further facilitated a synthetic world where discourse and visual narratives are created to allow everyone to be accepted and regarded as the norm. Filters, for example, are duplicating the persistent culture of systematic racism that is observed off-screen.

What is more interesting is that colorism plays a more significant role here than traditional racism. AI technology and filters allow users to find a resemblance that best represents their ethnic features; however, these filters continue to be influenced by white imperialism. This speaks to the first premise of Trammel's Racial Democracy Effect Theory. On the surface, one can argue that today's social media represents a sphere that promotes racial and ethnic democracy. Nevertheless, social media filters sustain a long-standing legacy of racism and ethnic discrimination. The second premise is that "racial discrimination is pervasive through humorous and playful remarks deemed to carry friendliness and care" (Trammel, 2018, p. 317). A good illustration of this second premise is how speech that would otherwise be perceived as immoral and unethical is carried on and normalized by hiding behind a parody, a meme, or another humorous form. This phenomenon extends beyond race; it permeates gender-related and political deep fakes as well. The central idea is that humor, as observed in other systems of covert racism and pervasive colorism, plays a huge role in carrying a narrative that remains racist, prejudiced, and biased but is regarded as acceptable, standard, or adequate. The third premise states that "the contribution of the dominant group towards the national culture is perceived as positive and desired" (Trammel, 2018, p. 317). In countries with a covert system of racism and pervasive colorism, the lighter tone becomes so desired that it establishes a standard. While the "rainbow of colors" at first glance signifies a racial democracy, a second look shows that even when a plurality of shades is presented, there is still a fascination with the whitening process. Social media filters, as mentioned before, continue to demonstrate and support a system of racism that privileges whiter complexions. What is more, these social media tools and applications often do not allow for much customization, and what is available resembles a universal standard. Much of this happens unconsciously, as readily available technology is applied almost instantly (or with minimal editing effort) by everyday internet users, as opposed to creators of complex deep fakes. The average user contributes to forming this "hegemonic culture" without much deliberation. It becomes a usual way of life; adopting the dominant culture's phenotype is easily accomplished when it generates praise, comments, and shares.

The fourth premise states, "Color and other visible phenotypes emerge as the utmost variable and these phenotypes determine power and marginalization levels, which could be measured on a continuous scale" (Trammel, 2018, p. 317). In the case of Brazil, Trammel (2018) showed how, in a system where colorism is pervasive, skin tone, hair texture, and lip thickness carry connotative meanings that privilege European standards. In the case of synthetic media, a filter that alters nose shape or any other attribute indirectly conforms to or mirrors those already in positions of power. The fifth premise states, "In cultures with covert systems of racism, color emerges as the differential among groups of different ethnic groups or races" (Trammel, 2018, p. 317). Filtered, mediated communication signals an ethnic or racial representation that does not correspond to the subject's actual group. The sixth premise states, "In cultures with a covert system of racism, institutionalized discrimination is pervasive, yet very little recognized by members of the marginalized groups" (Trammel, 2018, p. 317). As discussed in this chapter, systematic racism is embedded in algorithms, and AI duplicates the same model of ethnic and racial oppression and marginalization once observed in traditional media, organizations, groups, and interpersonal relationships. However, this happens in ways that may go unrecognized even by members of marginalized groups. In this sense, a person with a dark complexion using a filter that lightens their skin may simply regard the visual outcome as great or acceptable, without any awareness of the subtle, systematic form of racism or the effect of colorism at work. For people of color, it is important to recognize that racism, racial discrimination, and the effects of colorism are being reproduced through filters, artificial intelligence, and algorithms. This can continue to shape people of color's self-concept, reinforcing the idea that to be seen, one must pass through the lens of the oppressor and conform to European standards.

Conclusion

In conclusion, this chapter examined how artificial intelligence and beauty filters perpetuate racism, further victimizing people of color by creating a norm that reflects European beauty standards. Filters often lighten complexions, adjust features, and even change hairstyles, frequently modeling European standards.

Using Brazil's example of racial democracy, the chapter established that filters have been designed to support and influence a universal standard that indirectly further marginalizes people of color.

References

Ahmed, S. (2021). Disinformation Sharing Thrives with Fear of Missing Out Among Low Cognitive News Users: A Cross-National Examination of Intentional Sharing of Deep Fakes. Journal of Broadcasting & Electronic Media.
Balalini, A. (2020, August 26). Colourism: How Skin-Tone Bias Affects Racial Equality at Work. World Economic Forum. Retrieved August 2022, from https://www.weforum.org/agenda/2020/08/racial-equality-skin-tone-bias-colourism/
Bouncier, N. (2017, July). Brazil Comes to Terms with Its Slave Trading Past. The Guardian. https://www.theguardian.com/world/2012/oct/23/brazil-struggle-ethnic-racial-identity
Brown, O. (2019, January 15). Banning Skin Bleaching Products Won't Work as Long as Fair Skin is Linked with Beauty and Success. CNN Health. Retrieved September 2022, from https://www.cnn.com/2019/01/15/health/banning-bleaching-products-in-africa/index.html
Campbell, C., Plangger, K., Sanda, S., & Kietzmann, J. (2021). Preparing for an Era of Deepfakes and AI-Generated Ads: A Framework for Understanding Responses to Manipulated Advertising. Journal of Advertising.
Campus, L. (2015). O negro é povo no Brasil: afirmação da negritude e democracia racial em Alberto Guerreiro Ramos (1948–1955). Caderno CRH, 28(73), 91–110.
Castillo, M. (2021, June 15). The Limitations of 'Latinidad': How Colorism Haunts 'In The Heights'. NPR. Retrieved November 2022, from https://www.npr.org/2021/06/15/1006728781/in-the-heights-latinidad-colorism-casting-lin-manuel-miranda
Castro, G. (2022). Why Understanding Colorism Within the Latino Community Is So Important. Courageous Conversation. Retrieved November 2022, from https://courageousconversation.com/why-understanding-colorism-within-the-latino-community-is-so-important/
Chavez-Dueñas, N., Adames, H., & Organista, K. (2014). Skin-Color Prejudice and Within-Group Racial Discrimination: Historical and Current Impact on Latino/a Populations. Hispanic Journal of Behavioral Sciences, 36, 3–26.
Cowart, K. (2021, September 22). A History of Colorism Sheds Light on Discrimination Today. UGA Today. Retrieved August 2022, from https://news.uga.edu/history-of-colorism-sheds-light-on-discrimination/
DaSilva, J. P., Ayerdi, K., & Galdospin, T. (2021). The Term Deepfake as First Used in a Reddit Post in 20. Media and Communication, 9(1), 301–312.

Ferreira, R. F. (2002). O brasileiro, o racismo silencioso e a emancipação do afro-descendente. Psicologia & Sociedade, 1(14), 69–86.
Fihlani, P. (2013, January). Africa: Where Black is Not Really Beautiful. BBC News. Retrieved November 2022, from https://www.bbc.com/news/world-africa-20444798
Fuentes, L. (2021, February 25). Deepfakes: From Instagram Filters To Dangerous Threats. Snt Partnership Day (K. Lisinski, Interviewer). Silicon Luxembourg.
Gonzales, C. (2009, May). The Origins and Effects of "Colorism" in Latin America: A Comparative Study of Mexico and Brazil. Texas Tech University.
Haines, A. (2021, April 27). From 'Instagram Face' To 'Snapchat Dysmorphia': How Beauty Filters are Changing the Way We See Ourselves. Forbes. Retrieved May 2022, from https://www.forbes.com/sites/annahaines/2021/04/27/from-instagram-face-to-snapchat-dysmorphia-how-beauty-filters-are-changing-the-way-we-see-ourselves/?sh=212330724eff
Hanchard, M. (1999). Black Cinderella? Race and the Public Sphere in Brazil. In M. Hanchard (Ed.), Racial Politics in Contemporary Brazil (pp. 59–81). Duke University Press.
Hatzipanagos, R. (2022, March 31). Latinos with Darker Skin Tones Face More Discrimination. The Washington Post. Retrieved November 2022, from https://www.washingtonpost.com/nation/2022/03/31/latinos-have-many-skin-tones-colorism-means-theyre-treated-differently/
Hernández, T. K. (2015). Colorism and the Law in Latin America—Global Perspectives on Colorism Conference Remarks. Global Studies Law Review, 20, 683.
Hunter, M. (2016). Colorism in the Classroom: How Skin Tone Stratifies African American and Latina/o Students. Theory Into Practice, 55, 54–61.
IBM Cloud Education. (2020, June 3). Artificial Intelligence (AI). IBM. Retrieved June 2022, from https://www.ibm.com/cloud/learn/what-is-artificial-intelligence
Jayawardene, S. (2016). Racialized Casteism: Exposing the Relationship Between Race, Caste, and Colorism Through the Experiences of Africana People in India and Sri Lanka. African American Studies, 20, 323–345.
Jerkins, M. (2015, July). The Quiet Racism of Instagram Filters. Racked. Retrieved June 2022, from https://www.racked.com/2015/7/7/8906343/instagram-racism
Kantrowitz, A. (2020). Twitter Just Released Its Plan To Deal With Deep Fakes. BuzzFeed News. Retrieved May 2022, from https://www.buzzfeednews.com/article/alexkantrowitz/twitter-just-released-its-plan-to-deal-with-deep-fakes
Kerrison, K. (2020, April 11). Critical Reflections on Ethnicity and Colourism in Africa and the Diaspora. E-International Relations. Retrieved November 2022, from https://www.e-ir.info/2020/04/11/critical-reflections-on-ethnicity-and-colourism-in-africa-and-the-diaspora/

Knight, W. (2018). Fake America Great Again. MIT Technology Review. Retrieved May 2022, from https://www.technologyreview.com/2018/08/17/240305/fake-america-great-again/
Kobriger, K., Zhang, J., Quijano, A., & Guo, J. (2021). Out of Depth with Deep Fakes: How the Law Fails Victims of Deep Fake Nonconsensual Pornography. Richmond Journal of Law & Technology, XXVIII(2).
Kohli, A., & Gupta, A. (2021). Detecting DeepFake, FaceSwap and Face2Face Facial Forgeries Using Frequency CNN. Multimedia Tools and Applications, 80, 18461–18478.
Lee, S. (2020, October). Instagram Filters: 'Our skin is for life, not for likes'. BBC News. Retrieved December 2021, from https://www.bbc.com/news/uk-england-london-54360146
Maddocks, S. (2020). 'A Deepfake Porn Plot Intended to Silence Me': Exploring Continuities Between Pornographic and 'Political' Deep Fakes. Porn Studies, 7(4), 415–423.
McQuarrie, L. (2021, August). Bruce Willis Licensed His Deepfake Image Rights for Ads. Trend Hunter. Retrieved September 2022, from https://www.trendhunter.com/trends/deepfake-commercial
Meskys, E., Liaudanskas, A., Kalpokiene, J., & Jurcys, P. (2020). Regulating Deep Fakes: Legal and Ethical Considerations. Journal of Intellectual Property Law & Practice, 15(1).
Mire, A. (2020, July 27). What You Need to Know About Rebranded Skin-Whitening Creams. The Conversation. Retrieved September 2022, from https://theconversation.com/what-you-need-to-know-about-rebranded-skin-whitening-creams-143049
Mulaudzi, S., & Mnyanda, M. (2017, February). Let's Be Honest: Snapchat Filters are a Little Racist. HuffPost UK News. Retrieved June 2022, from https://www.huffingtonpost.co.uk/2017/01/25/snapchat-filters-are-harming-black-womens-self-image_a_21658358/
News 18. (2021, July). 'Beauty' Filter That's Not: Twitter User Calls Out 'Blackface' Trend on Desi Instagram. News 18. Retrieved June 2022, from https://www.news18.com/news/buzz/beauty-filter-thats-not-twitter-user-calls-out-blackface-trend-on-desi-instagram-4012679.html
Noe-Bustamante, L., Gonzalez-Barrera, A., Edwards, K., Mora, L., & Lopez, M. (2021, November 4). Majority of Latinos Say Skin Color Impacts Opportunity in America and Shapes Daily Life. Pew Research Center. Retrieved November 2022, from https://www.pewresearch.org/hispanic/2021/11/04/majority-of-latinos-say-skin-color-impacts-opportunity-in-america-and-shapes-daily-life/
Nwamusi, K. (2022, February 16). Too Black and Not Black Enough. Assembly: A Malala Fund Publication. Retrieved November 2022, from https://assembly.malala.org/stories/too-black-and-not-black-enough

Orbe, M. P. (1998). Constructing Co-Cultural Theory: An Explication of Culture, Power, and Communication. Sage.
Pfefferkorn, R. (2021, April). The Threat Posed by Deepfakes to Marginalized Communities. Brookings TechStream. Retrieved May 2022, from https://www.brookings.edu/techstream/the-threat-posed-by-deepfakes-to-marginalized-communities/
Rodrigues, F. (1995). Racismo Cordial. In C. Turra & G. Venturi (Eds.), Racismo Cordial: A mais completa análise sobre preconceito de cor no Brasil (pp. 11–56). São Paulo: Editora Ática.
Ryan-Mosley, T. (2021, August 15). An Ancient Form of Prejudice About Skin Color is Flourishing in the Modern Internet Age. MIT Technology Review. Retrieved December 2021, from https://www.technologyreview.com/2021/08/15/1031804/digital-beauty-filters-photoshop-photo-editing-colorism-racism/
strmic-pawl, h., Gonlin, V., & Garner, S. (2021). Color in Context: Three Angles on Contemporary Colorism. Sociology of Race and Ethnicity, 7(3), 289–303.
Tharps, L. (2016, October 6). The Difference Between Racism and Colorism. Time. Retrieved September 2022, from https://time.com/4512430/colorism-in-america/
Ting-Toomey, S. (1999). Communicating Across Cultures. The Guilford Press.
Trammel, J. M. (2018). Color Privileges, Humor, and Dialogues: Theorizing How People of African Descent in Brazil Communicatively Manage Stigmatization and Racial Discrimination. In Black/Africana Communication Theory (pp. 301–337). Switzerland: Palgrave Macmillan.
Vaccari, C., & Chadwick, A. (2020). Deepfakes and Disinformation: Exploring the Impact of Synthetic Political Video on Deception, Uncertainty, and Trust in News. Social Media + Society. https://doi.org/10.1177/2056305120903408
Viglione, J., Hannon, L., & DeFina, R. (2011). The Impact of Light Skin on Prison Time for Black Female Offenders. The Social Science Journal, 48(1), 250–258.
Wang, C. (2020, September). Why Do Beauty Filters Make You Look Whiter? Popular Science. Retrieved June 2022, from https://www.popsci.com/story/technology/photo-filters-white-kodak-film/
Weiner, Z. (2017, April 24). FaceApp is Being Accused of Racism for Its Filter That Visibly Lightens Skin. Allure. Retrieved April 2022, from https://www.allure.com/story/faceapp-accused-of-racism-with-skin-whitening-feature
Wilkerson, L. (2021). Still Waters Run Deep(fakes): The Rising Concerns of "Deepfake" Technology and Its Influence on Democracy and the First Amendment. Missouri Law Review, 86(1/12).
Zero Malaria Britain. (2019, April). David Beckham Speaks Nine Languages to Launch Malaria Must Die Voice Petition. YouTube. Retrieved September 2022, from https://www.youtube.com/watch?v=QiiSAvKJIHo

CHAPTER 4

Deepfakes as Misinformation: The Next Frontier of Sports Fandom

Monica L. Ponder, Trayce Leak, and Kalema E. Meggs

Exploring Deepfakes: An Emerging Communications Crisis

In a world that is connected beyond measure, participation no longer requires permission. That means your messages, opinions, and even your face and voice are no longer your own. In recent years, we have witnessed a growing threat to public discourse, human society, and democracy at the hands of mis- and disinformation, also known as fake news (Li, 2019). "Fake news refers to fictitious news-style content that is fabricated to deceive the public" (Li, 2019, p. 39). Fake news and social media are a perfect formula for moving bad information to the masses quickly (Figueira & Oliveira, 2017). Because it is so easy to spread fake news through social media, it becomes increasingly challenging to determine what is real and what is fake (Li, 2019). Enter the world of deepfakes.

"The game-changing factor of deepfakes is the scope, scale, and sophistication of the technology involved, as almost anyone with a computer can fabricate fake videos that are practically indistinguishable from authentic media" (Li, 2019, p. 39). Defined as the use of "artificial intelligence and machine learning to create convincing synthetic reproductions of human speakers" (Martin, 2022, p. 13), "deepfake technology manipulates images, videos, and voices of real people and alters the original to appear to be, do, or say something other than the original" (PR Newswire, 2020, para. 3). In a space that now seems very comfortable and familiar, due to athletes' increased use of social media during the pandemic, deepfakes are much more likely to be successful. "Deepfakes are difficult to detect, as they use real footage, can have authentic-sounding audio, and are optimized to spread on social media quickly" (Li, 2019, p. 40). Social media is the perfect place for deepfakes because fake news can spread quickly, largely because social media users tend to go with the crowd and many users are open to questionable information as long as it aligns with their existing views (Li, 2019). Deepfakes can pose threats to society in a variety of ways. Of primary concern is the fact that "deepfakes are likely to hamper digital literacy and citizens' trust toward authority-provided information" (Li, 2019, p. 42). Deepfakes also increase cybersecurity issues for the public and for organizations alike (Li, 2019). In a digital world that is seemingly obsessed with celebrity and equally steeped in fan culture, deepfakes are on the verge of a new high. With dedicated channels on YouTube and results galore from basic Google searches, deepfakes are taking on lives of their own. For example, one out of five internet users seeks and receives news from YouTube, which is second only to Facebook (Li, 2019). In the context of celebrity and fandom, deepfakes have a promising future. We must understand that "deepfakes pose a greater threat than 'traditional' fake news because they are harder to spot and people are inclined to believe the fake is real" (Li, 2019, p. 42). Langmia (2021) reminds us that even in today's digital world, the medium is often more important than the message. This is particularly true when the medium is a synthetic reproduction of your favorite sports celebrity. To demonstrate this emerging crisis and its potential to impact the health and wellness of communities, this chapter conceptually explores the intersection of deepfake phenomena and sports and fandom culture.

The COVID-19 Pandemic's Role in Increasing Deepfake Culture

Amid the COVID-19 pandemic, we all sought and found new and innovative ways to connect and stay connected with family, friends, and even our favorite celebrities, especially athletes. "With a universal impact affecting both professional athletes and their fans, quarantine provided a point of connection for two otherwise distinct groups through shared experience" (Feder, 2020, p. 459). For many fans and athletes alike, social media went from a form of entertainment to a lifeline of connectivity. During the pandemic, many professional athletes spent much more time engaging with fans via social media than they had before (Baker, 2020), providing more ways for fans to connect and more content with which to engage. For example, NBA star Serge Ibaka created an Instagram LIVE talent competition to engage fans and other celebrities alike. Other athletes started podcasts, posted workout videos, or took the opportunity to try their hands as interviewers in their own web-based series. Athletes and fans both took advantage of the opportunity to connect in different ways than they had before. With more time and the ability to expend more effort and energy, athletes became more accessible and relatable during the pandemic. For fans and athletes, the pandemic, an event that completely changed the lives of everyone on the planet, provided an opportunity to connect over a shared experience, something that is not always common for these two groups. With this connectivity came an increase in the depth of fan-athlete relationships, which may have seemingly shifted, especially from the fans' perspective, from one-way communication to two-way communication. Fans truly felt more connected to their favorite athletes because they seemed so much more accessible. Whether this shift was real or not, many fans strongly perceived it because of the newfound access they had to their favorite athletes. The pandemic bred isolation, and in turn it "intensified individuals' [athletes and fans] longing for social interaction" (Feder, 2020, p. 459). This longing could lead to a vulnerability or susceptibility fans have not experienced before; it could leave them open to things to which they are not usually so receptive. After all, during the pandemic, these fan-athlete relationships seemingly became much more comfortable, reciprocal, even casual.

Fans went from simply following their favorite athletes to interacting with them through social media, or at least feeling a lot more connected because of this new shared experience. It can be said that the pandemic "humanized" professional athletes by seemingly making them more accessible via social media and by making their posts more personal because of the lack of available sports-related content (Feder, 2020). We saw athletes in new ways and in new spaces. For example, LeBron James joined TikTok during the pandemic, so fans got to see and experience him in ways beyond what had become the norm. Fans got used to seeing and experiencing their favorite athletes in a manner that seemed up close and personal. On any given day, LeBron James is headlining the news and social media. He may be in the news because he scored yet another triple-double. He may be all over social media because he exhibited, yet again, what a dynamic philanthropist he is; or, to some, he might be visible advising others to stay home to prevent the spread of COVID-19. Using a critical lens, why would LeBron James, an NBA superstar and world-renowned basketball player who typically avoids taking a public stance on COVID-19 mitigation strategies, be warning against the spread of COVID-19? He is not a physician. He is not a scientist. But he is arguably the most famous basketball player of the modern era. As such, he is a celebrity with reach and clout. So, people listen to LeBron whether he is talking about basketball, education, or COVID-19. Within the health communications literature, it is understood that at least 59% of all adults in the United States look for health information online, a proportion that rises in times of uncertainty, increased health risk, and misinformation. In the early stages of the COVID-19 pandemic in the United States, users could find a video of LeBron James encouraging people to stay home. One might ask, what do the NBA superstar and COVID-19 have in common? Maybe nothing at all. But to the lay public, there must be some connection, because LeBron took the time to make this video. In said video, he clearly advises the public to stay home to prevent the further spreading of the virus. It is there, in full color: LeBron's face, and the message is clear. Or is it? What if the video of LeBron James, who seems to be clearly advising us to stay home, is not really LeBron's? What if it is not LeBron at all? What if this video is part of a growing digital phenomenon known as deepfakes?

Creating Celebrity

The definition of celebrity is not a singular one. As a matter of fact, scholars have engaged in key debates regarding "the definitions and taxonomies of celebrity" (Turner, 2004, p. 4) for decades. For the purposes of this chapter, we will borrow from the cultural and media studies literature and define celebrity as:

According to Turner (2004), “we can map the precise moment a public figure becomes a celebrity. It occurs at the point at which media interest in their activities is transferred from reporting on their public role (such as their specific achievement in politics or sports) to investigating the details of their private lives” (p. 8). Furthermore, Wernick (1991) posits that “a star is anyone whose name and fame has been built up to the point where reference to them, via mention, mediatized representation, or live appearance, can serve as a promotional boost in itself” (p. 106). Such a boost can be instantaneous and reach worldwide in the digital space. Good, bad, or indifferent, the impact of celebrity is usually widespread, instant, and long-lasting. All celebrities are not created equally. Sports celebrities are different than the norm. People’s deep-seated connection to sports—particularly in the United States—is far beyond what most would consider normal. Fans’ connections to sports figures often go well beyond rooting, cheering, and wishing them success. According to Hyman and Sierra (2010), sports celebrities are expected to represent not just themselves, but often time they are expected to represent their entire culture. They are expected to uphold values and morals and inspire hope. So, sports celebrities are not just entertainers; they are the sources of sports fandom. And sports fans are not just any fans. Athletes have become global phenomena as sports celebrities. Whether an athlete performs on a collegiate or professional level on and off the court and field, sports celebrities have commanded the attention of sports

As some of the most recognizable personal brands, athletes are not only influential to their most loyal fans; they are also influential commodities for the teams they play for and represent. In addition, companies and organizations align their brands with sports stars who continue to capture the attention and admiration of their most loyal and beloved fans. Sports fans find themselves connecting with sports celebrities on both a personal and a public level. According to research, a celebrity is a social actor who maintains high status and attracts heightened public attention, in addition to receiving emotional feedback from stakeholders (Agyemang, 2016, p. 441). Researchers have differentiated celebrity status by focusing on socioeconomic standing, endorsements, partnerships, negotiations, and opportunities; focusing on these elements helps us better understand the placement and context of a celebrity within companies, organizations, and industries. Celebrity athletes, in turn, are considered different from "traditional" celebrities (p. 441). According to Braunstein and Zhang (2005), sports celebrities and stars embody five star-power factors: athletic expertise, social attractiveness, likeable personality, characteristic style, and professional trustworthiness; because of this, sports stars can influence fans and those around them (Duncan, 2022). Although athletes rely on the evolution of their public image, how fans perceive them also plays an important role in how they leverage and maximize business deals and endorsements and negotiate their contracts (Smith & Sanderson, 2015).

Fueling Fanship and Fandom

Like celebrity, "there is no one accepted universal measure of fanship mainly owing to the complexity of the concept and how it is manifested" (Pegoraro, 2015, p. 248). Similar to sports celebrities, sports fans are alike in many ways but equally unique. They share common threads, but there is no single definition of sports fanship. However, as outlined in Wann's (1995, 1997) seminal works, there are eight common motivations for becoming a sports fan: group affiliation, family, aesthetic, self-esteem, entertainment, escape, economic, and eustress. King (2004) posits that of the eight identified motivations, group affiliation and self-esteem are the two that are ever-present drivers of sports fan behavior.

Turner (2004), Rojek (2001), and Frow (1998) call our attention to the significant parallels between celebrity culture—especially in sports—and the many societal functions normally reserved for religion. It can be argued that this worship-like approach to and reverence for sports celebrities provide the basis for the extreme fanship associated with sports. Much like the creation of the church, Turner (2004) cautions that “we are using celebrity as a means of constructing a new dimension of community through the media” (p. 6). These new communities are represented and marked by sports affiliations and allegiances; they are fandom. According to He et al. (2022):

Fandom is a group of fans gradually formed in the internet environment. It [fandom] has shown the characteristics of being born online for public opinion wars since its birth. It has a distinct class, a rigorous organizational structure, a strong self-driving force, and reckless aggressiveness at all costs. (p. 419)

These new communities operate 24/7. They are local, national, and international. They are men, women, and children. These representations are more than just labels; they often provide purpose and validation. People dedicate exorbitant amounts of time and resources to their fanship and fandom. And the outcomes and behaviors associated with fanship and fandom vary as widely as the fans themselves. Fanship, the state of fandom, is not unlike other emotional states in that it can bring out the best and/or worst in people. Fanship has been upheld as the inspiration some people need to excel, which may be demonstrated through their quest to be like certain celebrities. Fanship has also carried the blame for fans’ abhorrent behaviors in response to disappointment and displeasure. Fandom can be a gift or a curse; it impacts each one differently, but equally deeply.

Digital Fanship and Fandom

Digital and social media channels have changed drastically within the last decade. Stakeholders’ use of social media has also evolved and increased in the wake of the COVID-19 pandemic (Meggs & Ahmed, 2021). Social media platforms such as Twitter, Instagram, and TikTok have created a bridge for individuals to connect over commonalities and differences.

For athletes and sport organizations, social media platforms such as Twitter have created a digital space to connect and communicate with fans while strengthening fan-team and fan-athlete identification. According to research, social media creates a space for community and experience, and sites such as Twitter provide a rich experience and setting (Agyemang, 2016). Social media has become a digital space where fans engage, connect, and build community through discourse about their favorite celebrity athletes. For example, Generation Z (Gen Z) has proven to spend a substantial amount of time on social media platforms and is less engaged with traditional news outlets than any other generation (Beaupré et al., 2020). This stems from Gen Z in the United States being born and raised amid the evolution and influx of a digital world, which produces differences in how media habits develop across generations. When fans engage with favorite sports celebrities like LeBron James, Serena Williams, and Cristiano Ronaldo, who have strong social media presences and followings, this not only adds to those athletes’ sports celebrity status but also categorizes them as celebrity influencers. Indeed, according to research, professional athletes are considered the most popular and followed celebrities on social media (Beaupré et al., 2020, p. 384). Pegoraro (2015) reminds us that the way sports fans behave and express various aspects of their fanship has changed significantly with the expansion of digital media and social networking. Digital media—particularly social media—allow fans to express themselves, congregate, and celebrate with like-minded people. The digital world operates much differently than the stadiums and pubs in which fans gathered in days past. The rules—if there are any rules—are very different. In the world of social media, it has become increasingly difficult to monitor and control content—especially user-generated content. It has become equally difficult to regulate ethics and ethical behavior among content consumers who are also content creators. The level of agency and the locus of control have forever changed, as these now rest with fans/consumers/users as much as with formal content creators. “Digital media and associated technology now provide a more direct link between fans and their favorite teams, athletes and other fans and allow new relationships and interactions that were not possible before” (Pegoraro, 2015, p. 254). Sports fans are not only using social media; they have embraced these platforms as means to consume information, create and promote information, and engage with their favorite teams, star athletes, other fans, and detractors alike.

In the digital age, most sports fans are both active consumers and creators of sports-related media content. From Facebook to Twitter to TikTok, sports fans seem to have an insatiable desire for more and more information. And when the desired content is not readily available, many fans turn inward and become the creators of the content they seek. This degree of craving, paired with the immediacy and access of the 24/7 news cycle, can be as negative as it is positive. The positive is that today’s fans are more informed, knowledgeable, and engaged than ever before, which means they are more invested—emotionally and financially—than ever before. The negative is that these same fans are more vulnerable and impressionable than ever before. The combination of desire and access makes today’s sports fan much more susceptible to all sorts of misinformation, disinformation, and malinformation. With such passion and desire comes a sense of vulnerability that leaves many fans susceptible to confusion and misrepresentation. Enter deepfakes.

Deepfakes as Fandom

“The digital revolution has had a profound impact upon fandom, empowering and disempowering, blurring the lines between producers and consumers, creating symbiotic relationships between powerful corporations and individual fans, and giving rise to new forms of cultural production” (Pearson, 2010, p. 84). Media manipulation is not a new phenomenon, but deepfakes can take this already challenging space to a new dimension. Some might argue that deepfakes “can be seen as just a new technology expression. But that perspective would fall short when it comes to understanding its potential societal impact” (EPRS, 2021, p. 22). Deepfakes have their pros and cons: they have led celebrities in the entertainment and fashion industries to make their personal likenesses accessible and available so that deepfake footage can be created without the need to travel for a video shoot (Kietzmann et al., 2020, p. 142). For example, public service announcements can be broadcast across the world in different languages, as when soccer hall of famer David Beckham served as a spokesperson for a malaria campaign and advocated in different languages. The ever-changing media landscape, coupled with the increasing demand for visual media, creates a perfect space for deepfakes. EPRS (2021) reminds us, “All video media offer some sort of manipulation tools, such as virtual backgrounds, face filters and video editing tools. This drives the normalization of mixed-reality experiences” (p. 23).

According to PR Newswire (2020), deepfakes are among the most troubling aspects of artificial intelligence because they are relatively easy to create and equally difficult to stop. Martin (2022) posits that deepfakes have become tools for those hoping to mislead people because convincing media can be created that is virtually identical to the original. Martin (2022) continues, “Deepfakes represent a disturbing new development in technology that can lead to misunderstanding, slander, and misrepresentation … combining both speech and video deepfakes can lead to almost indistinguishable persona of any living and non-living person” (p. 14). In addition, deepfakes are created by pairing generative and discriminator algorithms (Spivak, 2019, p. 342). Media are an essential part of our everyday lives, and the landscape is constantly changing. Media are not only what we use but also who we are and what we do as a society. Media not only mediate our experiences; they create and situate those experiences. With such a reliance on media and the growing appeal of visual media—especially visual-forward social media like Instagram and TikTok—deepfakes are an obvious next step. EPRS (2021) reminds us that visual communication and content are today’s norm rather than the exception. We expect and demand visual content. EPRS (2021) notes:

The explanation for the growing popularity of visuals as a dominant mode of communication is the fact that it [visual] is a very effective way of transmitting information. Images are effective communication means because audiences can create and retrieve memories more easily when exposed to visuals. Therefore, audio-visual media have strong appeal for their audience and may have unique psychological power. (p. 22)

What better way to make a point, share an opinion, stir controversy, and/or engage an already adoring public than to attach said point, opinion, or controversy to a visual of a celebrity? Thanks to deepfake technology, you no longer need an actual endorsement, consent, or agreement. With said technology, the news is yours to create, generate, and perpetuate as you wish—globally—in just a matter of seconds. This technology truly changes the landscape of communication and celebrity. Martin (2022) offers that deepfake “technology poses a great risk because in general voices offer more intimate access to people’s trust, making them a potent tool for fraud,” and that “the lack of media literacy in this country and around the world is a perfect stage to allow deepfake technology to flourish” (p. 14).

Therefore, deepfake images, audio, and video will continue to be created and shared among communities and individuals through social platforms (Kietzmann et al., 2020, p. 137). Although deepfakes are not limited to celebrity victims, the fandom space is a viable frontier. Because they are so emotionally attached and invested, many sports fans stand at attention every time their favorite athlete makes a move or statement or expresses an opinion. Why are sports fans so susceptible to the messages, opinions, and whims of their favorite athletes? Many would argue that this susceptibility is due to lifelong conditioning. Research indicates that many sports fans are socially conditioned to idolize sports celebrities from a young age, and they willingly carry that reverence into adulthood (Hyman & Sierra, 2010). Wann et al. (2001) note that family allegiances, peer associations, and community affiliations are very prominent in the development of one’s fanship. Research further indicates that fan identifications are formed “for reasons of geography … allegiances held by significant others … and school attendance” (Hirt & Clarkson, 2011, p. 61). We must also revisit the motivations for sports fandom and underscore the ever-present needs for group affiliation and self-esteem that are derived from sports attachments. Pan (2022) posits that belonging to a group of like-minded people not only brings significant happiness but also increases self-esteem and self-meaning. Regardless of the source of the fan affiliation and allegiance, these ties and bonds run deep, as do the connections to the associated celebrities. Such ties and bonds are the source of fans’ susceptibility in the face of the newly present danger of deepfakes.

Conclusion

The COVID-19 pandemic changed the way most people view the world. It changed the way most people communicate, both personally and professionally. The pandemic forever changed the ways in which we connect to others. Social media, in all its forms, is now the norm. What was once a simple source of entertainment has become the primary form and source of communication for many. We expect messages via social media. We seek messages via social media. And, most importantly, we accept messages via social media. We connect with our families, friends, coworkers, and even celebrities through social media.

The need to validate messages and information through traditional media is no more. The days of waiting for information via the daily newspaper or nightly news broadcast have ended. The world is digital, and sports celebrities and fans are major players in the digital space. Sports fanship and fandom are no longer limited by proximity. These relationships are built on digital information systems that span the globe, and access is endless. Our daily lives are consumed with media, as they provide the sources of our information and entertainment. The bi-directional nature of communication is enhanced with social media. Sports fans feel more connected to and engaged with their favorite athletes. In this realm of connectivity, sports fans are no longer simple consumers of mediated messages; they are now content creators with messages of their own. With the increased use of platforms like Instagram LIVE, fans and celebrity athletes are co-creating content. Fans are now part of the experience, as opposed to just being consumers of the provided information. They interact with some of their favorite celebrity athletes in ways they would never have had the opportunity to do decades ago. This has given fans the power and leverage to support or challenge social issues they relate to or differ on. For example, we have seen the resurrection of sport activism led by former NFL quarterback Colin Kaepernick and the interaction of supporters and opponents through social media platforms such as Instagram and Twitter. Memes of Kaepernick that were supposedly satire about his character functioned as propaganda, framing Kaepernick as unpatriotic, questioning his race, manhood, and loyalty to the National Football League, and making him an enemy in the hands of social media users. According to researchers, many deepfakes circulate as pranks, goofs, and funny memes with a comedic, satirical undertone (Li, 2019, p. 43). These content creators are not always operating from places of truth or authenticity. Deepfakes, which rely on artificial intelligence, allow for the synthetic reproduction of human faces and voices. In these mediated spaces, we are learning that participation no longer requires permission. “Deepfakes are quickly evolving to threaten businesses in multiple ways … businesses around the globe have already lost money, reputations, and hard-won brand strength due to deepfakes” (PR Newswire, 2020, para. 1). Martin (2022) warned that next to businesses, public figures are the most susceptible and have the most to lose with deepfakes.

The risk to public figures also creates a risk to the public in general because “celebrities and public figures have long been shown to be important influencers of human behavior” (Selkie, 2022, p. 1). Understanding the disciplinary linkages between sports and health communication, and how they intersect with deepfakes, is imperative to combating the growing mis- and disinformation crises that plague digital spaces. Because sports fans are so connected to their favorite athletes, and athletes are now so readily available to the public, the combination is dangerous in the hands of those looking to spread misinformation, disinformation, or malinformation. Although deepfakes are not limited to celebrity victims, the fandom space is a viable frontier. We continue to see the rise in fandom across cultural, racial, and gender barriers throughout various sport leagues. Gone are the days of sports journalists being the gatekeepers of pertinent news about a fan’s favorite sports celebrity. Fans have become the storytellers, much like citizen journalists, grassroots journalists, and bloggers. Although fans may not have a sports journalism background, this shift toward citizen journalism means that storytelling by sports fans has created a different avenue through which people get their sports news, and it raises the question of whether these stories are true, false, or fabricated, now that fans have become gatekeepers of athlete coverage alongside traditional sports journalists. The continued emergence of deepfakes, technological innovation, and heavy social media use by stakeholders will only widen the path for manipulated images, stories, and representations of sports celebrities in the next decade and beyond.

References

Agyemang, K. J. (2016). Managing Celebrity via Impression Management on Social Network Sites: An Exploratory Study of NBA Celebrity Athletes. Sport, Business and Management: An International Journal, 6(4), 440–459.
Baker, K. (2020). Bored Athletes Take to Instagram to Connect with Fans During Coronavirus Shutdown. Axios. https://www.axios.com/2020/04/03/instagram-athletes-coronavirus-streaming
Beaupré, J. G., Alfaro-Barrantes, P., & Jacobs, B. (2020). Professional Athletes and Gen Z: A Commentary on Social Media Influence During the COVID-19 Pandemic. The Journal of Social Media in Society, 9(2), 381–392.
Braunstein, J. R., & Zhang, J. J. (2005). Dimensions of Athletic Star Power Associated with Generation Y Sports Consumption. International Journal of Sports Marketing and Sponsorship, 6(4), 242–267.

Duncan, S. (2022). Applying Dyer’s Star Theory to Sport: Understanding the Cultivation of Athlete Stardom. Physical Culture and Sport. Studies and Research, 94, 35–45.
European Parliamentary Research Service (EPRS). (2021). Tackling Deepfakes in European Policy. https://www.europarl.europa.eu/RegData/etudes/STUD/2021/690039/EPRS_STU(2021)690039_EN.pdf
Feder, L. (2020). From ESPN to Instagram LIVE: The Evolution of Fan-Athlete Interaction Amid the Coronavirus. International Journal of Sport Communication, 13, 458–464.
Figueira, A., & Oliveira, L. (2017). The Current State of Fake News: Challenges and Opportunities. Procedia Computer Science, 121, 817–825.
Frow, J. (1998). Is Elvis a God? Cult, Culture, Questions of Method. International Journal of Cultural Studies, 1(2), 197–210.
He, H., Li, X., Tavsel, M., & Zhou, R. (2022, January). A Literature Review on Fans’ Identity Construction. In 2021 International Conference on Public Art and Human Development (ICPAHD 2021) (pp. 419–423). Atlantis Press.
Hirt, E. R., & Clarkson, J. J. (2011). The Psychology of Fandom: Understanding the Etiology, Motives, and Implications of Fanship. In L. R. Kahle & A. Close (Eds.), Consumer Behavior Knowledge for Effective Sports and Event Marketing (pp. 59–80). Taylor & Francis.
Hyman, M. R., & Sierra, J. J. (2010). Idolizing Sport Celebrities: A Gateway to Psychopathology? Young Consumers, 11(3), 226–238. https://doi.org/10.1108/17473611011074296
Kietzmann, J., Lee, L. W., McCarthy, I. P., & Kietzmann, T. C. (2020). Deepfakes: Trick or Treat? Business Horizons, 63, 135–146.
King, B. (2004). What Makes Fans Tick. Sports Business Journal, 7, 25–34.
Langmia, K. (2021). Black Lives and Digi-Culturalism: An Afrocentric Perspective. Lexington Books.
Li, H. (2019). The Emergence of Deepfake Technology: A Review. Technology Innovation Management Review, 9(11), 39–52.
Martin, E. J. (2022, April 7). Deepfakes: The Latest Trick of the Tongue. Speech Technology. https://www.speechtechmag.com/Articles/Editorial/Features/Deepfakes-The-Latest-Trick-of-the-Tongue-152290.aspx
Meggs, J., & Ahmed, W. (2021). Applying Cognitive Analytic Theory to Understand the Abuse of Athletes on Twitter. Managing Sport and Leisure, 1–10.
Pan, Y. (2022, January). Analysis on the Motives Being a Fan or Fandom and the Possible Factor That Some Fans Performed Sasaengpaen/Fanatical Behavior. In 2021 International Conference on Social Development and Media Communication (SDMC 2021) (pp. 258–261). Atlantis Press.
Pearson, R. (2010). Fandom in the Digital Era. Popular Communication, 8(1), 84–95. https://doi.org/10.1080/15405700903502346

Pegoraro, A. (2015). Sport Fandom in the Digital World. In P. Pedersen (Ed.), The Routledge Handbook of Sport Communication (pp. 248–258). Routledge.
PR Newswire (USA). (2020, September 29). Deepfakes Could Mean Deep Losses for Businesses If Not Prepared. https://infoweb.newsbank.com/apps/news/document-view?p=AWNB&docref=news/17DCD7235B9708B8
Rojek, C. (2001). Celebrity. Reaktion.
Selkie, E. (2022). Influence at the Intersection of Social Media and Celebrity. JAMA Network Open, 5(1), 1–3.
Smith, R. L., & Sanderson, J. (2015). I’m Going to Instagram It! An Analysis of Athlete Self-Presentation on Instagram. Journal of Broadcasting & Electronic Media, 59(2), 342–358.
Spivak, R. (2019). “Deepfake”: The Newest Way to Commit One of the Oldest Crimes. Georgetown Law Technology Review, 339, 339–400.
Turner, G. (2004). Understanding Celebrity. SAGE Publications Ltd.
Wann, D. L. (1995). Preliminary Validation of the Sport Fan Motivation Scale. Journal of Sport & Social Issues, 19, 377–396.
Wann, D. L. (1997). Sport Psychology. Prentice Hall.
Wann, D. L., Melnick, M. J., Russell, G. W., & Pease, D. G. (2001). Sport Fans: The Psychology and Social Impact of Spectators. Routledge.
Wernick, A. (1991). Promotional Culture: Advertising, Ideology and Symbolic Expression. Sage.

CHAPTER 5

The Influence of Social Media Use in the Wake of Deepfakes on Kenyan Female University Students’ Perceptions on Sexism, Their Body Image and Participation in Politics

Nancy Mayoyo

Introduction

The world today has witnessed advancements in technology that have led to the rapid growth of interactive media content, including videos and images, on the internet. Some of this technology has evolved into the generation of deepfakes, a form of synthetic media. Deepfakes are artificial intelligence (AI)-generated videos and audio. AI applications are used to merge, substitute and superimpose pictures and video clips to create fake videos that appear authentic (Maras & Alexandrou, 2019). Deepfake technology enables users to alter videos and images to misrepresent events and individuals. This is possible because people can generate realistic visuals of very high quality and manipulate audio and video to fabricate circumstances.

Deepfakes have both positive and negative consequences depending on how they are put to use. On the positive side, deepfakes can be useful in the medical and social fields. For instance, a patient who is losing their voice to a disease like Lou Gehrig’s can be enabled to communicate digitally with their family using a synthetic voice. According to De Ruiter (2020), deepfakes can be used in courts to protect witnesses’ identities when they give testimony. In education, synthetic media in the form of deepfakes can improve learning by giving teachers the opportunity to engage effectively with learners using synthetic videos. They can also be used to heighten the appeal and innovativeness of museums and history lessons (Citron & Chesney, 2019; Griffin, 2019). For instance, at the Dalí Museum in Florida during the Dalí Lives exhibition, guests were treated to an encounter with the deceased surrealist painter Salvador Dalí, who greeted them and even invited them for a selfie (Lee, 2019). In the film industry, deepfakes can be used creatively and innovatively to provide interactive artistic experiences and to convey audio speeches of historical figures long gone (De Ruiter, 2020). A good example is the generation of an audio version of the speech John F. Kennedy was to deliver in 1963. Equally, instead of reshooting a scene, deepfakes can be used to change the dialogue in a film scene. In advertising, deepfake technology has been incorporated into the fight against malaria: David Beckham is shown in a video speaking nine languages to promote the Malaria Must Die initiative (Meskys et al., 2020). On the other hand, deepfakes can have a negative impact on society as they are a threat to individuals, businesses and political systems. This is complicated by the evolution of social media, which enables individuals to share deepfakes in the form of images and videos easily through readily available digital devices. According to Tiggemann and McGill (2009), the mass media is the most powerful way of spreading these images. The use of mass media has a negative effect on individuals in a number of ways. For instance, mass media is used to portray ultra-thin women as the ideal of beauty and attractiveness (Hawkins et al., 2004). This has created a standard of attractiveness that is difficult for women to achieve, and we know that deepfakes can be used to manipulate one’s facial features as well as skin tone. This inevitably has an impact on individuals’ perceptions of their body image.

Grogan (2016) defines body image as the internal representation of one’s external appearance. It includes an individual’s perceptions of their body and their related attitudes, beliefs, thoughts, behaviours and feelings (Cash, 2012). Conversely, the negative feelings and thoughts an individual has regarding their body are referred to as body dissatisfaction. Body dissatisfaction is associated with an individual’s negative assessments of their body shape, weight and size and generally implies a perceived inconsistency between one’s evaluation of one’s body and the ideal body (Grogan, 2016; Tiggemann, 2011). The emergence of deepfakes has made it progressively challenging to differentiate real from fake media, and as such they can be used to create pornographic or political videos. They can make the individuals in a clip appear to say something they did not say in real life, all without their consent (Day, 2019; Fletcher, 2018). This is problematic because individuals are wrongfully portrayed as saying or doing things that may injure their character and ruin their reputation. The unfortunate thing is that the creation of deepfakes does not require outstanding computer knowledge. Anyone with a computer and access to the widely available sophisticated software can fabricate with ease deepfakes that are difficult to distinguish from authentic media (Fletcher, 2018). It is envisaged that, like other forms of technology, synthetic media will surely evolve towards more advanced methods and refined materials (Agarwal et al., 2019). The initial targets of deepfakes were famous people such as actors, politicians and musicians, whose faces were superimposed into porn videos (Hasan & Salah, 2019). Today, the accessibility and ease of deepfake technology have contributed to its use as a weapon of abuse to harm others (Maras & Alexandrou, 2019). Furthermore, sophisticated algorithms that can easily manipulate multimedia content have been used to spread disinformation online through social media platforms. Consequently, deepfakes shared across social media spread rapidly to reach many people. This is made worse by the large amount of content generated and shared online, as many people have become heavily dependent on access to the internet and media for information. This disinformation and misinformation can be used to manipulate public opinion, which has a negative impact on society at large. Deepfakes are a political threat as they are used to spread misinformation and disinformation about elections. Misinformation refers to misleading content often shared mistakenly by people, whereas disinformation is shared deliberately to cause harm or make a profit. Deepfakes undermine democracy by spreading fake news.

They can be used to misrepresent politicians’ speech or actions, which contributes to the misinformation of voters and threatens the reliability of the democratic process. The spread of misinformation and disinformation is complicated by the emergence of deepfakes, as bloggers can post fake videos and make politicians appear to say things they did not actually say. In May 2022, a deepfake video of former US President Barack Obama endorsing one of the main contestants for the Kenyan presidency went viral on TikTok. The video was generated to spread fake news and misinform the electorate ahead of the polls (Bhalla, 2022). Research by Mozilla indicated that 130 TikTok videos were flagged for contravening the platform’s policy forbidding content that is manipulated to mislead users by distorting the truth. These videos, which collectively had more than 4 million views, spread disinformation and contained hate speech. Deepfake videos of prominent politicians in Kenya have been spread on Twitter, WhatsApp and Facebook too. These deepfakes are maliciously generated to change the political narrative in Kenya, with the opposing sides spreading propaganda against their opponents. Moreover, fake news is used to humiliate, demean and silence women, posing a threat to democracy. Women attempting to run for political office find themselves targets of deepfakes intended to undermine them before their followers or to disgrace them so greatly that they feel obliged to give up. In the Kenyan political arena, a video of a female politician dancing nude was shared publicly by her rival. The video, released after her appointment to a government position, could not be authenticated. Nevertheless, given the timing of its release, it was meant to bully, intimidate and shame her and to tarnish her name. Unfortunately, fellow women shared the video, making it go viral. This indicates clearly that women running for political positions easily become targets of deepfake pornography intended to undermine them in the eyes of their supporters. Such attacks are meant to shame them into giving up, to put them in their place. It is clear, therefore, that the advent and subsequent sharing of such deepfakes continue to reinforce gendered inequalities within visual information production. The fact that women, more than men, seem to be on the receiving end of humiliation and demeaning treatment via social media points to sexism. According to the New Oxford Dictionary (2010), sexism is discrimination or prejudice based on one’s sex or gender. Although it can affect anyone, more often than not it mainly affects women and girls.

At the extreme, sexism leads to different kinds of sexual violence, including harassment and rape. Digital technologies have been thought to have the capacity to bring equality; however, there are disproportionate levels of abuse targeting women on social media. Amnesty International, in an online poll in 2017, found that 23% of women had suffered harassment or some sort of abuse on different social media platforms. Equally, female politicians are a target of online hate, violence and disinformation, and they have suffered hate speech on Twitter. Mainstream media covered the excessive online misogyny that female candidates in the 2019 US general election experienced (Dehingia, 2020). The Economist Intelligence Unit, in a recent analysis of 4500 women in 45 countries, found that 85% had experienced or witnessed online abuse (Meco & Mackay, 2022). Women who are victims of online abuse end up limiting their interactions on social media for fear of further abuse and violence, thus limiting their freedom of expression. Deepfakes only emerged in 2017 (Westerlund, 2019); hence there is a paucity of literature on the subject. This chapter is therefore a reflection on how the emergence of deepfakes spread by social media influences Kenyan women’s perceptions of their body image and identity. It also examines how the use of social media to spread disinformation and misinformation affects Kenyan women’s participation in politics. Finally, it examines the role of social media in promoting sexism in the wake of synthetic media among Kenyan female university students.

Methodology

A systematic integrative review and meta-analysis were conducted to assess how the use of social media shapes Kenyan female university students’ perceptions of their body image and identity and how the spread of disinformation and misinformation influences their participation in politics. This was done by searching multiple electronic databases for relevant research, focusing on theses from university repositories, scholarly journal articles and news items on the use of social media among Kenyan female university students. A Google search was done using the keywords deepfake, deepfakes, social media, synthetic media and Kenyan universities. The criterion set for the articles sampled was that each had to address the use of social media among Kenyan female university students in relation to the three themes of the study, and/or it had to have examined the use of deepfakes generally or specifically in Kenya.

A total of 32 relevant articles were selected. Articles not in English and those on the same themes but not about Kenya were excluded. After these articles were selected and read, the data obtained were analysed, evaluated, synthesized and summarized. The empirical analysis of the data was organized around three themes, namely: Kenyan female university students’ perceptions of their body image and identity; social media use in the spread of fake news amongst Kenyan female university students; and the role of social media in promoting sexism.

Findings and Discussions

Influence of Social Media Use on Kenyan Female University Students’ Perception of Their Body Image and Identity

A study conducted in four universities amongst 16–25-year-olds with a sample of 528 found that some students felt that their body weight was nowhere near the ideal body image (Nguchara et al., 2018). Equally, some students were psychologically affected by how they were judged on their body appearance. Most were dissatisfied with their body appearance, which had a lasting effect on their self-esteem and academic performance. Of the 528 sampled, 66.3% reported that they were very sensitive about their body appearance. The majority revealed that they were uncomfortable with their body appearance, while over 40% perceived that they were judged by others as being very thin, overweight or very overweight. Another study, carried out among University of Nairobi female students, found that 95% of the 183 sampled used Instagram, with many using it and Snapchat to seek peer approval. The majority also reported that body surveillance leads to anxiety and depression. Furthermore, users of social media received immediate and anonymous feedback on pictures posted, and negative comments had a damaging effect on the victim’s confidence. The obsession among female university students with getting likes from their followers made some alter their looks online by editing their images to enhance their appearance. This was an avenue for unnecessary competition as peers sought to outdo each other. Unfortunately, this had a negative effect on individuals’ sense of self-worth and esteem as they compared themselves to others with the ‘ideal’ body image (Nyambura, 2019). Another study by Kingori (2018), among 290 University of Nairobi main campus students aged 18–25 years old, also sought to find out whether body size influenced the level of self-esteem among students.

The study utilized a questionnaire and standardized attitudinal scales to collect data. Its findings revealed a positive relationship between body size and gender: the problem of low self-esteem due to body size was greater among female students. Students’ skin complexion also had an effect on their self-esteem, and one was more likely to be affected if one was female. Most preferred a medium skin tone. The findings from the studies analysed indicated that female university students like to spend time on social media observing, liking and sharing images portrayed as ‘ideal images’. These unrealistic body images depicted as ideal led to negative feelings regarding their own bodies, culminating in body image dissatisfaction. It is therefore evident that females’ self-esteem was affected by how they perceived their body image. This is likely to get worse with the emergence of deepfakes, as individuals can alter their images to look better than they really are. Waswa (2018) indicates that feelings of dissatisfaction about one’s body image might affect students’ consumption behaviours and in turn negatively affect their academic achievement, especially cognitive and problem-solving abilities. A meta-analysis of the three studies’ combined 1001 female students indicates that the use of social media influenced the majority of the respondents’ perceptions. Many did not perceive their bodies to be anywhere near the ideal body image depicted on social media. The immense exposure these students experienced to unrealistic ideal body images lowered their self-esteem and increased their desire to match the body images observed, or at least to portray themselves as such. What social media users may not be aware of is that not all images or videos they see are real. Some have been generated using generative adversarial networks to create deepfakes reflecting ideal body images. This deceitful display puts pressure on university students who imagine they are the only ones left behind, resulting in feelings of dissatisfaction and low self-esteem. The continued use of synthetic media in the future will likely compound this problem.

The Influence of Social Media Use in the Spread of Disinformation and Misinformation and Its Effect on Kenyan Female University Students’ Participation in Politics

A study by Kamau (2017) amongst 600 university students found that almost everyone (91%) who used social media accessed information on politics anytime they logged in.

Fifty-six per cent actively posted political information on social networking sites, while 58.7% sought political information in the media. Kamau’s study further revealed that while 54% of men had factual knowledge of politics, only 47% of women did. This implied that women were more likely to be misled by fake news and misinformation. Another study, by Mbetera (2017), on social media use and participatory politics among 385 students in universities in Nairobi found that 79.5% of the respondents preferred social media as a source of political information due to its accessibility. Ninety per cent of the respondents discussed politics on social media with family and friends with the aim of influencing decision-making processes. Sixty-four per cent of the respondents reported that they had actually been influenced to change their political opinions as a result of interaction on social media. These findings were in consonance with those of a study by Chamegere (2020), who investigated the use of social media by 378 students during university student elections at Masinde Muliro University of Science and Technology. The findings revealed that social media, especially WhatsApp, was greatly used in political communication and campaigns. Individuals used social media to express their preferences, and this had an impact on the political decisions made by other university students. These findings implied that social media played a crucial role in exposing the youth to political information, which not only influenced their opinions but also enabled them to re-evaluate their different political stands. A meta-analysis of 978 respondents indicated that 87% preferred to use social media as their main source of political information; hence the risk of being misled by misinformation and disinformation cannot be ignored. A survey by GeoPoll and Internews indicated that political blogging in Kenya dominates social interaction on Facebook and WhatsApp and is distinctly tribal or ethnic in nature. Fake news thrives on social media, with Safaricom, the largest mobile service provider, reporting that in 2017 it faced a challenge with the volume of fake news coursing through its servers (Mwita, 2021). The run-up to the August 2022 elections was no different. Bloggers and influencers competed for dominance using different strategies, including disinformation. These bloggers, who were well paid, were not afraid to use propaganda and fake news in their vote-hunting mission (Mozilla, 2021). The fact that it is not easy to tell a deepfake from a real video or to verify information shared online poses a challenge to the use of social media in this era of misinformation, propaganda and disinformation. More challenging still is the fact that political misinformation can be fuelled by audio deepfakes, which are very easy to generate.

This can mislead the youth, who may make wrong decisions, especially since most prefer social media as a source of information to watching TV news, which is more credible. On the other hand, misbehaving politicians can take advantage of the existence of deepfakes. Even when they spread propaganda and hate speech and insult their opponents, they can always deny their involvement and blame it on deepfakes. This will cause mistrust amongst electorates who may not be in a position to differentiate the truth from lies. Misinformation, propaganda and disinformation may negatively affect the country’s peace and hinder the youth’s participation in politics. They also pose a threat to the integrity of the democratic process in Kenya.

The Influence of Social Media in Promoting Sexism

The evolution of social media as a preferred means of communication has presented new ways in which women on social media experience sexual violence. Studies among Kenyan female university students indicated that a number had been victims. A study by Njuguna (2017) among 150 females aged 18–25 years old found that 63% had been victims of violent video sharing, 63% had received unsolicited pornographic content, while 40% knew of someone who had had their personal information and images disclosed. Another study, by Mayoyo (2021), among students in universities in Nairobi City County sought to find out university students’ perceptions of a partner sharing their intimate information without their permission. The findings indicated that of the 350 respondents, 161 (51.5%) did not think this behaviour was abusive but found it to be normal amongst dating university students. Ironically, although students in Mayoyo’s study normalized the sharing of private photos and videos to spice up their relationships, some females reported having suffered as a consequence of this. Cases of revenge porn were reported, with two females revealing that their boyfriends had publicly shared their nude photos after their breakup. Yet another incident was reported of a male who shared a video of himself making love with his girlfriend; his friends later shared this video with some of their other friends. Additionally, it was reported that a female university student from the University of Nairobi who was also a model had been dethroned as Miss Kenya after her boyfriend sent her nude photos to the event organizers.

These findings are a revelation of real cases of revenge porn perpetrated by ex-boyfriends against their ex-girlfriends. The emergence of deepfakes is likely to complicate matters. According to Woodlock et al. (2020), victims of domestic violence are likely to encounter technology-facilitated abuse, as perpetrators can use a single photo showing a victim’s face to fabricate a non-consensual sexual deepfake. Malicious males may choose to use porn deepfakes against ex-girlfriends or against any females for selfish reasons. Just as with revenge porn, abusive male partners can utilize non-consensual sexual deepfakes to isolate, control, intimidate, micromanage and shame victims. Most of these victims are women (Hu et al., 2020; Kelion, 2020; Muthukumar, 2021). According to Hao (2021), findings of a research company called Sensity AI, which has tracked online deepfakes since December 2018, indicate that between 90% and 95% of them are non-consensual porn. Sadly, 90% of these are non-consensual porn of women. These deepfakes target mainly female celebrities and regular women. Adam Dodge asserts that this is nothing but violence against women. A bot service on Telegram was used to generate nude photos of women whose images, stolen from their private conversations or social media accounts, were manipulated and sexualized. The use of such deepfakes is likely to torment and harm individuals psychologically by causing them emotional distress. Not only can deepfakes be used to abuse victims, but perpetrators can also use them to blackmail and threaten victims. Deepfakes have also been used to reduce women to sex objects. A case in point is that of a Kenyan female politician whose video, in which she was dancing nude, went viral. In yet another case, in the run-up to the 2017 general elections in Kenya, a sex video of Fatuma Gedi, a well-known Wajir woman representative, was leaked and widely shared on social media. The victim later revealed that the video was the brainchild of some unnamed male politicians who were out to tarnish her name. She accused them of paying a lot of money to have the video generated to bring her down (Musau, 2022). One suspect was arraigned in court over the video. The fake video was meant to defame and discredit her and thwart her chances of being re-elected. The two videos could be examples of porn deepfakes, as their authenticity could not be verified. Besides, the use of generative adversarial networks (GANs) makes it possible to insert the face of a woman into existing pornographic material. Equally, free artificial intelligence apps like DeepNude have been used to undress women in images without their consent.

Apps like FaceApp, FaceSwap and ZAO are freely accessible and available for users to create fake videos (Vincent, 2020). Still, the fact that these female politicians’ videos were shared at crucial moments that could affect their political ambitions implies that, even if they were real, they were shared with malice. Because these women were in the public eye, their opponents’ aim was to discredit and shame them so that they would lose face with their supporters.

Conclusion

The emergence of deepfakes is likely to complicate things, especially for women, in the near future. This is likely to have an impact on female university students, given their widespread use of social media, and to have far-reaching effects on victims of deepfakes. On the perceptions of Kenyan female university students’ body image, it is concluded from the reviewed studies that the images shared online and spread via social media contribute to negative feelings about their body image and identity. The obsession that young adults have with posting and sharing pictures online also results in negative body image as they observe and like the ‘ideal’ body images shared online. The pressure to fit in is likely to negatively affect students who strive to achieve the standard body shape, tone and size. Yet we are aware that deepfakes can be used to alter looks, and hence not everything presented online is what it appears to be. The fact that deepfakes can be used to manipulate images and videos creates high standards of attractiveness that are unattainable for female university students. This leads to unnecessary anxiety and depression and affects their confidence. The loss of self-worth and self-esteem is likely to negatively impact their academic achievement. Based on the reviewed studies, it is also concluded that female university students rely on social media as a source of political information. Unfortunately, social media is the major medium used to spread fake news. Consequently, deepfakes spread via social media to disseminate disinformation and misinformation may mislead them, as it is difficult to distinguish between real and fake information. Equally, studies revealed that females are less likely to have factual political information; as a result, they are highly likely to be consumers of disinformation and misinformation spread on social media. This can threaten the integrity of the democratic process in Kenya. The study also concludes that social media is likely to be used to promote sexism, especially against female university students.

Based on the reviewed studies, women have been objectified and bullied online. This is likely to continue, as it is easy for a malicious person to use deepfake applications to undress women in images or to swap faces into pornographic videos. Non-consensual sexual deepfakes spread via social media can be used to isolate, manipulate and intimidate women. Apart from the psychological harm caused to the victim, this can undermine women’s participation in politics, hence curtailing democracy. There is a need for university students to be sensitized about the positive and negative effects of deepfakes. As outlined in this chapter, deepfakes have many positive roles in our society. However, the negative effects cannot be overlooked, hence the necessity of educating female university students to be careful about the information they consume online. This is critical in this era of disinformation and misinformation, complicated by the existence of deepfakes, which are difficult to spot. Importantly, they should also desist from quickly sharing images, videos and audio on social media, especially those they have doubts about. When a popular person like a celebrity or politician is made to say things they are unlikely to say or do things we do not expect them to do, one should take care not to help spread a deepfake that has likely been generated to taint that person’s image or to spread disinformation.

References

Agarwal, S., Farid, H., Gu, Y., He, M., Nagano, K., & Li, H. (2019). Protecting World Leaders Against Deepfakes. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, 38–45.
Bhalla, N. (2022). Feature-Online Disinformation Stokes Tensions as Kenya Elections Near. Thomson Reuters Foundation. https://www.reuters.com/article/kenya-tech-election-idINL4N2Y22HF
Cash, T. F. (2012). Cognitive-Behavioral Perspectives on Body Image. In Encyclopedia of Body Image and Human Appearance (Vol. 1, pp. 334–342). Academic Press.
Citron, D. K., & Chesney, R. (2019). Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security. California Law Review, 107(6), 1753–1819. https://www.californialawreview.org/print/deep-fakes-a-looming-challenge-for-privacy-democracy-and-national-security/
Chamegere, M. N. E. (2020). Utilization of Social Media in University Students’ Elections in Kenya. Master’s Thesis in Communication, Masinde Muliro University of Science and Technology.

Day, C. (2019). The Future of Misinformation. Computing in Science & Engineering, 21(1), 108. https://doi.org/10.1109/MCSE.2018.2874117
Dehingia, N. (2020). When Social Media is Sexist: A Call to Action Against Online Gender-Based Violence. Retrieved from https://gehweb.ucsd.edu/social-media-sexist-online-gender-based-violence
De Ruiter, A. (2020). The Distinct Wrong of Deepfakes. Philosophy & Technology. https://doi.org/10.1007/s13347-021-00459-2
Fletcher, J. (2018). Deepfakes, Artificial Intelligence, and Some Kind of Dystopia: The New Faces of Online Post-Fact Performance. Theatre Journal, 70(4), 455–471. https://doi.org/10.1353/tj.2018.0097
Grogan, S. (2016). Body Image: Understanding Body Dissatisfaction in Men, Women and Children (3rd ed.). Routledge.
Griffin, M. (2019). Edtech Company Udacity Uses Deepfake Tech to Create Educational Videos Automatically. Fanatical Futurist. https://www.fanaticalfuturist.com/2019/08/edtech-company-udacity-uses-deepfake-tech-to-create-educational-videos-automatically/
Hasan, H. R., & Salah, K. (2019). Combating Deepfake Videos Using Blockchain and Smart Contracts. IEEE Access, 7, 41596–41606. https://doi.org/10.1109/ACCESS.2019.2905689
Hao, K. (2021, February 12). Deepfake Porn is Ruining Women’s Lives. Now the Law May Finally Ban It. Technology Review. https://www.technologyreview.com
Hawkins, N., Richards, P. S., Granley, H. M., & Stein, D. M. (2004). The Impact of Exposure to the Thin Ideal Media Image on Women. Eating Disorders, 12, 35–50.
Hu, S., Li, Y., & Lyu, S. (2020). Exposing GAN-Generated Faces Using Inconsistent Corneal Specular Highlights. arXiv. https://arxiv.org/pdf/2009.11924.pdf
Kamau, S. C. (2017). Democratic Engagement in the Digital Age: Youth, Social Media and Politics in Kenya. Communicatio: South African Journal for Communication Theory and Research, 43, 128–146. https://doi.org/10.1080/02500167.2017.1327874
Kelion, L. (2020, September 1). Deepfake Detection Tool Unveiled by Microsoft. BBC News. https://www.bbc.com/news/technology-53984114
Kingori, C. A. (2018). Effects of Body Image on Self-Esteem among Young Adults: A Case of University of Nairobi. Master’s Thesis, United States International University – Africa.
Lee, D. (2019, May 10). Deepfake Salvador Dalí Takes Selfies with Museum Visitors. The Verge. Retrieved June 9, 2021, from https://www.theverge.com/2019/5/10/18540953/salvador-dali-lives-deepfake-museum

Meco, L., & Mackay, A. (2022). Social Media, Violence and Gender Norms: The Need for a New Digital Social Contract.
Mayoyo, N. (2021). Cyber Dating Abuse and Undergraduate Students’ Academic Engagement in Selected Universities in Nairobi City County, Kenya. PhD Thesis in Sociology of Education, Kenyatta University. Retrieved from https://ir-library.ku.ac.ke/handle/123456789/22544
Maras, M. H., & Alexandrou, A. (2019). Determining Authenticity of Video Evidence in the Age of Artificial Intelligence and in the Wake of Deepfake Videos. International Journal of Evidence & Proof, 23(3), 255–262. https://doi.org/10.1177/1365712718807226
Mbetera, M. F. (2017). Social Media Use and Participatory Politics Among Students in University in Nairobi. Master of Arts Thesis in Communication Studies, School of Journalism and Mass Communication, University of Nairobi.
Meskys, E., Liaudanskas, A., Kalpokiene, J., & Jurcys, P. (2020). Regulating Deep Fakes: Legal and Ethical Considerations. Journal of Intellectual Property Law & Practice, 15(1), 24–31.
Mozilla. (2021). Fellow Research: Inside the Shadowy World of Disinformation for Hire in Kenya. Retrieved from foundation.mozilla.org/en/blog/fellow-research-inside-the-shadowy-Kenya
Musau, D. (2022, April 17). Wajir Woman Rep Fatuma Gedi Breaks Silence on Leaked ‘Sex Tape’. Retrieved from https://www.citizen.digital/news/fatuma-gedi-says-her-star-started-going-up-after-viral-sex-tape-n296685
Muthukumar, R. (2021, March 18). Nagpur Students Innovate AI-Based Technology; Help Identify Deepfakes with Over 90% Accuracy. The Better India. https://www.thebetterindia.com/251180/nagpur-students-ai-technology-how-to-identify-deepfake-videos-images-audio-maching-learning-cyber-safety-ros174/
Mwita, C. (2021). The Kenya Media Assessment 2021. Retrieved from https://internews.org/wp-content/uploads/legacy/2021-03/KMAReport_Final_20210325.pdf
New Oxford Dictionary. (2010). ‘Sexism’. Oxford University Press.
Nguchara, N., Serem, J., & Were, M. G. (2018). University Students’ Perception on the Influence of the ‘Ideal’ Media Body Image on Their Choice of Clothing in Kenya. African Journal of Education, Science and Technology, 4(4), 1.
Njuguna, E. W. (2017). Perceptions of Violence Against Women on the Social Media Platform: The Case of Social Media in Kenya. Master of Arts (Gender and Development Studies) Thesis, Kenyatta University.
Nyambura, I. (2019). Social Media Influence on Body Image Among Female University Students: A Case Study of Instagram. Master of Arts Thesis in Communication Studies, School of Journalism and Mass Communication, University of Nairobi.

Tiggemann, M., & McGill, B. (2009). The Role of Social Comparison in the Effect of Magazine Advertisements on Women’s Mood and Body Dissatisfaction. Journal of Social and Clinical Psychology, 23(1), 23–44.
Tiggemann, M. (2011). Sociocultural Perspectives on Human Appearance and Body Image. In T. F. Cash & L. Smolak (Eds.), Body Image: A Handbook of Science, Practice, and Prevention (2nd ed., p. 490). The Guilford Press.
Vincent, J. (2020). New AI Deepfake App Creates Nude Images of Women. Available: https://www.theverge.com/2019/6/27/18760896/deepfakenude-ai-app-women-deepnudenon-consensual-pornography
Waswa, J. (2018). Determinants of Body Image Perceptions Among College Students in Kenya. Research Journal of Food and Nutrition, 2(3), 25–28.
Westerlund, M. (2019). The Emergence of Deepfake Technology: A Review. Technology Innovation Management Review. Retrieved from timreview.ca/article/1282
Woodlock, D., McKenzie, M., Western, D., & Harris, B. (2020). Technology as a Weapon in Domestic Violence: Responding to Digital Coercive Control. Australian Social Work, 73(3), 368–380. https://doi.org/10.1080/0312407X.2019.1607510

CHAPTER 6

The Ogre and the Griot: Culturally Embedded Communicative Approaches Addressing ‘Deep Fake’ COVID-19 Narratives and Hyperrealities in Kenya

Oby Obyerodhyambo and Wambui Wamunyu

O. Obyerodhyambo (*), Daystar University, Athi River, Kenya; e-mail: [email protected]
W. Wamunyu, Aga Khan University, Nairobi, Kenya
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
K. Langmia (ed.), Black Communication in the Age of Disinformation, https://doi.org/10.1007/978-3-031-27696-5_6

Introduction

The Griot’s creation of the Ogre in African folklore is a process of representing an altered reality, a disturbed and disturbing reality—a hyperreality in which things that seem real are unreal, persons are not who or what they say they are, and situations are hyperrealized in order to create a tension within which certain social commentary is made. The alternate existential spheres in which the ogres operate constitute multiple hyperrealities, hyperreality being a sociological term for the blurring between reality and the appearance of reality (Baudrillard, 2010).

The Griot, a consummate community storyteller and the conscience and moral compass of the society, uses representation of the ogre to construct and deconstruct hyperreality and foster a tension within which the well-camouflaged reality is exposed. The dominance of the ogre in African folklore is presented as a temporary, anomalous period that persists only until the rise of the hero who is able to unmask and defeat the ogre and re-establish reality. Ruth Finnegan, in her seminal study of African oral literature, described the way the storyteller conceptualizes reality—what would today be termed hyperreality (Finnegan, 1970). Hyperreality has been used more recently to denote the technological advances and occurrences that proliferate in contemporary society, including artificial intelligence, virtual reality and fake news (Morris, 2020). In this chapter we argue that the idea of hyperreality—discussed later in further detail—has been used by the Griot to raise social awareness and agency. This chapter appropriates the terms ‘ogre’ and ‘hyperreality’ in connection with COVID-19 and the emergent pandemic narratives, respectively. The ogre narrative operates on several layers of reality. In the classic story of Nasikofu1 and her sisters, the comeliness of the most beautiful maiden, Nasikofu, is camouflaged by her disability: she is a hunchback. Her fellow maidens despise her and marginalize her in the competition for an incredibly handsome suitor. Unbeknownst to them, this handsome man is actually an ogre, eager to lure not only the chief’s daughter, the supposed top contender, but all the maidens to his lair so that he can feast on them. The real beauty, Nasikofu, is kept away from the eyes of the guest lest her perceived ugliness dampen the spirits at the betrothal event. The vain, unfair, unintelligent maidens present themselves as the crème de la crème. In their folly the maidens are not able to discern the tell-tale signs that the suitor is a fake and poses a danger to them. Nasikofu notices the tell-tale signs that reveal the fakeness of this suitor, yet her attempts to warn them go unheeded. The young women end up in the lair of the ogre, where Nasikofu, through ingenuity, has joined them. She is eventually able to unmask the ogre, eliminate him and save the maidens; at that moment her disability vanishes and she emerges as the most beautiful, brave and selfless maiden, the personification of ontological perfection in that society.

1. Nasikofu’s story exists in several versions in western Kenya. A popular version has been included in an anthology of folktales by John Osogo titled The Bride Who Wanted a Special Present (1966), Nairobi: Kenya Literature Bureau.

The griot creates layers of hyperrealities in order to engage with the society’s perception of intelligence, beauty, courage, discernment, selflessness and so on. Working from the ontological position of a society in which intelligence is revered above beauty, where intolerance of disability is condemned and discernment supersedes superfluity, the griot uses the tale to create abhorrent hyperrealities: a handsome ogre-suitor, unintelligent maidens who are duped by the ogre’s charms, their helplessness, and the ogre’s lack of intelligence and wit. These elusive qualities are signified through the creative layering of vices, which are then deconstructed by good overcoming evil, intellect and wit overcoming brawn, and the defeat of the senseless consumerism represented by the ogre. Through the act of outwitting the ogre and freeing the maidens (and bringing joy to the community that would have lost its maidens and what they represent), the heroine uncovers the grime under which the real beauty lay. The ogre himself had been fooled by the hyperreality, failing to see that the real beauty was Nasikofu. Villains are felled by heroes, and in this chapter the conceptualization of a griot serves as the metaphorical response to the ogre’s power, embodied in a Kenyan clinician who positively reframes the communication around COVID-19. In this chapter the clinician, referred to as ‘GG’, plays the role of the Griot and engages the hegemonies of the media, the global health, scientific and health-financing systems, and the individual healthcare provider. These systems have created hyperrealities and smoke-screens that delude the ordinary person into believing that healthcare in the wake of COVID-19 was equally available. Taking a constructivist grounded theory approach, this chapter explores emergent communication theory arising from the African cultural context that framed coronavirus-related discourse and communication. The question the chapter grapples with is: how do African voices frame realities and communicate information, especially about health, from their lived experiences? The experiences around COVID-19 are the data upon which a grounded theory is to be founded. Secondly, what is the nature of the hyperrealities that manifest in this context, and how are they simulated? Charmazian grounded theory (Charmaz, 2000, 2006, 2008), and especially its constructivist orientation, holds that a grounded theory research orientation allows the addressing of ‘why’ questions while preserving the complexity of social life (Charmaz, 2006). Health communication research in the African context has been plagued by the application of ‘cookie-cutter’ theories based on studies conducted in the distant past, based on alien experiences in far-away places in the West and by Euro/American scholars (Airhihenbuwa & Obregon, 2000; Betsch et al., 2015).

These imposed theories fail to reflect or adequately explain the co-created reality in different contexts. Individual and community responses to experiences as traumatic as health pandemics elicit unique ways in which communicative experiences are understood and in which realities are constructed and interpreted. This chapter relies on a constructivist grounded theory approach to examine health communication theory within African experiences and the intersectionality between modern Western medicine, traditional African health practices and emerging decolonized African communication scholarship. The chapter examines how health communication is mediated between health professionals and consumers of health communication within this particular context, and interrogates the qualities that made the physician in our study stand out for having gained the credibility and trust of the public in his role as a mediator who makes health information accessible. He became a health griot. In the rankings of the World Bank, Kenya, a Sub-Saharan country, transitioned from a low-income country (LIC) to a lower-middle-income country (LMIC) in 2014. However, multiple social and economic challenges persist, with health featuring among the most severe. Health challenges include high maternal and child mortality, as well as a high burden of infectious diseases including HIV, TB and malaria. The health infrastructure is characterized by limited access to healthcare facilities and a lack of adequate personnel and clinical expertise to address medical needs. According to data collected by the World Bank in June 2022, the ratio of physicians to population stood at 0.1565 per 1,000, or roughly 1 physician for every 6,400 people. The combination of dire health needs and the paucity of institutional capacity to provide services defines the relationship between the Kenyan people and physicians, and therefore influences how a clinician is perceived in Kenya. The chapter adopts a constructivist grounded theory approach, examining online content created by one Kenyan doctor, Githinji Gitahi (known as GG), in response to COVID-19 online content. The emerging data is examined using a qualitative content analysis approach. The content examined comprised purposively sampled, publicly available content he published on various digital media. We approached the data using a multidisciplinary lens, given our backgrounds in health communication, the arts and media studies. The thematic analysis that we applied enabled us to examine GG’s communicative style while incorporating a high level of researcher reflexivity.

In the Charmazian model of grounded theory, it is made clear that the research process itself is a construction of reality. The assumptions inherent in this approach are that there are multiple subjective realities arising from the processing of experiences by the various players constructing the reality (and hyperrealities) under specific conditions. In the case of this study, this occurs in a specific Kenyan context but within the global communicative reality of the pandemic created by digital communication. We also aver that the theory emerges from the researchers’ interaction with the data; in this case the data constituted GG’s communication with his audience and his audience’s responses to him. These are twin layers of interaction, and as researchers we analyse both levels. There is also the assumption that the research process takes into account the researchers’ own position vis-à-vis the research question, acknowledging that the personal perspective of the researcher informs the research process and its outcome (Guba & Lincoln, 1994). The global eruption of the COVID-19 disease unleashed a torrent of information, misinformation and disinformation spread through social, mainstream and other media. The World Health Organization (WHO), while raising an alarm at the rate of proliferation of fake news around the coronavirus, termed the phenomenon an ‘infodemic’ and characterized it as a tsunami (World Health Organization, 2021, p. 64), admitting that misinformation was posing a real test to the pandemic response (Chou et al., 2021). False information abounded in the Kenyan context as well. Kenya’s first recorded case of the coronavirus was reported on 13 March 2020 by the cabinet secretary overseeing the Ministry of Health. It would become the practice of ministry officials to provide statistics (such as on the number of cases recorded, hospitalizations and fatalities), measures to contain the spread (including mask-wearing and frequent washing of hands) and information on testing (such as facilities offering services). According to observers, frequent updates and a concerted media campaign to share COVID-19-related information contributed to an early favourable response to the government’s efforts (Wako, 2020; Walker, 2020). However, in the wake of the emergence of COVID-19, fake news about every aspect of the disease proliferated through physical and digital spaces even as more knowledge and understanding of COVID-19 emerged. In Kenya, part of the spread of the misinformation was attributed to the country’s rapid and extensive adoption of digital technologies, but also to deep levels of mistrust in government authority figures.

There were other hurdles that hindered the Kenyan government’s credibility. These included inconsistencies in messaging, where the general public was expected to adhere to laid-down government guidelines while the political class flouted the very same guidelines without suffering any consequence. There were instances of police brutality, an insufficient supply of medication, protective equipment and vaccines, and a lack of accountability related to COVID-19 resources and data (Ayah, 2020; Center for Policy Impact in Global Health, 2020; Guguyu, 2021; Lewis & Fick, 2021). From the outset, the Kenyan government’s strategy for curbing the spread of the virus included a centralized communicative style from the healthcare system, be it from government officials or medical personnel. While messages with far-reaching implications such as lockdowns, curfews and travel restrictions were announced by the president, it was the cabinet secretary for Health who served as the primary government voice with regard to the pandemic. Even with these efforts to provide information, COVID-19 misconceptions, myths and fake news swirled locally as they did globally. Experience with misinformation has shown that fake news travels faster, further and more broadly on social media than true stories, and that fake news tends to be believed more (Martel et al., 2020). Social media, in particular Facebook and Twitter, are identified as the main source of health information on the virus but at the same time are also fingered as most responsible for spreading fake news (Baptista & Gradim, 2020). There is an argument that in Sub-Saharan Africa the main drivers of fake news are not as distinct as is the case in the global north. The increasing availability of digital media technology has allowed non-journalist individuals to become content creators in a way that has democratized information, while also allowing persons with no sense of communication ethics to communicate freely with millions globally (Mare et al., 2019). Studies of the fake news phenomenon have tried to explicate reasons why people conceptualize and spread fake news, as well as why people believe fake news. Individuals shared fake news because they felt that they might do some good in the process (Chakrabarti et al., 2018, p. 37). Chakrabarti et al. (2018) found that most Kenyans share fake news from a sense of civic duty, not malice. Whereas they might be aware of the possible negative consequence of their sharing, there is a sense of fatalism that overrides the decision, premised on the logic that whereas it might be fake news it could as well be helpful.

Usually, fact-checking is not a priority, and the identity of the source is currency enough to share a piece of information. It was also found that there is a desire to remain socially relevant and to get a ‘scoop on an emerging issue leading to sharing at times even without reading a piece of information’ (Chakrabarti et al., 2018). Jean Baudrillard’s definition of hyperreality is appropriate here: a prevalent inability of consciousness to distinguish reality from a simulation of reality, a condition in which what is real and what is fiction are seamlessly blended together so that there is no clear distinction between where one ends and the other begins (https://www.mlsu.ac.in). It is our considered view that the narratives shared and built during the emergence of the coronavirus created hyperrealities in which the Kenyan populace grappled with a fantastical, deathly enemy against a backdrop of dry, technical, abstract and top-down communication. We theorized that the clinician who is central to our study took on the persona of a ‘griot’, a figure who spoke the language of the street but could walk credibly in the corridors of power while slaying the fears and anxieties which the pandemic had wrought. Our explorations pushed us across the wide spectrum of perspectives within Afrocentricity, allowing us to make sense of the coronavirus beyond the abstract, techno-focused conception of ‘hyperrealities’ to encompass the folklore-infused understanding of the world in which we live.

Hyperrealities

In this chapter hyperrealities include the non-technology-based fictionalization of reality that creates ogres and other abstractions of reality in African folklore, thereby engaging in symbolic representation of reality and cryptic communication. In such communication there is an interface between the real or the actual and what is fictionalized about real phenomena into symbolized communication, calling for one to unravel the symbolism. While deciphering the meaning in the symbolized message is expected, it is also seen as a challenge to which only the wise can rise. The Swahili saying, ‘Fumbo mfumbie mjinga lakini mtu mwerevu atalingamua’ (unravel a riddle or mystery for an idiot, the wise one will understand it himself), captures the mental acuity expected when faced with hyperreality.

In the rendering of ogre stories, the truth or reality is symbolically presented so that the fact and the simulation of the fact are embedded and one is supposed to unravel them. A leading literary scholar and poet with Kenyan roots, Micere Githae Mugo (2020), has written a poem titled ‘Ogre Named COVID-19’, an excerpt of which goes as follows:

Fictionally and allegorically, of all known anti-human forces, none has the lethal power of the demonic, eight-headed monster known as the ogre; his mission to devour and eliminate human life and their world humanity’s mission…
…
Which of my griotic ancestors will possess me anointing me with eloquence and entrancing oracy born of the verbal magic of captivating utterance?
…
Who will restore human sight, teaching people to see wingless angels in differing human form: the sleep-starved ambulance-driver who never lets-go the steering wheel the exhausted doctor, armored in untainted ethics of professional steel.

The poet creates a hyperreality in which the disease caused by the coronavirus is described in the most frightening terms. The oral artist struggles to create an image to paint a reality that will galvanize the world’s population into action. Through the fictional presentation Mugo creates the reality of the devastation of the disease and then proceeds to question the whereabouts of the extraordinary individual ogre-slayer who would defeat the ogre. The image Mugo (2020) conjures is one of a diligent clinician driven by ‘untainted ethics’, a celestial being, wingless angels in differing human form who selflessly go about providing support to those under the threat of the ogre. She paints a world of hyperrealities related to the COVID-19 pandemic. The hyperrealities emerging from the media reinforced the prevalent image among the public of clinicians and other healthcare stakeholders—such as big pharma—as haughty, callous, elitist profit-seekers. This image fitted perfectly with the hyperreal image of the ogre of multiple oral traditions. The ogre in oral tradition transcends nationalities to become a universal motif epitomizing greed, cruelty, disloyalty and duplicitousness. Thompson’s motif index describes motifs as persistent, indivisible and defining narrative elements or story details. Globally, the ogre appears in different forms, but the defining element is that in their greed and cruelty they endear themselves to mankind through pretence, gain trust and then pounce on the unsuspecting victim and devour them.

The ogres are cannibalistic, possess super-human traits and capabilities, use trickery and duplicity, and pretend to be what they are not. Mankind turning the tables on the ogres, defeating them and redeeming themselves through acts of courage or superior wit, closes the circle on the ogre narratives. In the end the ogre stories are about good overcoming evil, just as Nasikofu, introduced in the opening paragraphs, does. Kiarie et al. (2016) argue that in post-structuralist studies, a contextual examination and interpretation of the narratives unlocks the symbolic meanings. The same can be said about the conceptualization of the ogre as an element of hyperrealism. The characterization of COVID-19 as an ogre, and by extension the depiction of the healthcare provision system as callous, unfeeling, exploitative and uncommunicative, creates a hyperreality. In this hyperreality the hegemony of the uncaring healthcare provider and the healthcare system, including big pharma, finds expression within folklore through the image of the ogre. The hyperreality in this context sees big pharma and clinicians as profiteers, keen on making profits amid the suffering of their clients. Various scholars have documented patients being detained or sexually exploited over unpaid hospital bills, a lack of communication, empathy and ethics in patient care among healthcare providers, and unequal power dynamics between doctors and patients (Githui, 2011; Irimu et al., 2014; Katuti, 2018; Manyeti, 2012; Nyongesa, 2016; Wachira et al., 2018; Yates et al., 2017). The ideal caring, empathetic clinician was a hyperreality, a non-existent, imaginary or virtually mythical figure (Githui, 2011, p. 125). In this chapter we focus on the symbolic presentation of the COVID-19 clinician with the understanding that there are levels of hyperreality: the hyperreal image of the white-coat-clad clinician who would supposedly stop at nothing to save lives and nurture the sick back to health, whereas the reality is that he will only do so for profit. During the COVID-19 outbreak there were anecdotes of clinicians turning away patients who needed care, or refusing to tend to patients. In the latter instance, some of those cases arose because the medical personnel did not have Personal Protective Equipment (PPE). Other stories included those of patients dying at the entrance of health facilities while clinicians seemed unmoved (Odula, 2020; Oketch, 2020).

Just like the ogre in folklore, clinicians have presented an ogre-like duality in which benevolence acts as a smoke-screen for a more sinister personality, leading to confusion and frustration, as expressed by Manyeti (2012). Clinicians are notorious for their unintelligible scribbles on prescriptions and for speaking in medical jargon as a way of esotericizing the practice. The clinician lacks empathy towards patients, is callous and casual in demeanour, pursues profit relentlessly, eschews clear and simple communication and displays open disregard and disdain for the suffering of their clients. The clients are unable to unravel the two-facedness of the clinician because it is so seamlessly blended that there is no clear distinction between where one face ends and the other begins. As in the ogre narratives, unravelling the hyperreality then becomes a major challenge.

Methodology

It is in this context that one Kenyan clinician stood out in the course of the pandemic, his messaging and communicative approach distinguished by clarity, candour and an effort to untangle the hyperrealities related to the COVID-19 pandemic. Dr Githinji Gitahi, the clinician, was visible on social and mainstream media platforms, and we studied his visual and textual communication on COVID-19 vaccination as portrayed in purposively sampled tweets, Facebook posts and webinar/media appearances. He is referred to as GG in the rest of the chapter. We purposely focused on his discourse around vaccines because there were several layers at which he was communicating: with global and local anti-vaxxers or vaccine denialists discouraging uptake; with the globalized vaccine procurement process, where he presented a pan-African equity agenda; with the local health system, to encourage policy and the management of vaccine accountability; at the patient/clinician level, where he interacted with people seeking his professional advice; and finally as the ordinary but knowledgeable physician engaging at gut level with the public. GG has multi-faceted professional experience, having worked as a clinician and in leadership roles at media houses, a pharmaceutical company and a hospital group. He is the group chief executive officer of a health organization located in multiple countries across the African continent and serves on multiple bodies, including the governing board of the Africa Centres for Disease Control and Prevention and the Commission/Taskforce on Africa’s COVID-19 response. GG holds social media accounts on Facebook and Twitter, and we qualitatively analysed a purposively sampled selection of his posts.

Through the written discussions we engaged in while reviewing the data, we immersed ourselves in a multidisciplinary range of literature and engaged in extended reflexivity even as we established the themes that emerged from the data. We have studied or worked in the fields of health communication, literature studies and mass media scholarship, and it is in drawing from these areas that we conceptualized COVID-19 as a terrorizing ogre. By taking a pragmatist approach in seeking the best method for our study, we avoided an engagement with a metaphysical focus on the nature of truth, knowledge and reality (Morgan, 2014). Instead, grounded theory enabled us to construct an understanding of the distinctions between GG’s communicative approach and that of the government establishment. Grounded theory is premised on the epistemological position that knowledge is co-created and that there is no objective truth (Glaser & Strauss, 1967; Charmaz, 2000, 2006). In the process of interacting with reality, reality is co-created; as people work to interpret and understand reality, they create it too. In this case, examining GG’s messages, the meanings that the audience derives from them, and our own interpretation of them shows our interpretivist inclination. From the ontological point of view, human interaction is key in grounded theory-inspired methodology, and GG interacts with his audience from his worldview: that of a clinician without airs, premised on a belief that those with knowledge should be of service to mankind—the type called out by Micere Mugo, the poet. So we are looking at the interface of the individual with a worldview that elevates the individual.

Findings

Re-humanizing the Clinician

On his social media accounts, the doctor references both the personal and the professional, describing himself by the schools he has attended, senior positions held in the media and health sectors, or with terms such as ‘father’, ‘husband’ and ‘commissioner Africa COVID-19’.

He is particular about presenting himself as an ordinary person, an ‘ordinarization’ as it were, so that it is clear that he is first a husband and father before he assumes the role of a health professional. Keeping the identities separate, as we shall see later, allows him to challenge and criticize the local and global health system hegemony and hence humanize the medical profession. Concerning his visual persona on the two platforms, he uses photographs of himself with his wife (Facebook) or his mother (Facebook and Twitter) as part of his personal profile. They are informal, relaxed images depicting a husband and son engaged with close family members. In one of the pictures, he is seated with his mother on a blue bench as they lean against a wooden wall. Many of the comments on the image approve of how he highlights his mother, acknowledging it as a mark of his care and respect for her. This is in contrast to the image of the omniscient white-coated, stethoscope-carrying doctor. GG dresses down and appears in videos clad in everyday wear, unlike the typical chief executive officer of one of the largest health NGOs in the region. As a result he does not alienate the clinician from the ordinary patient. Subliminally, his visual persona portrays him as a relatable ‘Everyman’, creating the image of a humane, more accessible clinician as opposed to the closed, distant professional. Yet his self-presentation also encompasses his professional life, as he lists career achievements that include holding senior positions in the worlds of media and healthcare and serving on the boards of Kenyan and continental organizations. These accomplishments serve to reinforce his credibility as an articulate clinician with knowledge and experience, yet he is also careful to show that he is not all-knowing, such as when he refers those questioning him on social media to professional authorities such as the World Health Organization and the Centers for Disease Control and Prevention. In keeping with this people-centric, caring image, GG offers online the kind of consultancy that would ordinarily have been subject to professional fees. In an online video GG explains the diagnostic significance of monitoring low levels of oxygen in geriatric care. He explains that a gadget as simple and easily accessible as a pulse oximeter could provide the early warning signal that an elderly patient is in need of critical care. GG reaffirms his humanity by referencing the deep personal loss felt when a friend loses his mother, which is unlike the image of the distant clinician who shows no emotion.

Here is a feeling clinician, one who empathizes with those affected, thus re-defining the image of the clinician as an objective scientist. However, his credibility as a medical professional also emerges, both in what he posts and in how his audiences respond. For instance, in September 2021 he generated 109 tweets or retweets, many of which addressed COVID-19 vaccination. He decried unequal access to vaccines globally, using terms and hashtags such as ‘Vaccine Apartheid’ and ‘#VaccineEquity’, respectively. He highlighted the arrival of vaccines in Kenya, such as by retweeting Ministry of Health notices and photographs. Twitter users hailed him (his Twitter handle is @daktari1) as a person worth meeting or as a credible source, such as in these two tweets posted on 3 September 2021 and 8 September 2021, respectively:

@daktari1 since covid happened he’s been consistently sharing valuable insights... I admire him. (3 September 2021)

His credibility was further underlined by a Twitter user’s highlighting of a post on voting he made at a youth conference on 8 September 2021:

Vote for leaders who have your best interests at heart - @daktari1 #YouthGenderCOVID2021

Responding to the Global and Local Health System Hegemonies

Globally, narratives that justify the poor health status of the poorer nations are perpetrated and go unchallenged because the hyperreality is that Western-trained clinicians are part of the global commercialized, anti-people health system. GG questions an emergent dominant narrative that stated that, despite vaccines being available, there was reluctance by Africans to get vaccinated. He challenged this position by asserting that there was what he called vaccine apartheid, in which the powerful rich nations, who also hold a monopoly on manufacturing the vaccines, had in an act of excess greed and disregard for the poorer nations acquired and stockpiled all the available vaccines well beyond their needs. GG used his global platform to call out the unethical practice that denied the poorer nations access to the vaccines even after Africa, for instance, had put in place a mechanism for purchasing the vaccines from the manufacturers. In this 14 September post (carried on both Facebook and Twitter), he decried global inequality in vaccine access, using tags and hashtags to link to related organizations/entities or discourse:

So #COVAX has $10b, financially committed 4b doses but has managed 256m (5.6%) Why? High income states booked over 50% doses for 14% of world’s population & occupy front of the manufacturing queue while the rest of the 84% suffer @gavi @WHO @_AfricanUnion #EconomicGenocide (14 September 2021)

This exposed the deep fake messages that characterized Africans as being reluctant to get vaccinated. The same global systems that peddled the myths that Africans were reluctant to take the vaccines had hoarded them and thus made access impossible. GG used his global platform to call out this ignominious act by the Western powers while at the same time actively engaging with the fake news surrounding vaccines. He explained the science behind the manufacturing of the vaccines, including the rapid roll-out of the ribonucleic acid (RNA)-based vaccines, in a manner that the layman could understand. He also addressed fears stoked by the fake news about the vaccines, provided assurance for pregnant and lactating users, and reassured individuals living with illnesses/co-morbidities such as HIV/AIDS and tuberculosis that it was safe to take the vaccine. GG promoted immunization in the most powerful way when he used his aged mother and mother-in-law to educate the public about the vaccine. GG takes a bold step to vouch for the safety and efficacy of the coronavirus vaccine and vaccination by referencing his mother and mother-in-law. Within the context in which he is communicating, GG is cognizant of the gravitas that invoking the names of one’s mother and mother-in-law carries; it is equivalent to swearing by the most sacrosanct item. On both platforms, he has used social media to provide general information about the COVID-19 vaccine while addressing vaccination hesitancy, local and global access to different vaccines, unethical practices and immunization myths. He applies a straightforward sharing of scientific facts in layman’s terms, responds to questions or comments raised by his posts, and plays an advocacy role in promoting vaccination and decrying unfair practices. In a 14 August 2021 tweet, he posted:

Want to know the best #covid19 vaccine? It’s the one in your arm! Get vaccinated.

GG also challenges incorrect information and takes on those who would manipulate people’s fears for personal gain. The doctor does not fail to challenge those peddling deep fakes that went global in providing ‘scientific’ backing to the fake news. The scientific and medical fraternity was equally divided on what was sound science and what was not, opening up another platform of hyperrealities. Those against the vaccine, the anti-vaxxers, had great visibility and caused confusion. The case in point is one surrounding the controversial claims made by Prof Dolores Cahill debunking the pandemic.

GG used his clout and credibility to speak against her claims, as seen in the figure below.

The perspective of a callous, profiteering health system emerged in multiple ways over the course of the COVID-19 pandemic. In Kenya’s case, one particular illustration stood out on social media as a health insurer promoted health cover in the event of one experiencing side effects from COVID-19 vaccination. The exchanges that then ensued on the same day between the doctor, the insurer, the government and social media users exposed the multiple hyperrealities related to COVID-19 vaccination. These hyperrealities included citizens’ prevailing fears about vaccination, the insurer’s presumption and exploitation of vaccination side effects, and the commercialization of healthcare.

On 11 August 2021, the doctor chided the insurer as shown in Fig. 6.1.

Fig. 6.1  Health insurer AAR provides a health cover for possible side effects, which the doctor views as a callous profit-driven effort to play on patient fears

The doctor’s pointed questioning generated a response from the insurer, which posted that its cover was ‘meant to cover Kenyans in the UNLIKELY (their emphasis) event of adverse reactions’. The doctor then responded by asking the insurer this question:

And is it true that you don’t cover for COVID illness? Do you cover your members for these side effects you want to sell cover for to others? Could you please share a list of these side effects and how you will determine they are COVID vaccine? What’s your criteria? (11 August 2021)

The insurer did not respond but a Twitter user did, saying:

[The insurer] does not cover COVID19. I asked them last year with several reminders and they were conveniently mum. (11 August 2021)

The questions raised by the doctor generated a variety of posts from other Twitter users, one of whom appreciated GG’s holding of the insurer to account. Beyond this, there eventually emerged an 11 August 2021 government response, which stated that:

insurance against side effects is not only unnecessary but unethical. No company should claim to provide insurance services against vaccine side effects or side effects of any medicine.

The Ministry of Health’s tweet was posted on 11 August 2021 at 5:52 p.m. At 7:28 p.m. the same day, the insurer posted a tweet acknowledging the concerns and indicating they were consulting with the relevant authority.

Applying a Culturally Competent Communicative Style

In his multiple interactions on social and mainstream media, the doctor uses a conversational, understandable style, culturally relevant symbols, and relatable analogies. In one 30-minute Facebook Live event, he spoke in his vernacular language (https://www.facebook.com/daniel.n.wachira.7/videos/3275783015975036) to discuss the pandemic and respond to questions raised. The event generated more than 2000 views, 40 comments and 62 likes. The interviewer mentions in his introduction that there was a lack of trust among the citizenry concerning the COVID-19 information shared by the government. The 25-minute discussion that ensued covered vaccine hesitancy, receiving of different vaccine brands, vaccinations for pregnant or breastfeeding mothers, and vaccinations for those with co-morbidities including diabetes or for those who had contracted COVID-19. The listeners posted questions in the comment section, which the host raised in the course of the conversation.

In his responses, the doctor used a blend of scientific information and commonly understood phenomena, such as the value of wearing a seatbelt in a car or a helmet on a motorcycle (boda boda), or children being vaccinated against measles. Without condescending to his audience, he acknowledged vaccine hesitancy and attributed it to multiple reasons, including the fear of being infected and concerns about vaccinating pregnant or breastfeeding mothers or individuals with other illnesses. He described the vaccine as a message (ndũmĩrĩri) that comes to tell the body about the disease, helping the body generate antibodies, which will not harm unborn children, babies or those with other illnesses. In responding to concerns about individuals receiving different vaccine brands, he likened it to one knowing they needed to wear a helmet while riding a motorcycle—boda boda—but being more concerned about the brand of helmet rather than valuing the work of any helmet. He also cautioned against over-vaccinating, since those who would want multiple jabs even after having the requisite doses would leave the vulnerable without access to vaccines. In taking a grounded theory approach, we centred COVID-19 myth-making, which we saw as a process of seeking to interpret and explain the pandemic as a lived phenomenon that defied current knowledge or understanding. The study of the doctor’s visual and textual discourse enabled the exposure of a health communication approach that was more ‘griot’ than ‘ogre’, with the clinician in our study possessing multiple-sphere credibility among peers and within professional and socio-cultural settings.

Who Is a Griot?

Hale (1997) goes to great lengths describing the origin of the term ‘griot’, especially within the West African poetic tradition. One relevant feature of the griot is their ability to speak truth to power, their ability to articulate the thoughts and views of the voiceless. Griots have multiple roles, among which are as historians, genealogists, advisors, spokespersons, diplomats, interpreters, musicians, composers, poets, teachers, exhorters, town-criers, reporters and masters of or contributors to a variety of ceremonies (Hale, 1997). Within the African tradition the griot is a storyteller, the purveyor of truth and facts, and has the responsibility of correcting any factual or perception error. The griot is a fearless interlocutor, a representative of the popular voice. The griot is allowed the liberty of contradicting the official version of events from the leadership.

The griot is the one to call out hyperrealities and correct fake news. An image of the clinician as an unfeeling, profiteering, exploitative, authoritarian bully forms part of a hyperreality which presents as a ‘deep fake’. In this context, the ‘deep fake’ is not a manipulated digital image but rather more like the transformed image of the ogre, who presents in African folklore as a handsome young suitor. Once the nubile damsel is fooled, the ogre reverts to his fiendish self and consumes her. GG re-casts the hyperreality of the clinician from the messianic miracle worker of the television series into an ordinary human being who cares, feels and hurts when bereaved. He separates the professional from the individual to emphasize that being a clinician is part of a role that does not subsume one’s identity. He is not the omniscient, all-powerful life-saver but empowers individuals to play a role in their healthcare. He empowers by providing relevant information and by pushing for a demand-driven health supply system. He acknowledges that the client or patient is better placed to tell what they need. This is a departure from the clinician who prefers the submissive, non-questioning client. In an African context where the ‘white coat’ syndrome exemplifies clinical communication hegemony, GG demystifies the traditional image of a clinician. He counters vaccine myths and related fake news through posts and talks on social and mainstream media platforms, simplifies clinical information and shares highly personalized, culturally embedded, myth-busting mediated perspectives in which he re-casts health communication per se as well as the role of the clinician in it. He attempts to break the curse of the man-ogre and seeks to have clinicians revert to the handsome suitor. He challenges the clinicians’ privileged position that has been used to commercialize the practice of medicine, and he challenges insurers and the tendency to place profit above lives. He directly questions the morality and altruism of the global health supply system, where poorer nations are condemned to ill-health while the richer nations, though posing as morally and ethically beyond reproach, actually perpetuate a discriminatory practice that systematically denies the poor the benefit of scientific advancement. By taking on the global health system, GG went against the hyperreality that positions clinicians as beholden to big pharma and international health bodies. He also took umbrage at the philosophical orientation of the local health delivery system, which is supply-driven as opposed to demand-driven. At the philosophical plane, the supply-driven thinking is driven by the notion that the client does not know what they need or want, and thus the hegemonic health system has to make decisions for them.

This underscores the power imbalance and the paternalistic attitude that has been described as typical of clinicians and the health system. Even as he criticizes the local health system, GG avoids falling into the hyperreality of modern educated African intellectuals who dismiss anything local as lacking merit. GG acknowledged that Kenya had mounted a scientifically guided response to the pandemic and went on to explain the government’s lockdown strategy in a manner that was not dismissive but based on the epidemiology. He said that the science had shown that wearing a mask, avoiding unnecessary travel outside the home and washing hands were ways to prevent being infected. But he also acknowledged the economic circumstances: those who needed to leave home to sell vegetables or catch fish for their livelihoods would need to do so. His advice to the listeners was to trust in God, to believe that the science guiding the prevention and care of COVID-19 was God-given, to ensure they and their close family members were vaccinated, and to not be afraid. The combination of common-sense advice, acknowledgement of his community’s widespread reliance on God and religious faith, use of relatable analogies, and a reassuring posture and comments served to portray an accessible, credible expert. The host of the Facebook Live event, a church minister, added to GG’s credibility by affirming GG’s professional status as a chief executive officer with Amref and referring to the interview GG had just finished with a Nigerian media outlet. The model of effective health communication typified by GG revolves around his addressing the hyperrealities that surround an unusual event, such as the COVID-19 pandemic. He does so in three ways: by demystifying science, challenging the profiteering, exploitative establishment, and using a democratic space. GG demystified science by confirming his credibility as a medical professional yet building trust among the public by making himself ordinary and on a par with the average citizen. His persona was that of a caring son or husband who spoke in the register of the ‘Everyday’ person, enabling a visceral connection with audiences and a convincing, persuasive delivery of technical health information. He challenged the established hegemonies, such as by directly taking on an insurer’s attempt to profit from worries about vaccine side effects, and effectively marshalling government support against it.

He showed a genuine concern for the people and challenged the hegemony of health communication and the authoritarian, abrasive discourse emanating from corporations and those holding positions of authority. By communicating not only in spaces where he was invited as a medical doctor but also on his own social media pages, he enabled the sharing of health messages and the busting of myths in a non-hierarchical, non-linear persuasive style that allowed peer-to-peer interaction with his audiences. The persevering, quintessential image of the clinician in Africa is that of Albert Schweitzer (1875–1965), a medical missionary. In a piece commemorating his 141st birthday in 2016, Dr Howard Markel (2016) noted that in the recent past the mere mention of the name Schweitzer instantly conjured up images of selflessness, heroism and the very model of a modern, humane physician. In 1952 Albert Schweitzer won the Nobel Peace Prize. He set up a medical camp that grew into a compound comprising 70 buildings, 350 beds and a leper colony for 200 by the time of his death at age 90 in 1965. Though there has been criticism arising from his racist remarks, he presents the ‘messianic’ image of the modern-day clinician. This image has been further cemented through the media, especially television, through medical and hospital series such as M*A*S*H, Grey’s Anatomy, House, Scrubs, Chicago Med and New Amsterdam. These hospital dramas centre around the daily challenges of extraordinary, super-human clinicians performing miraculous life-saving feats that end with a cathartic re-affirmation of their power over mortality. The media reinforce the power difference between the patient and the clinician, emphasizing the altruism of the clinician and the ‘doctor knows best’ attitude that depicts second-guessing or questioning the clinician as grossly ill-advised. The dramas show the archetypal struggle against commercialization and the hegemony of profit-crazy big pharma, whose interest is simply to make as much money as possible. In these TV dramas the clinicians single-handedly unmask and defeat the commercial interests and, in a re-creation of the super-hero, save the patients from exploitation. Clinicians are presented as objective, science-driven and focused individuals who would stop at nothing to use their knowledge in the service of mankind. When clinicians come across as uncaring, it is because they uncompromisingly believe in science über alles, to the extent of depersonalizing themselves. Just like the ogre in folklore, clinicians present an ogre-like duality in which benevolence acts as a smoke-screen for a more sinister personality that leads to confusion and frustration, as expressed above by Manyeti (2012).

In a typical Schweitzerian dilemma, as in the ogre narratives, unravelling this hyperreality is a major challenge. GG’s persona, his personal and professional credibility, and his persistent messaging around COVID-19 counter the hyperrealities that emerged in the wake of the pandemic, confront the myth of the clinician as an all-knowing, distant figure and present a tantalizing picture of the clinician not as an ‘ogre’ enemy, but as a ‘griot’ ally, a peer of the people, yet a speaker of truth to established hegemonies.

Conclusion

In this chapter GG plays the role of the Griot and engages the hegemonies of the media, the global health, scientific and health-financing systems, as well as the individual healthcare provider. These systems have created hyperrealities and smoke-screens that delude the ordinary person into believing that healthcare in the wake of COVID-19 was equally availed. GG challenges the media infodemic that was communicating fake news about the coronavirus and the global news organizations that are partisan and perpetuate a hyperreal narrative of a global response while conditioning the mind to be more receptive to the US-versus-China narrative. GG challenges the inequity within the global health systems in the acquisition of vaccines, where the hyperreality of equality and concern for the health of all is but a facade. He challenges global pharma, which develops vaccines and medicines driven by the profit motive but couched in the hyperreality of ethical behaviour and a desire to save lives rather than profit. The individual national governments create a hyperreality of concern while creating corruption conduits that led to the emergence of COVID billionaires. Medical practitioners, despite their Hippocratic oath, exist within a hyperreal relationship driven by the profit motive and not healing. The image of the avaricious ogre gobbling up everything fits the commercialization of what should have been handled as a humanitarian crisis. In waging war against hyperrealities in the cyber sphere, GG served as a griot who spun an alternative narrative, one in which the Nasikofus of the world fought to recalibrate a hyperreality in which the ogre was dominant.

The monstrosity of the COVID-19 ‘Ogre’ was unmasked and its power diminished through GG’s tweets, Facebook posts, webinars and public pronouncements. Using his online presence, GG challenged the COVID-19 hyperrealities in order to claw back and re-establish reality.

References

Airhihenbuwa, C. O., & Obregon, R. (2000). A Critical Assessment of Theories/Models Used in Health Communication for HIV/AIDS. Journal of Health Communication: International Perspectives, 5(1), 5–15. https://doi.org/10.1080/10810730050019528
Baudrillard, J. (2010). Simulacra and Simulations (1981). In C. Greer (Ed.), Crime and Media: A Reader (1st ed.). Routledge. https://doi.org/10.4324/9780367809195
Betsch, C., Bohm, R., Airhihenbuwa, C. O., Butler, R., Chapman, G. B., Haase, N., et al. (2015). Improving Medical Decision Making and Health Promotion through Culture-Sensitive Health Communication: An Agenda for Science and Practice. Medical Decision Making: An International Journal of the Society for Medical Decision Making, 36(7), 811–833. https://doi.org/10.1177/0272989X15600434
Chakrabarti, S., Rooney, C., & Kweon, M. (2018). Verification, Duty, Credibility: Fake News and Ordinary Citizens in Kenya and Nigeria: A Comparative Study. Beyond Fake News. BBC. Retrieved from https://downloads.bbc.co.uk/mediacentre/bbc-fake-news-research-paper-nigeria-kenya.pdf
Charmaz, K. (2000). Grounded Theory: Objectivist and Constructivist Methods. In N. K. Denzin & Y. Lincoln (Eds.), The Handbook of Qualitative Research. SAGE Publications, Inc.
Charmaz, K. (2006). Constructing Grounded Theory: A Practical Guide Through Qualitative Analysis. SAGE Publications Ltd.
Charmaz, K. (2008). Constructionism and the Grounded Theory. In J. A. Holstein & J. F. Gubrium (Eds.), Handbook of Constructionist Research (pp. 397–412). The Guilford Press.
Chou, W.-Y. S., Gaysynsky, A., & Vanderpool, R. C. (2021). The COVID-19 Misinfodemic: Moving Beyond Fact-Checking. Health Education & Behavior, 48(1), 9–13. https://doi.org/10.1177/1090198120980675
Finnegan, R. (1970). Oral Literature in Africa. Oxford University Press.
Githui, D. (2011). Ethical Issues in Health Care in Kenya: A Critical Analysis of Healthcare Stakeholders. Research Journal of Finance and Accounting, 2(3), 1.
Glaser, B. G., & Strauss, A. L. (1967). The Discovery of Grounded Theory: Strategies for Qualitative Research. Aldine Transaction, New Brunswick.

Guba, E. G., & Lincoln, Y. S. (1994). Competing Paradigms in Qualitative Research. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of Qualitative Research (pp. 105–117). SAGE.
Hale, T. A. (1997). From the Griot of Roots to the Roots of Griot: A New Look at the Origins of a Controversial African Term for Bard. Oral Tradition, 12(2), 249–278.
Irimu, G. W., Greene, A., Gathaara, D., Kihara, H., Maina, C., Mbori-Ngacha, D., et al. (2014). Factors Influencing Performance of Health Workers in the Management of Seriously Sick Children at a Kenyan Tertiary Hospital: Participatory Action Research. BMC Health Services Research, 14(59), 1.
Katuti, C. S. (2018). Patient Level of Satisfaction with Perceived Health Service Quality in Nyandarua County Referral Hospital. MSc Thesis in Health Management, Kenyatta University, School of Public Health.
Kiarie, M. K., Onyango, J. O., & Goro, N. K. (2016). Analysis of Themes and Stylistic Devices in Gikuyu Oral Ogre Narratives. Scholars Journal of Arts, Humanities and Social Sciences, 4(10), 1226–1232. https://doi.org/10.21276/sjahss.2016.4.10.2
Manyeti, O. N. (2012). Client Satisfaction with HIV/AIDS Care Services Offered at the Comprehensive Care Center, Machakos District Hospital, Kenya. MPH Thesis, Kenyatta University, School of Health Sciences.
Mare, A., Mabweazara, H. M., & Moyo, D. (2019). “Fake News” and Cyber Propaganda in Sub-Saharan Africa: Recentering the Research Agenda. African Journalism Studies, 40(4), 1–12. https://doi.org/10.1080/23743670.2020.1788295
Markel, H. (2016). Dr. Albert Schweitzer: A Renowned Medical Missionary with a Complicated History. PBS NewsHour. Retrieved from pbs.org
Martel, C., Pennycook, G., & Rand, D. G. (2020). Reliance on Emotion Promotes Belief in Fake News. Cognitive Research: Principles and Implications. Retrieved from https://doi.org/10.1186/s41235-00252-3
Morris, J. (2020). Simulacra in the Age of Social Media: Baudrillard as the Prophet of Fake News. Journal of Communication Inquiry, 45(4), 319–336.
Mugo, M. G. (2020). Ogre Named COVID-19. Ukombozi Review, 24 June 2020. Syracuse, New York.
Nyongesa, M. W. (2016). Client Perception on Quality of Healthcare Offered to In-patients in Public and Faith-based Hospitals in Kiambu and Nairobi Counties, in Kenya: A Comparative Study. PhD Thesis, Maseno University, School of Public Health and Community Development, Department of Public Health.
Wachira, J., Gengburg, B., Kafu, C., Koech, B., Akinyi, J., Owino, R. K., et al. (2018). Journal of Health Communications, 23(6), 591–596. https://doi.org/10.1080/10810730.2018.493061

6  THE OGRE AND THE GRIOT: CULTURALLY EMBEDDED COMMUNICATIVE… 

129

World Health Organization (WHO). (2021). Strategic Response to COVID-19 in the WHO African Region. World Health Organization. Yates, R., Brookes, T., & Whitaker, E. (December 2017). Hospital Detentions for Non-payment of Fees: A Denial of Rights and Dignity. A Research Paper. Centre on Global Health Security. Chatham House. The Royal Institute of International Affairs.

CHAPTER 7

“The Medium is the Massage/Message”: Functions of Synthetic Media in Sense-­Making Conditions Jean Claude Kwitonda and Symone Campbell

Introduction
Despite its prophetic insights into the vagaries of new media and postmodern technologies, the "medium is the message" was regarded as an overstated pronouncement when it was made six decades ago by the Canadian academic Marshall McLuhan. At the time, the proclamation sought to call attention to a narrow focus on the "content" of media—at the expense of the great potentiality of the medium. Specifically, McLuhan (1960) sought to emphasize that "it is the medium that shapes and controls the scale and form of human association and action" (p. 108). McLuhan further repurposed his "medium is the message" metaphor to underscore the unique power of technology in manipulating/massaging consciousness and view of the world, hence the variant "the medium is the massage," particularly in the mass media age.

The above metaphorical variations allowed McLuhan to elucidate the dominance of the medium and advance his theory of technological determinism. McLuhan supported these postulations by showing how changes in media technologies in different eras shaped human perception, thoughts, and behaviors. Despite these efforts, McLuhan's theory of technological determinism was criticized for being rather hyperbolic and overly deterministic. Critics argue that his theory overestimated the power of technology, neglecting the capacity of audiences and social structures to exert a great deal of influence on media technologies. Nevertheless, the advent and volatility of new digital media, such as synthetic media, attest to the insightful and prophetic nature of McLuhan's popular representation that "the medium is the message/massage." For instance, McLuhan's theory of technological determinism influenced postmodern media studies scholars (e.g., see Baudrillard, 1983). This postmodern movement has in turn heralded the "infocalyptic" nature of synthetic media products (Fallis, 2021), whose blizzard is now giving rise to a global state of hyperrealism, increasing the difficulty of making sense of what is "real" and what is not. This is because in such a mediated world, raw facts are not objectively presented, but rather simulated (Baudrillard, 1983). There are several other aggravating ways in which synthetic media prove to be a vexing iteration of postmodern technologies. First, deepfakes are synthetic media that deploy machine learning techniques to create hyper-realistic audio-visual materials in which individuals, mostly politicians and celebrities, are stating or doing things that are distorted or fabricated (Floridi, 2018). Additionally, research demonstrates that people systematically overestimate their ability to detect deepfakes, particularly in videos (Fallis, 2021). Lastly, although there are efforts to fight misinformation and fake news through strategies such as fact-checking and media literacy training, deepfake technology has caught society in a double bind. For example, while the potential for fabricating or distorting events is not irrelevant, the greater danger may actually be that deepfakes can also make people disbelieve the truth—given the power of deepfakes in creating very believable products. This chapter brings together theoretical and empirical literature to identify functions of synthetic media in sense-making conditions. Specifically, the chapter will pay close attention to the reciprocal influence between media work (e.g., by individuals and/or organizations) and

macro-level dynamics (e.g., political and cultural environments) in which synthetic media operate. The functions of deepfakes will be explored by examining the following questions: Why are synthetic media produced and how are they processed by audiences? What lessons do we learn from such functions? Using recent global public health experiences as illustrative case studies of sense-making, this chapter begins by demonstrating motives behind synthetic media production and various uses of such media by sense-making publics. Ultimately, the purpose of examining functions of synthetic media is to make some recommendations for managing information needs in the post-truth era in which public health is becoming increasingly politicized.
Functions of Deepfakes
Manipulation and generation of visual and audio materials are not new practices. In fact, montage has had some positive functions in a variety of industries, such as film and entertainment, educational media, video games, health communication, advertising, and electronic commerce (Westerlund, 2019). An illustrative case of positive use of deepfake technology is the 2019 global malaria awareness campaign that used visual and voice-altering technology to show David Beckham conveying malaria messages in nine languages (Brown, 2019). However, deepfakes are mostly conspiratorial, particularly when society is trying to make sense of disruptions in social norms. Effectively, social anxiety-inducing phenomena and media representation collude in specific ways. Media and social representation scholars have demonstrated that when social phenomena such as new infectious diseases (e.g., COVID-19, Zika, Ebola, SARS, avian influenza, and the H1N1 influenza pandemic) emerge, they are initially widely circulated by media (Moscovici, 1961; Wagner-Egger et al., 2011). However, this line of research argues that new information is dominated by abstract, probabilistic terms from scientists. As such, lay publics are confronted not only with unfamiliar diseases, but also with inaccessible discourse that is used to explain them. Thus, the production of deepfakes is partly motivated by audience information processing and coping needs as the emerging threat and related discourse require attention, sense-making, and individual and collective action. Before the advent of synthetic media technology, digital art functioned as a tool that assisted audiences with information-processing needs through the potential of manipulation software. Deepfake technology has evolved from this digital manipulation potential and now presents a double bind. On one hand, digital technologies may help lay audiences in information processing; on the other hand, they tend to manipulate perceptions in conspiratorial

ways. The social representation perspective (Moscovici, 1961) explains how such media technology may help lay publics grapple with abrupt and unfamiliar events that require immediate social change and how the new and threatening situations are transformed into familiar and less threatening situations. When confronted with a new phenomenon, lay publics must develop a working understanding by transforming abstract, probabilistic risk science into (lay) working knowledge (Beck, 1992; Giddens, 1990; Moscovici, 1961). Conspiracy ideation is partly a byproduct of the clash between and/or mixing of prevailing social realities, scientific and lay knowledge. According to Moscovici (1961), this kind of hybrid knowledge is a result of two main processes, anchoring and objectification. Anchoring is related to the process of grappling with the ambiguity and unfamiliarity of new phenomena. The latter process may involve naming the unfamiliar event with the purpose of relating the event to existing or familiar systems of meanings. By naming the new phenomenon, it becomes relatively recognizable and less threatening. For example, when they emerge, global public health pandemics are often named after the time and/or places they were first discovered (e.g., Zika, COVID-19). This kind of representation may sometimes involve conspiracy theorizations that seek to name a culprit who must be blamed for the outbreak (Joffe, 1999; Wagner-Egger et al., 2011). Akin to Stanley Cohen’s (1973) interdependent concepts of “folk devils” and “moral panics,” the anchoring phenomenon demonstrates the current reciprocal influence between public health, political ideology, and mistrust in scientific institutions (Pfattheicher & Schindler, 2016). Objectification concretizes new phenomena. Metaphors, concrete or visual imagery, are the primary means of representing abstract scientific content. For example, COVID-19 has become a familiar brand, thanks to its representation as a round entity with protruding spikes. According to a Michigan University blog (n.d.), the visual imagery explains abstract scientific facts about the nature of the virus: Those spikes on the outside are really there and they’re what gave the virus its name—‘corona’ comes from the word crown in Latin, and those spikes bind to the receptor on a cell and allow the virus to enter it. Like each image that BioArtography produces, the coronavirus’ spiky ball tells a story, and that story has become an important visual cue about public health and

safety. In the image, they added some orange and yellow-colored dots on the surface, which represent the myriad of proteins that the virus encodes.

Anchoring and objectification function as processes that increase public understanding of abstract and technical knowledge—by transforming it into common-sense knowledge. The reconstruction of scientific knowledge provides lay publics with a new code of conduct (Moscovici, 1961) and familiar ways of making sense of expert knowledge publicized in popular and elite media. It is in this sense-making context that deepfake content emerges and gains traction by two variables interacting. First, expert knowledge is almost by default perceived as pedantic because it tends to use language characterized by ambiguity or uncertainty (Franks et al., 2013). Second, according to Hall’s (1980) cultural model of media sense-making, media messages are decoded ideologically. This interaction is very important for understanding the processes underlying motives behind the production and spread of deepfakes.

Production, Spread, and Processing of Deepfakes
Researchers suggest that manipulation of social perceptions creates divergent thinking by resonating with the human inclination to seek closure or accuracy. These needs are compounded during times of uncertainty or when faced with divergent ways of interpreting a new phenomenon (Leman & Cinnirella, 2013; Marchlewska et al., 2018; van Prooijen & Jostmann, 2013). The tendency to seek closure and accuracy intensifies when there is no clear official explanation or solution to a phenomenon that disrupts or threatens public well-being. For example, when new infectious diseases emerge, scientific experts provide explanations regarding the causes and/or treatment. Those explanations are then adopted and transformed by lay publics. However, lay publics often lack the cognitive tools that resonate with probabilistic and jargon-laden scientific explanations and thus fail to find accuracy and closure in the process. The latter gap gives rise to alternative, often conspiracist, frames that diverge from scientific consensus (Franks et al., 2013). Social motives have also been cited as plausible explanations for the production and spread of doctored and conspiracist information. For example, researchers (Brotherton & Eser, 2015) have identified significant associations between feelings of boredom and proneness to conspiracist

ideation. The latter motive is relevant to the development and spread of conspiracy theories because of the psychological need to appear special to others, particularly when one can share information that seems unique or rare (Imhoff & Lamberty, 2017; Lantian et al., 2016). Social/self-esteem needs can thus stoke conspiracist ideation or rejection of dominant and scientific accounts. However, the social and self-esteem needs may make marginalized communities seem more prone to conspiracist ideation allegedly because they engage in conspiracies simply to feel unique to others, subvert accounts from groups perceived as elitist, or because such communities are lacking in rational thinking as some research suggests (e.g., Mikuškova, 2017; Ståhl & van Prooijen, 2018; Swami et al., 2014). While marginalization, lack of analytical thinking, or low levels of education may lead to systematic errors of judgment, such deficit-based accounts may run the risk of obscuring social-political contexts that provide plausible explanations for real conspiracies (see Briggs, 2004; Davis et al., 2018; Learning Networks for Countries in Transition, 2020; Nattrass, 2013; Thomas & Quinn, 1991). The contemporary global culture characterized by informal networks of global elites invokes the creative and mutating nature of old problems of global and local governance (Basham, 2003; Kwitonda, 2017a; Singh, 2016) that may justify mistrust in official accounts. In authoritarian cultures, responses to public health emergencies may be permeated by conspiracist ideation due to mistrust in government (Kim & Cao, 2016). In particular, many governments in the global south tend to have authoritarian ways of relating to citizenry. Paradoxically, authoritarian governments may be effective at imposing strict public health measures (e.g., lockdowns), but this may sometimes come at the expense of basic human rights. In such scenarios, conspiracist ideation may be understood as a safe strategy for expressing dissent or a desperate attempt to hold local and extra-local governments accountable (Basham, 2003; Dentith, 2016a, 2016b; Dentith & Orr, 2017). COVID-19 has already proven that trust in public officials and networks of global elites is key to acceptance of behavioral change and uptake of vaccines in the African context. For example, a report by Learning Networks for Countries in Transition (2020) demonstrates why conspiracist beliefs should not be dismissed wholesale: After a European doctor on foreign television proposed testing of the vaccine in Africa, a rumor started that a purported vaccine for COVID-19,

which would spread the virus, was being tested on the population in Cote D’Ivoire. Anti-vaccine movements also took advantage of this fear and distrust to spread rumors, including that vaccines are a money-making scam by manufacturers and vaccine funding organizations. These rumors resulted in a call for vaccinations offered in health centers to be refused. A telephone poll conducted among 55,291 respondents in mid-April showed that half the population were planning to discontinue vaccinating their children either because of the rumors circulating or because they do not believe in vaccination. Consequently, Cote D’Ivoire saw decreased attendance at vaccination centers and increased vaccine refusal, leading to lower immunization coverage overall.

Another notable motive for divergent thinking is the desire to preserve or affirm one's political worldview, ideology, and/or identity (Miller, Saunders, & Farhart, 2016). Research conducted in different countries in Africa demonstrates the need to protect deeply held values such as religious worldviews (Bristol & Millard, 2016; Hird & Linton, 2016; Nasir et al., 2014). This may be one of the key determinants of motivated reasoning and avoidance of cognitive dissonance among religious leaders. Another relevant example has been observed in the USA and its attempts to manage the COVID-19 virus. Because it coincided with an election year in the USA, COVID-19 has demonstrated that certain public health behaviors (e.g., attitudes toward wearing face masks or adhering to recommendations provided by scientific experts) are often filtered through worldview-affirming beliefs (e.g., conservatives vs. liberals).

Managing Health Information Needs in the Era of Deepfakes
New digital technologies have increasingly been linked to the spread of conspiracist information (Bessi et al., 2014; Bessi et al., 2015; Southwell et al., 2018). Moreover, some social media platforms (e.g., WhatsApp, which has been characterized as a "one-stop shop" because of the affordability, anonymity, and social closeness it offers subscribers; Bowles et al., 2020) are highly prone to deepfakes because they are not easily amenable to traditional fact-checking mechanisms (Del Vicario et al., 2016; Zarocostas, 2020). While such new platforms may function as vehicles for deepfake content, research suggests that they can also support information management during public health crises. For example, research conducted

among 27,000 WhatsApp subscribers in Zimbabwe shows that the application can be harnessed to fight misinformation about the COVID-19 virus (Bowles et al., 2020). Building on the latter optimistic view of new media as well as an understanding of conspiracist ideation as collective sense-making, we now turn to the role of the new media in managing conspiracist information in contexts of post-truth public health communication. First, information providers must keep in mind that the emergence of new communicable diseases will naturally engender anxiety and uncertainty prompting people to seek, avoid or reappraise information received via official and mediated channels of communication (Afifi & Weiner, 2004; Babrow, 2001; Brashers, 2001). As uncertainty management and social representation research suggests, anxiety will follow naturally “when details of the situation are ambiguous, complex, unpredictable, or probabilistic; when information is unavailable or inconsistent; and when people feel insecure in their own state of knowledge or the state of knowledge in general” (Brashers, 2001, p. 478). As crucial information managers, new media and public health officials must therefore provide efficacy components in the provision of information. In other words, the process of sense-making does not only entail assessing the information provided but also the reappraisal of outcomes and expectancies (Babrow & Kwitonda, 2020) associated with acting upon the information. With regard to efficacy assessments, the questions to keep in mind (particularly in low-­ income or non-democratic contexts) may include the relationship between prevailing mistrust in government, economic hardships, and human rights abuses because this backdrop often influences perceptions of government mandates (e.g., impositions of lockdowns or other public health measures). That is, efficacy assessments are necessarily anchored in ongoing socio-economic inequities as well as real conspiracies in oppressive and centralized polities. For example, the fight against global terrorism has proven to be a double-edged sword used by autocratic regimes to join the fight against international terrorism on one hand and multilateral conspiracies in the suppression of political dissidents summarily labeled as “terrorists” on the other (Kwitonda, 2017b). The second proposition is derived from anchoring and objectification as tenets of social representation that present both a challenge and opportunity for scientists, media, and other information managers in their efforts of translating and conveying technical health information. The main challenge is that anchoring and objectifying technical information invokes the

well-known disconnect between scientific communities and lay publics (Getman et al., 2017). The latter researchers indicate that lay publics are often alienated from scientific information because scientists are trained and thus constrained by conventions of their secluded scientific communities. This, in turn, alienates lay publics who view information from scientific experts as opaque and elitist, encouraging them to turn to unverified, crowd-based sources of information such as Wikipedia, which is edited by lay people and thus capable of providing relatable but not necessarily reliable information. Research on the influence of digital networks on vaccine hesitancy (e.g., Getman et al., 2017) indicates that the hierarchical mode of scientific information production and dissemination is often ineffective in mitigating anti-vaccination sentiments in online communities. Getman et al. (2017) go further to conclude that "the most effective usage of this mode of authority online is by the vaccine-hesitant community itself to enforce their vaccine-hesitant narrative" (p. 604). Media managers may therefore seek to translate and present scientific information according to key premises of social representation (i.e., anchoring and objectification). Although infrequently tapped, BioArtography is a viable avenue for reaching lay publics (see Michigan University blog, n.d.). This is encouraging because new media are particularly attuned to such visualization of abstract and complex scientific information. Third, given that conspiracist motives may stem from protecting or affirming cherished worldviews, media and information managers should avoid correcting misinformation by directly challenging audiences' worldviews, as doing so may have the opposite effect. The latter is often referred to as the backfire effect, characterized by a strong resistance against empirical, objective, and factual evidence presented to correct misinformation that is laden with ideological and partisan overtones (Berinsky, 2017; Nyhan et al., 2013; Nyhan & Reifler, 2010; Schaffner & Roche, 2017). This line of research is particularly significant for understanding the implications of partisan information processing in the "fake news era," a phrase popularized on a global scale by the 45th American President Donald Trump (McIntyre, 2018), or the consequences of paternalistic communication in the wake of maladaptive responses to Ebola communication in West Africa (DuBois et al., 2015; Toppenberg-Pejcic et al., 2018). Most importantly and consistent with social representation principles, fact-checkers and misinformation correctors must be mindful of the ubiquity of community deficit models of communication (Kwitonda & Singhal, 2018) and partner with scientific experts and community gatekeepers in

order to affirm the self-worth of information recipients and work toward the earlier-described goals of anchoring and objectifying novel and/or abstract information by graphical means (Lewandowsky et al., 2017; Nyhan & Reifler, 2018; Seaton et al., 2020). A final implication derived from the social representation perspective is borne out of the interdependence between mass media and interpersonal communication. As research on the role of technology in facilitating community engagement in the fight against Ebola in West Africa shows, "no form of engagement was more effective than face-to-face discussion, and there are no technological short-cuts for safe burial and body management" (Toppenberg-Pejcic et al., 2018, p. 443). The synergy between mass media and interpersonal communication has long been recognized by the social representation perspective because: it offers a unique framework for the social scientific study of how groups of people communicate about and make sense of emerging or novel phenomena (the social element in "social representations"), as covered by the mass media, and how these resulting frames ("representations") shape the subsequent behaviors of individuals who belong to these social groups. (Morgan, 2009, p. 30)

The importance of digital networks and interpersonal communication is further supported by the increasing reach of global media as well as patterns of information sharing between members of the diaspora and their in-country social networks (Toppenberg-Pejcic et al., 2018). For example, lessons learned from international Ebola communication demonstrate that digital networks are effective in mitigating misinformation and engaging various community members in different geographical locations (Rubyan-Ling, 2015). Digital networks may further buttress comprehensive community engagement via cultural gatekeepers (e.g., religious and traditional leaders, women’s groups, etc.) because previous research (Bristol & Millard, 2016; Hird & Linton, 2016) demonstrates that the contribution of community members has been crucial in Polio and Ebola eradication efforts in African countries and will still play an integral part in preparing for future pandemics. In other words, there seems to be no replacement for interpersonal communication between trusted community members in situations that necessitate sense-making.

Conclusion
Drawing from theoretical and empirical literature, this chapter reappraised functions of deepfakes to explain not only what deepfakes do to people but also what people do with deepfakes. This chapter adds a sense-making perspective as a supplement to the existing new media studies efforts of combating misinformation (e.g., legislation, fact-checking, and media literacy training). While the latter are not irrelevant, we recognize that they are not enough, particularly given the quandaries of social media regulation (e.g., Section 230) and, most importantly, the post-truth and deepfake media environments in which people are not only prone to believing and affirming falsehoods but also to disbelieving what is actually true. Thus, this chapter focused on the management of conspiracist information as an ever-evolving sense-making and feedback process. That is, media, scientific experts, and other information managers should treat deepfake production and use as particular kinds of collective sense-making and opportunities for identifying information needs, particularly among lay publics and marginalized communities. Understanding these functions (e.g., through text mining of social media) can help identify underlying motives. Subsequently, relevant messages can be created to address specific concerns. We therefore hope this chapter outlines a framework for empirically investigating the production of and audience engagement with synthetic media content. We conclude by reiterating the need for partnerships between lay communities, scientific experts, and media experts in order to make sense of divergent interpretive frames that are the hallmarks of the new media environment. This, in turn, will assist publics with messages that are attuned to the postmodern relationship between health, politics, and social identity.
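As a minimal illustration of the text mining mentioned above, the sketch below, which is not drawn from the chapter's own methods, assumes a small list of social media posts has already been collected; the sample posts, the three-topic setting, and the variable names are hypothetical. It uses TF-IDF weighting and non-negative matrix factorization from scikit-learn to surface recurring themes that might point to underlying sense-making motives.

```python
# Illustrative sketch only: surfacing candidate "motive" themes from a small,
# hypothetical set of social media posts with TF-IDF weighting and NMF topic
# extraction (scikit-learn). Posts, topic count, and names are invented.
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

posts = [
    "The vaccine is a money-making scam by manufacturers and their funders",
    "Why test the vaccine in Africa first? Nobody trusts these global elites",
    "Masks are about control, not health; the government hides the truth",
    "Scientists keep changing their story, so how can we believe anything",
    "The clinic postponed routine immunization again with no clear information",
    "Lockdown rules punish the poor while officials ignore their own rules",
    "I just want plain answers about side effects, not scientific jargon",
    "Health workers in our community answered questions face to face and it helped",
]

# Weight terms by how distinctive they are within this small corpus.
vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(posts)

# Factorize the term matrix into a handful of themes; three is an arbitrary choice.
model = NMF(n_components=3, random_state=0)
model.fit(tfidf)

terms = vectorizer.get_feature_names_out()
for index, weights in enumerate(model.components_, start=1):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Theme {index}: {', '.join(top_terms)}")
```

The themes such a sketch surfaces, for example mistrust of elites, demands for plain language, or appreciation of face-to-face engagement, would only be starting points for the message design and community partnerships the chapter recommends.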

References
Afifi, W. A., & Weiner, J. L. (2004). Toward a Theory of Motivated Information Management. Communication Theory, 14(2), 167–190. https://doi.org/10.1111/j.1468-2885.2004.tb00310.x
Babrow, A. S. (2001). Uncertainty, Value, Communication, and Problematic Integration. Journal of Communication, 51(3), 553–573. https://doi.org/10.1111/j.1460-2466.2001.tb02896.x
Babrow, A. S., & Kwitonda, J. C. (2020). Expectancy Value Model. In J. Van den Bulck, D. Ewoldsen, L. Mares, & E. Scharrer (Eds.), The International Encyclopedia of Media Psychology. Wiley-Blackwell. https://doi.org/10.1002/9781119011071.iemp0071

Basham, L. (2003). Malevolent Global Conspiracy. Journal of Social Philosophy, 34(1), 91–103. https://doi.org/10.1111/1467-9833.00167
Baudrillard, J. (1983). Simulations. Semiotext(e).
Beck, U. (1992). Risk Society: Towards a New Modernity. Sage.
Berinsky, A. J. (2017). Rumors and Health Care Reform: Experiments in Political Misinformation. British Journal of Political Science, 47(2), 241–262. https://doi.org/10.1017/S0007123415000186
Bessi, A., Caldarelli, G., Del Vicario, M., Scala, A., & Quattrociocchi, W. (2014). Social Determinants of Content Selection in the Age of (mis)information. In L. M. Aiello & D. McFarland (Eds.), Lecture Notes in Computer Science: Vol. 8851. Social Informatics (pp. 259–268). Springer International. https://doi.org/10.1007/978-3-319-13734-6_18
Bessi, A., Coletto, M., Davidescu, G. A., Scala, A., Caldarelli, G., & Quattrociocchi, W. (2015). Science vs Conspiracy: Collective Narratives in the Age of Misinformation. PLoS ONE, 10(2), e0118093. https://doi.org/10.1371/journal.pone.0118093
Bowles, J., Larreguy, H., & Liu, S. (2020). Countering Misinformation Via WhatsApp: Preliminary Evidence from the COVID-19 Pandemic in Zimbabwe. PLoS ONE, 15(10), 1. https://doi.org/10.1371/journal.pone.0240005
Brashers, D. E. (2001). Communication and Uncertainty Management. Journal of Communication, 51(3), 477–497. https://doi.org/10.1111/j.1460-2466.2001.tb02892.x
Briggs, C. L. (2004). Theorizing Modernity Conspiratorially: Science, Scale, and the Political Economy of Public Discourse in Explanations of a Cholera Epidemic. American Ethnologist, 31, 164–187. https://doi.org/10.1525/ae.2004.31.2.164
Bristol, N., & Millard, C. (2016). Bolstering Public Health Capacities Through Global Polio Eradication: Planning Transition of Polio Program Assets in Ethiopia. Retrieved from https://www.jstor.org/stable/pdf/resrep23901.pdf
Brotherton, R., & Eser, S. (2015). Bored to Fears: Boredom Proneness, Paranoia, and Conspiracy Theories. Personality and Individual Differences, 80, 1–5. https://doi.org/10.1016/j.paid.2015.02.011
Brown, D. (2019, May 24). Wait, is that video real? The race against deepfakes and dangers of manipulated recordings. USA Today.
Cohen, S. (1973). Folk Devils and Moral Panics. Blackwell.
Davis, J., Wetherell, G., & Henry, P. J. (2018). Social Devaluation of African Americans and Race-related Conspiracy Theories. European Journal of Social Psychology, 48, 999–1010. https://doi.org/10.1002/ejsp.2531
Del Vicario, M. D., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., et al. (2016). The Spreading of Misinformation Online. Proceedings of the National Academy of Sciences, 113(3), 554–559. https://doi.org/10.1073/pnas.1517441113

Dentith, M. R. X. (2016a). When Inferring to a Conspiracy Might Be the Best Explanation. Social Epistemology, 30(5–6), 572–591. https://doi.org/10.1080/02691728.2016.1172362
Dentith, M. R. X. (2016b). Treating Conspiracy Theories Seriously: A Reply to Basham on Dentith. Social Epistemology Review and Reply Collective, 5(9), 1–5. Retrieved from http://wp.me/p1Bfg0-3ak
DuBois, M., Wake, C., Sturridge, S., & Bennett, C. (2015). The Ebola Response in West Africa: Exposing the Politics and Culture of International Aid. Retrieved from https://www.odi.org/sites/odi.org.uk/files/odi-assets/publications-opinion-files/9903.pdf
Fallis, D. (2021). The Epistemic Threat of Deepfakes. Philosophy & Technology, 34, 623–643. https://doi.org/10.1007/s13347-020-00419-2
Floridi, L. (2018). Artificial Intelligence, Deepfakes and a Future of Ectypes. Philosophy & Technology, 31, 317–321. https://doi.org/10.1007/s13347-018-0325-3
Franks, B., Bangerter, A., & Bauer, M. W. (2013). Conspiracy Theories as Quasi-religious Mentality: An Integrated Account from Cognitive Science, Social Representations Theory, and Frame Theory. Frontiers in Psychology, 4(424), 1–12. https://doi.org/10.3389/fpsyg.2013.00424
Getman, R., Helmi, M., Roberts, H., Yansane, A., Cutler, D., & Seymour, B. (2017). Vaccine Hesitancy and Online Information: The Influence of Digital Networks. Health Education & Behavior, 45(4), 599–606. https://doi.org/10.1177/1090198117739673
Giddens, A. (1990). Consequences of Modernity. Polity Press.
Hall, S. (1980). Encoding/Decoding. In S. Hall, D. Hobson, A. Lowe, & P. Willis (Eds.), Culture, Media, Language (pp. 128–138). Hutchinson University Library.
Hird, T., & Linton, S. (Eds.). (2016). Lessons from Ebola Affected Communities: Being Prepared for Future Health Crises. Written by Pangeia and commissioned by the Africa APPG. Retrieved from https://research.monash.edu/en/publications/lessons-from-ebola-affected-communities-being-prepared-for-future
Imhoff, R., & Lamberty, P. K. (2017). Too Special to be Duped: Need for Uniqueness Motivates Conspiracy Beliefs. European Journal of Social Psychology, 47(6), 724–734. https://doi.org/10.1002/ejsp.2265
Joffe, H. (1999). Risk and "the Other". Cambridge University Press.
Kim, M., & Cao, X. (2016). The Impact of Exposure to Media Messages Promoting Government Conspiracy Theories on Distrust in the Government: Evidence from a Two-stage Randomized Experiment. International Journal of Communication, 10(2016), 3808–3827. Retrieved from http://ijoc.org/index.php/ijoc/article/view/5127

Kwitonda, J. C. (2017a). Development Aid and Disease Discourse on Display: The Mutating Techniques of Neoliberalism. Critical Discourse Studies, 14(1), 23–38. https://doi.org/10.1080/17405904.2016.1174139
Kwitonda, J. C. (2017b). Disambiguating the Cosmopolitics and Discourse of Democratization: The Role of Cyberactivism. In E. Ngwainmbi (Ed.), Citizenship, Democracies, and Media Engagement Among Emerging Economies and Marginalized Communities (pp. 81–100). Springer International Publishing AG.
Kwitonda, J. C., & Singhal, A. (2018). Teaching and Learning About Positive Deviance: Boosting Metacognitions to Grasp Global Communication Theory and Practice. Journal of Intercultural Communication Research, 47(5), 382–391. https://doi.org/10.1080/17475759.2018.1475295
Lantian, A., Muller, D., Nurra, C., & Douglas, K. M. (2016). Measuring Belief in Conspiracy Theories: Validation of a French and English Single-item Scale. International Review of Social Psychology, 29(1), 1–14. https://doi.org/10.5334/irsp.8
Learning Networks for Countries in Transition. (2020). Summary of LNCT Webinar: Designing Behavioral Strategies for Immunization in a COVID-19 Context. Retrieved from https://lnct.global/wp-content/uploads/2020/06/COVID-19-behavioral-strategies-summary-final.pdf
Leman, P. J., & Cinnirella, M. (2013). Beliefs in Conspiracy Theories and the Need for Cognitive Closure. Frontiers in Psychology, 4(378), 1. https://doi.org/10.3389/fpsyg.2013.00378
Lewandowsky, S., Ecker, U. K., & Cook, J. (2017). Beyond Misinformation: Understanding and Coping with the "Post-Truth" Era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369. https://doi.org/10.1016/j.jarmac.2017.07.008
Marchlewska, M., Cichocka, A., & Kossowska, M. (2018). Addicted to Answers: Need for Cognitive Closure and the Endorsement of Conspiracy Beliefs. European Journal of Social Psychology, 48, 109–117. https://doi.org/10.1002/ejsp.2308
McIntyre, L. C. (2018). Post-truth. MIT Press.
McLuhan, M. (1960). The Medium Is the Message. In M. G. Durham & D. Kellner (Eds.), Media and Cultural Studies: Keyworks. Blackwell.
Michigan University Blog. (n.d.). Coronavirus: Decoding the Spiky Fuzz-ball. https://labblog.uofmhealth.org/lab-report/coronavirus-decoding-spiky-fuzz-ball
Mikuškova, E. B. (2017). Conspiracy Beliefs of Future Teachers. Current Psychology, 37, 1. https://doi.org/10.1007/s12144-017-9561-4
Morgan, S. E. (2009). The Intersection of Conversation, Cognitions, and Campaigns: The Social Representation of Organ Donation. Communication Theory, 19(1), 29–48. https://doi.org/10.1111/j.1468-2885.2008.01331.x

Moscovici, S. (1961). La psychanalyse, son image et son public. Presses Universitaires de France.
Nasir, S. G., Aliyu, G., Ya'u, I., Gadanya, M., Mohammad, M., Zubair, M., & El-Kamary, S. S. (2014). From Intense Rejection to Advocacy: How Muslim Clerics Were Engaged in a Polio Eradication Initiative in Northern Nigeria. PLoS Medicine, 11(8). https://doi.org/10.1371/journal.pmed.1001687
Nattrass, N. (2013). The AIDS Conspiracy: Science Fights Back. Columbia University Press.
Nyhan, B., & Reifler, J. (2010). When Corrections Fail: The Persistence of Political Misperceptions. Political Behavior, 32(2), 303–330. https://doi.org/10.1007/s11109-010-9112-2
Nyhan, B., Reifler, J., & Ubel, P. A. (2013). The Hazards of Correcting Myths About Health Care Reform. Medical Care, 51(2), 127–132. https://doi.org/10.1097/MLR.0b013e318279486b
Nyhan, B., & Reifler, J. (2018). The Roles of Information Deficits and Identity Threat in the Prevalence of Misperceptions. Journal of Elections, Public Opinion and Parties, 29(2), 222–244. https://doi.org/10.1080/17457289.2018.1465061
Pfattheicher, S., & Schindler, S. (2016). Misperceiving Bullshit as Profound Is Associated with Favorable Views of Cruz, Rubio, Trump and Conservatism. PLoS ONE, 11, e0153419. https://doi.org/10.1371/journal.pone.0153419
Rubyan-Ling, D. (2015). Briefing Paper: Diaspora Communications and Health Seeking Behavior in the Time of Ebola: Findings from the Sierra Leonean Community in London. Retrieved from http://www.ebola-anthropology.net/wp-content/uploads/2015/11/Diaspora-communication-and-health-seeking-behaviour1.pdf
Schaffner, B. F., & Roche, C. (2017). Misinformation and Motivated Reasoning: Responses to Economic News in a Politicized Environment. Public Opinion Quarterly, 81(1), 86–110. https://doi.org/10.1093/poq/nfw043
Seaton, J., Sippitt, A., & Worthy, B. (2020). Fact Checking and Information in the Age of COVID. The Political Quarterly, 91(3), 578–584. https://doi.org/10.1111/1467-923x.12910
Singh, D. G. (2016). Conspiracy Theories in a Networked World. Critical Review, 28(1), 24–43. https://doi.org/10.1080/08913811.2016.1167404
Southwell, B. G., Thorson, E. A., & Sheble, L. (2018). Misinformation and Mass Audiences. University of Texas Press.
Ståhl, T., & van Prooijen, J.-W. (2018). Epistemic Rationality: Skepticism Toward Unfounded Beliefs Requires Sufficient Cognitive Ability and Motivation To Be Rational. Personality and Individual Differences, 122, 155–163. https://doi.org/10.1016/j.paid.2017.10.026

Swami, V., Voracek, M., Stieger, S., Tran, U. S., & Furnham, A. (2014). Analytic Thinking Reduces Belief in Conspiracy Theories. Cognition, 133(3), 572–585. https://doi.org/10.1016/j.cognition.2014.08.006
Thomas, S. B., & Quinn, S. C. (1991). The Tuskegee Syphilis Study, 1932 to 1972: Implications for HIV Education and AIDS Risk Education Programs in the Black Community. American Journal of Public Health, 81, 1498–1505. https://doi.org/10.2105/AJPH.81.11.1498
Toppenberg-Pejcic, D., Noyes, J., Allen, T., Alexander, N., Vanderford, M., & Gamhewage, G. (2018). Emergency Risk Communication: Lessons Learned from a Rapid Review of Recent Gray Literature on Ebola, Zika, and Yellow Fever. Health Communication, 34(4), 437–455. https://doi.org/10.1080/10410236.2017.1405488
van Prooijen, J.-W., & Jostmann, N. B. (2013). Belief in Conspiracy Theories: The Influence of Uncertainty and Perceived Morality. European Journal of Social Psychology, 43(1), 109–115. https://doi.org/10.1002/ejsp.1922
Wagner-Egger, P., Bangerter, A., Gilles, I., Green, E., Rigaud, D., Krings, F., & Clémence, A. (2011). Lay Perceptions of Collectives at the Outbreak of the H1N1 Epidemic: Heroes, Villains and Victims. Public Understanding of Science, 20(4), 461–476. https://doi.org/10.1177/0963662510393605
Westerlund, M. (2019). The Emergence of Deepfake Technology: A Review. Technology Innovation Management Review, 9(11), 39–51.
Zarocostas, J. (2020). How to Fight an Infodemic. The Lancet, 395(10225), 676. https://doi.org/10.1016/s0140-6736(20)30461-x

CHAPTER 8

Deepfakes: Future of Sports and Questioning Video Evidence Unwana Samuel Akpan and Chuka Onwumechili

Introduction
There is an ongoing use of hyper-learning-based techniques that produce fake images and videos, some of which swap real images and replace them with other images to advance a certain narrative. It may also involve altering original media content with the aid of digital technology, most often to the degree of unrecognizability. This practice is popularly known as "deepfakes" and, ultimately, refers to deep and hyper-learning-based techniques that can produce fake images and videos by swapping one reality with another, such as a person's face with the face of another person. These types of videos fall into a new band of "synthetic media" (Mahmud & Sharmin, 2020). Usually, creators of deepfakes use AI

applications that merge, combine, replace, and superimpose images and video clips to create fake videos (Maras & Alexandrou, 2019). With advanced software, pictures and the sound of moving images can be manipulated or even created completely from scratch. Such developments may have far-reaching consequences for public discourse and the collective construction of reality, especially in a sector as revered as sports. The practice has become known as deepfake. Face-swapping has always been the trick that creators of deepfakes use to gain the attention, believability, and acceptance of millions of media consumers around the world. It has become ubiquitous and garnered scholarly attention from researchers, policy makers, and media consumers (Mahmud & Sharmin, 2020; Wahl-Jorgensen & Carlson, 2021; Willingham, 2020). In addition, it is spawning a wave of enforcement and detection techniques (Malik et al., 2022). Much of the academic work studying this phenomenon has focused on politics (Taylor, 2021; Willingham, 2020). This is not surprising given that the origin of such doctored videos and the acronym can be found in political communication. "Deepfake" as a term first popped up in 2018. It generated public discourse and since then it has gained public attention around the world (Wahl-Jorgensen & Carlson, 2021, p. 1). The term originated from an anonymous user of the platform Reddit by the name "deepfakes" (Hao & Heaven, 2020). The user deployed artificial intelligence (AI) technology to superimpose the well-known faces of actresses onto pornographic videos but was blocked from the platform because of this practice. In response, the user posted a detailed graphic guide on another platform demonstrating how to create such videos. As a result, the technique quickly gained the attention of more than 25,000 Reddit users (Panyatham, 2021). This single technological creation has given birth to millions of doctored videos circulating online globally. The technology thrives on a face-swapping process in which a source video or picture is used to manipulate a target video. This has become a disturbing cultural and social phenomenon that has marred the ethics, standards, and credibility of journalism practice and that of the victim (Kietzmann et al., 2020, p. 137). The key goal of those who create deepfakes is to make manipulated pictures and videos acceptable and believable to the generality of people (Thies et al., 2016, p. 2387). Willingham (2020), writing for CNN, reviewed a rash of doctored videos during the United States presidential elections and published an article designed to help readers detect such videos. Taylor (2021) argues that the state uses enforcement against deepfakes as

an extension of securitization of the state and points to an alternative reading of deepfakes as satire. Although Taylor's interpretation provides an alternate reading of the threat of deepfakes, many authors consider the threat dangerous enough that action is urgently needed. However, politics is not the only field impacted by the surge in the use of deepfake technology. Instead, it has been used in wide-ranging areas including business (Kietzmann et al., 2020), philosophy (Fallis, 2021), and media (Diakopoulos & Johnson, 2021). Nevertheless, politics has garnered much of the attention because the usage of such deepfakes has focused on well-known global figures including national leaders and prominent politicians. Although most sports tournaments are televised live, which leaves very little room for manipulation of sports footage, such manipulation could still be carried out during the rebroadcast of the event. What is surprising is that deepfakes have yet to permeate the sporting arena. Sports are watched by many people. The Global WebIndex (undated) notes that 85% of internet users watch at least one sport regularly online or on television. This has led to estimated global annual sports revenue of $198 billion for sporting apparel alone (Statista.com, 2021). This interest in sports makes it a veritable target for the use of deepfakes. Thus, while it may not be prevalent currently, the potential for deepfakes remains high in sports. In this chapter, we consider the importance and popularity of sports, particularly football, to argue that deepfake video is likely to be a scourge soon given the extent of its usage in other spheres. The popularity of televised sporting events is amply demonstrated by where they rank among the most watched televised events worldwide. A FIFA.com (2018) report shows that football is one of the most watched television events of all time. For instance, two FIFA World Cup matches in 1998 and 2002, involving Brazil, rank among the top five most watched televised events in Brazil with ratings over 71% of television viewers. In France, the 2018 World Cup final between France and Croatia was ranked the fourth most watched televised event with 8.2 million viewers, accounting for over 90% of viewers tuned in. In Germany, the top five most watched televised events are all football matches, with the 2014 World Cup final between Germany and Argentina ranked No. 1 with an 86.3% share. Sporting activities found their way to television for commercial purposes, owing to the mass audience of such games (Kassing et al., 2004). Obviously, not everyone can gain entry into the various stadia or venues

of sporting activities. However, anyone with a television set or an Internet connection can be a part of the community of sporting consumers. Moreover, fragments or short clips of spectacular and dramatic sports footage are always cut out from the full video and used for various reasons on social media. Also, scholars have said that televised sport creates believability, credibility, legends, and issues (Akpan, 2020; Chung et al., 2016; Coates et al., 2014; Pérez et al., 2015; Salaga & Tainsky, 2015). It is the popularity of football that could make it a target for deepfakes. After all, part of the idea and attraction of instituting a fake is to persuade a large audience on an issue that generates interest and/or controversy. Therefore, conceptualizing the presence of deepfake videos in football is not far-fetched. Importantly, it does not have to occur in a premier tournament like the FIFA World Cup. What matters is that there is an audience, and football, even at the local league level, produces such an audience and always carries the possibility of controversy. It is based on the likelihood of deepfake videos appearing in the sporting field that we write this chapter examining various aspects of the intersection of deepfakes and sports. To do so, we review the nature of televised sport with particular attention to football and its core fundamentals that rely on objectivity and fair play. The linking of deepfakes to their possible use in sports is hinged, as we shall note, on Langdon Winner's technological determinism hypothesis. We then examine how artificial intelligence has enhanced deepfakes and how this makes them a likelihood in the sporting environment. Ultimately, our goal in this chapter is to seek prevention and solutions to the impending use of deepfakes in sporting arenas. Thus, that forms the crux of our discourse before we conclude.
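Before turning to the theoretical framework, it may help to make the face-swapping and superimposition described above concrete. The sketch below is an illustration only and not the neural-network-based synthesis that modern deepfakes rely on: rather than learning to reconstruct a face, it merely detects a face in a source image and a target image and blends the source face into the target region with OpenCV. The file names and the single-face assumption are hypothetical.

```python
# Illustrative sketch only: crude face superimposition with OpenCV.
# Real deepfakes rely on trained generative models; this shows just the
# cut-and-blend idea behind "face swapping" in its simplest form.
# Each hypothetical input image is assumed to contain one clearly visible,
# roughly frontal face positioned away from the image border.
import cv2
import numpy as np

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def first_face(image):
    # Returns (x, y, w, h) of the first face the stock detector finds.
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        raise ValueError("no face detected")
    return faces[0]

source = cv2.imread("source.jpg")   # face to paste in (hypothetical file)
target = cv2.imread("target.jpg")   # scene receiving the face (hypothetical file)

sx, sy, sw, sh = first_face(source)
tx, ty, tw, th = first_face(target)

# Resize the source face to fit the target face region.
face_patch = cv2.resize(source[sy:sy + sh, sx:sx + sw], (tw, th))

# Blend the patch into the target with Poisson ("seamless") cloning.
mask = np.full(face_patch.shape, 255, dtype=np.uint8)
center = (tx + tw // 2, ty + th // 2)
composite = cv2.seamlessClone(face_patch, target, mask, center, cv2.NORMAL_CLONE)

cv2.imwrite("swapped.jpg", composite)
```

Even this naive composite hints at the problem the chapter is concerned with: once learned models replace the visible seams of such composites with photorealistic blending, ordinary viewers, and even officials reviewing rebroadcast footage, have little to go on.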

Theoretical Framework
There is little doubt that various technological innovations have fueled global development. Many of those developments have impacted sports and football over the years. For instance, the recent adoption of goal-line technology and the video assistant referee (VAR) has improved both officiating and the perception of fairness in the game. In essence, technology as a critical part of sport is in the ascendancy. The question is how deeply technology will affect sport and determine its competitive outcomes. The term technological determinism is believed to have been coined by Thorstein Veblen (1857–1929), an American sociologist, to identify the impact of technology in fostering social progress.

However, not all technologies advance society for good. According to Lievrouw and Livingstone (2009), while technological determinism may be altruistic, it could also become problematic. They admit that technologies have an overwhelming power to drive human actions but note that technologies such as artificial intelligence (AI) can create deepfakes, which focus on negative rather than positive human actions. Nevertheless, what exactly does technological determinism theory tell us about the future of football? Giving an answer to this is a herculean task. However, we can make predictions as we attempt to do in this chapter. We do this by examining Veblen's views, along with Marshall McLuhan's (1964) prescient proposition that media technology shapes how individuals think, feel, and act; and, ultimately, how society operates. Veblen (1919) coined the term technological determinism and argued that technology in any society determines that society's nature and its interaction. Although both Veblen's (1919) and McLuhan's (1964) works are dated, their ideas provided starting points for the technological determinism theory that Winner (1977) later proposed. Langdon Winner (1977) provides a clear and deep conception of a technological determinism hypothesis. He writes: "the social determination of technology is the social circumstances of technological development, deployment and use" (p. 22). In other words, he urges us to pay attention to the social environment and particular circumstances that drive development of a technology and, further, how the technology is subsequently deployed and used. Winner's work dwells on the political environment and how that creates "particular technologies." An often-cited observation from this work concerns the height of bridges on Long Island, New York. These bridges were purposefully designed to prevent passage of large passenger vehicles for a political reason, that is, to prevent transportation of a poor class of people through the neighborhood. Thus, the bridge (a technology) was adopted for political reasons (to determine who would or would not pass through the neighborhood). Since then, however, the theory has been given a broader interpretation. For this particular work, we use that broader view to examine how artificial intelligence (AI) technology designed to enhance video interaction, among other things, is now manipulated to achieve other goals including the production of deepfake videos. It is important to note that adherents of technological determinism are divided over whether such determinism is soft or hard. The hard adherents argue that society has no choice but to act on

technological dictates. The belief is that the efficiency of the technology dictates its use. This position stresses the power of technology to regulate social activity. The soft adherents argue otherwise. These adherents do not deny that technology is powerful, nor do they deny that it is impactful on social activity. However, rather than view technology as all-powerful, they argue that technology is passive rather than active. This view holds that society ultimately determines how each technology is and will be used, without simply succumbing to the dictate of the technology or its inventor. Although much of Winner's focus was on the impact of technology on politics, his views are crucial to understanding the relationship of technology and football. Ultimately, as we have seen with goal-line technology and the VAR, these developments change football and how it is played and perceived. We argue that deepfake technology has utility given the ubiquity of controversies that surround the on-field aspects of football, and that the stakes involved in these controversies will encourage the use of deepfakes. Thus, as Winner (1977) points out, the social needs for the use of the technology already exist and what is now expected is its widespread deployment and use. In the next section, we delve into the footballing environment that will encourage the use of deepfake technology in the sport and point to historical incidents that support this view. Additionally, we will address other relevant issues.

Artificial Intelligence and Enhancing Deepfake
Artificial intelligence (AI) is now widely used in numerous human endeavors, including hardware and software manufacturing (Chen et al., 2021), agriculture (Dharmaraj & Vijayanand, 2018), and retail (Weber & Schutte, 2019), among others. AI refers to the ability of machines or electronics to perform human cognitive functions, which include learning, reacting, interacting, and solving problems. While most of these functions are often routine and repetitive, sporting activities involve highly unpredictable, non-routine action and function on much higher mental levels. Nevertheless, even such highly complicated activities are not beyond replication by artificial intelligence, at least in the future. Deepfakes are segmented into different types such as photo deepfakes, audio deepfakes, video deepfakes, and audio and video deepfakes. Photo deepfakes, technically known as face and body swapping, are used to

replace the face or body of one person with another's. Audio deepfakes are voice spoofing techniques where the voices of different persons are interchanged. Video deepfakes are divided into face swapping, face morphing, and full-body puppetry. Audio and video deepfakes are lip-syncing techniques where mouth movements and spoken words are changed in a talking head video. Ultimately, artificial intelligence (AI) has boosted the acceleration, expansion, and perfection of producing and disseminating manipulated and fake media content, and this has ushered in the age of hybrid reality. Sport, as complicated as it may be, could become a target for deepfake creators aided by artificial intelligence (AI). With a press of a button, pictures, sound, and moving images can now be altered and even generated entirely by computation. Nonetheless, any doctoring or manipulation of sports activities could make the audience lose interest. Studies show, however, that for any sport to enjoy massive global appeal and acceptability, the performance of the sports talents must solely be a product of developed skills and natural talent (Bertrand, 2000; Boyle, 2006; Boyle & Haynes, 2014; Brennen & Brown, 2016; McQuail, 2003; Ramon-Vegas & Rojas, 2017; Thomas & Pauly, 2007). Thus, tampering with this would doom sport. For instance, deepfake technology could pose damaging credibility and ethical challenges soon. Already, the volume of viral altered media content in the form of fake videos, audio, and images has largely doubled in the last few years. These could generate blackmail, intimidation, sabotage, ideological influencing, and incitement to violence, and could have broader implications for trust and accountability. In essence, there are potentially far-reaching consequences of deepfakes, yet little attention is paid to the damage they might create for the future of sports. More complicated, however, is that sport has become an opium for the masses in the sense that the activities involved are considered one of the significations of human physical endeavors. Thus, public opinion has become heightened in the sporting arena. Numerous public polls are conducted in search of public perceptions of different aspects of sports (Carstairs, 2003). This is important considering the increased link between the public and the sport that they consume. It is, therefore, not far-fetched to fathom a situation where public opinion is cultivated in disciplinary situations, especially in cases involving major and popular athletes as well as teams. Will these cases produce a true reflection of human physical action and effort?


This question has become relevant because live sport already involves deceptive actions by athletes who fake injuries to get opponents punished by umpires. This phenomenon, rampant in numerous sporting events, has attracted a growing number of academic studies, according to Guldenpenning et al. (2017), who focused on whether such deceptive actions negatively impact opponents' performance. They investigated research reports on a variety of sports, including cricket, soccer, basketball, and rugby. Their point about the spread of the phenomenon in professional sports is important and has been noted by sporting authorities as well. Authorities mainly address this phenomenon by issuing sanctions against culprits; for instance, warnings are issued to erring footballers. The above incidents, however, have mostly involved athletes simulating injuries and fouls suffered during play; the faking has yet to spread to video manipulation of the events themselves. Technological determinism proposes that technology's use is dictated not only by the technology but also by those who use it. Thus, while video technology may have been developed, and continues to be developed, to improve accuracy and fairness in adjudicating sporting contests, the same technology may be manipulated to serve selfish ends. Already, these types of manipulation are rife in other arenas, especially politics, where opponents or rivals seek to one-up each other by putting the other in a bad light through doctored videos. Scholars argue that deepfake technology is most relevant in situations where persons are unable or unwilling to appear in footage saying or doing certain things (Caporusso, 2021; Chesney & Citron, 2019; Citron & Chesney, 2019; Diakopoulos & Johnson, 2021; Fletcher, 2018; Franks & Waldman, 2019; Meskys et al., 2020; Öhman, 2020; Silbey & Hartzog, 2019; Spivak, 2019; Westerlund, 2019). Both athletes' real-life faking and potential video doctoring to achieve similar results raise concerns that strike at the core fundamentals of sporting competition. All forms of sport rest on deep fundamental tenets that make people enjoy the spectacle: they are built on fairness, objectivity, and truth, which makes them widely accepted and considered credible. Fairness in sports ordinarily refers to impartial and just treatment of competitors. Deepfake technology, however, is morally suspect because it is susceptible to use against the spirit of fair play. Often this technology is used to achieve a subjective aim that is antagonistic to fair play: it is directly used to provide an advantage to one party in a competition, with the goal of winning outside a fair environment.


Objectivity can often be confused with fairness, but they refer to different things. Objectivity refers to avoiding prejudices, biases, and subjectivity during decision-making, while fairness refers to the just treatment of competitors. While the two may overlap in practice, they are dissimilar. Fairness recognizes that there are competing sides and seeks to cater to the well-being of those sides; objectivity focuses on always making the right decision regardless of which competitors are present. It is a thin line of difference, but an important one. Unfortunately, deepfake technology threatens both fairness and objectivity in sporting contests. This is made worse by the lack of gatekeeping in the new world of social media, where anyone can create and post content. In such an environment, deepfakes have proliferated. Maintaining both fairness and objectivity requires truth in sporting competitions. This means there is a continuing need to defeat deepfake technology, a digital technique that imposes deceptive material in order to distort an otherwise ethically sound and objective process. The technology poses significant fundamental and ethical challenges. It also means that a strategy to defeat deepfakes must recognize the social media environment and invent ways to combat the scourge. Ultimately, because our image and voice are closely linked to our identity, credibility and protection against the manipulation of hyper-realistic digital representations of sporting talents' image and voice are a fundamental moral right in the age of deepfakes. Sensitivity to these rights could provide a starting point of conviction for combating deepfakes. Already, doctored videos have entered the sporting realm, though they are not yet widespread in soccer. Not surprisingly, Barstool Sports has ventured into such videos. Barstool Sports is a sports and popular culture blog that has been steeped in controversy, a genre in which it thrives. According to McIntosh (2019):

There is a video that shows a bar packed wall-to-wall with fans watching an NBA Finals game between the Toronto Raptors and the Golden State Warriors as if their lives depend on it. A few seconds in, the bar erupts into cheers after Golden State player and basketball legend Kevin Durant sinks to the ground, clutching his leg. The entire crowd begins screaming, jumping, cheering, and tossing their beverages into the air in what looks like an extremely unsportsmanlike display of joy at a player's injury. It didn't happen.


Some Raptors fans did cheer the injury, but this particular video, posted online Tuesday morning by the American sports and culture blog Barstool Sports, is an edited version of one from 2016. (para. 1–3)

McIntosh's report is only a window onto a possible future in sports. The question is not whether that "future" will occur, but when. For football, it could be the beginning of the end as a spectacle. The question, then, is whether there is a solution in sight.

Finding Solutions

Creators of deepfakes have worked hard to make the content of their faked videos look realistic. Nevertheless, believability depends on the sociological constructs and personal judgment of the product's consumer (Ba et al., 2003; eMarketer, 2008; Koehn, 1996). The believability of deepfakes therefore ultimately depends on variables such as personal beliefs; it rests on the judgment of every individual (Tencé et al., 2010, p. 3). Deciphering whether a video is real thus depends on features like the cultural background and experience of the receiver of the information. Credibility is also based on the assumed credibility of the source of the information: is the source a friend or a family member whose messages the receiver usually trusts? Ultimately, source credibility plays a major role in the receiver forming an opinion (Pariser, 2011; Umeogu, 2012). However, these options for detecting deepfakes are neither adequate nor foolproof; more compelling methods are necessary. Better methods are already underway to prevent future catastrophes. These more effective measures can stem the impending problem by detecting fake videos of important football actions. While some of the solutions do not come from football authorities, they could become relevant in the future, which is important for maintaining public trust in the sport. In the following paragraphs we discuss these solutions and the actions that are underway. Importantly, the United States' Defense Advanced Research Projects Agency (DARPA) is already working on developing algorithms for analyzing videos to expose inconsistencies. For football authorities, one envisages the recruitment of video analysis experts not only by clubs but also by footballing authorities such as associations; the creation of verified video systems; the development of new legal systems to prevent and react to future cases; and the education of the larger public on the detection of fake videos. Together, these would form an integrated system of solutions.
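To make the idea of inconsistency analysis concrete, the sketch below shows one deliberately crude, illustrative check: flagging frames whose frame-to-frame change deviates sharply from a clip's typical motion. It is not DARPA's algorithm, nor any football authority's tool; the function name, threshold, and synthetic data are assumptions for illustration only.

```python
# Illustrative sketch only: flag frames whose frame-to-frame change is an outlier
# for the clip, a crude proxy for splicing or re-rendering artifacts. Frames are
# assumed to be a (num_frames, height, width) array of grayscale pixel values.
import numpy as np

def flag_suspicious_frames(frames: np.ndarray, z_threshold: float = 3.0) -> list:
    """Return indices of frames whose incoming transition is statistically unusual."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0)).mean(axis=(1, 2))
    mean, std = diffs.mean(), diffs.std()
    if std == 0:  # perfectly uniform motion; nothing stands out
        return []
    z_scores = (diffs - mean) / std
    # entry i of `diffs` describes the transition from frame i to frame i + 1
    return [int(i) + 1 for i in np.where(z_scores > z_threshold)[0]]

# Synthetic example: 100 smooth frames with one abrupt, out-of-place frame.
rng = np.random.default_rng(0)
clip = rng.normal(128.0, 2.0, size=(100, 32, 32))
clip[57] += 80.0  # simulate a pasted-in or re-rendered frame
print(flag_suspicious_frames(clip))  # expected: [57, 58], the transitions into and out of it
```

Real detection systems combine many such signals (lighting, blinking, audio-visual synchrony, compression traces) rather than relying on any single statistic.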


Sybert (2021) reports on DARPA's work, within its media and semantic forensics programs, developing algorithms to detect falsified media. According to Sybert, DARPA's plan includes ways to assess media quantitatively against three types of integrity: digital, physical, and semantic. For instance, such analysis may take a video, model the expected actions of an individual on a soccer field from verified data, and compare that with the data in the video under investigation to assess the match; in essence, this is character matching. Sybert, citing a high-ranking DARPA official, reports: "We're about a year into this program and we've developed a number of interesting algorithms across the detection, attribution and characterization space" (para. 12). However, DARPA's work is not specifically designed to stop football's problems or to secure football's future. Instead, DARPA is focused on solving a larger social problem within its own sphere—the United States of America. Nevertheless, football stands to gain, as we have already suggested in the preceding paragraph. Football, additionally, can develop solutions specific to its needs. For instance, football authorities could hire video analysts at both association and club levels to assist in analyzing videos to ascertain authenticity. This obviously creates new job opportunities that could emerge in the years to come, and many of those analysts would likely train using research from DARPA's innovative work. At the football association level, possibly at the FIFA level, a new rating system for football videos may be developed. This system could mirror the film rating system or the disciplinary card system in football (see Table 8.1). Table 8.1 shows a probable system that would give consumers of football videos confidence in the authenticity of particular videos. V represents the top rating for any video; it identifies video already authenticated by football authorities as genuine. SV represents video submitted for authentication but not yet authenticated. The lowest rating, UV, represents video produced by an authenticated source or agent but not submitted for authentication.

Table 8.1  Proposed football video rating system

Rating | Description
V  | Verified and authenticated video; verified by FIFA or its affiliate
SV | Video submitted for authentication and currently undergoing verification/authentication by footballing authorities
UV | Video produced by an authorized agent/source but not submitted for verification/authentication


Any other video, carrying none of the UV, SV, or V ratings, would be viewed at the viewer's discretion, as football authorities would have certified neither its source nor the product itself. The objective, of course, is to alert the public and others to the level of authenticity of each video. Importantly, at both club and association levels, a robust legal system should be developed to discourage and prosecute those who violate the new rules by distributing doctored videos that bring clubs, organizations, and the game into disrepute. These systems would be in place to deter violators. A critical aspect of arresting the problem is educating the larger public. Why is educating the public important? Ultimately, the goal of the creators of fake videos is to persuade the public. Thus, the fight against fake videos is not simply a matter for football insiders; it is also about winning over the larger public and preventing it from falling prey to fake videos. After all, the future of football lies in sustaining the confidence that the public has in the football product, not just on the field but in the recorded aspects of the game the public consumes. While the video rating and labeling identified above partly address this issue, given that the public would have access to such labels, the education aspect remains critical. Here, the goal is to create public information pieces that communicate football's plans against fake videos, educate the public on how to identify such videos, and point to the strategies and plans football uses to fight them. We are convinced that the measures identified here—DARPA's research and countermeasures, football's use of video analysts, the introduction of video rating systems, a robust legal system, and public education—will together go a long way in discouraging the use of deepfake videos in football. Although the use of deepfakes in football is currently minimal, we recognize that football is ripe for their use, and if nothing is done their use could become an epidemic. Already, we have noted the use of doctored crowd images and videos by Barstool Sports in the United States.
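As a purely hypothetical illustration, the labels proposed in Table 8.1 could be attached to videos in a verification registry along the lines sketched below. The class and field names are assumptions for illustration; no such FIFA or association system currently exists.

```python
# Hypothetical sketch of the proposed V/SV/UV labels from Table 8.1.
# The descriptions follow the table; the registry and class names are invented.
from dataclasses import dataclass
from typing import Optional

RATINGS = {
    "V": "Verified and authenticated video (verified by FIFA or its affiliate)",
    "SV": "Submitted for authentication and currently undergoing verification",
    "UV": "Produced by an authorized agent/source but not submitted for verification",
}

@dataclass
class FootballVideo:
    video_id: str
    source: str
    rating: Optional[str] = None  # None = unrated; viewing is at the viewer's discretion

    def label(self) -> str:
        """Return a human-readable authenticity label for the video."""
        if self.rating in RATINGS:
            return f"{self.rating}: {RATINGS[self.rating]}"
        return "Unrated: neither the source nor the product has been certified"

clip = FootballVideo(video_id="match-0001", source="club-media-team", rating="SV")
print(clip.label())
```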

Conclusion

Willingham's (2020) important article published by CNN is prescient because it recognizes the increasing emergence of deepfakes and makes a point of educating the wider audience on how to recognize inauthentic videos. CNN would not have dedicated a major piece to this issue if it were not a significant public concern. Nevertheless, it is important to reiterate that doctored video in football coverage is not, at present, a major concern.


However, there is no question that it has become a scourge in other areas, particularly political communication, as cited by Willingham (2020) and Taylor (2021). Both academic and public-interest articles are also increasingly discussing deepfakes in spheres beyond political communication. This means that football, being of interest to millions of people globally, could be a fruitful target for those who distribute such videos to alter or shape opinions on an issue of interest. Our chapter uses Langdon Winner's technological determinism theory to make sense of the impact of doctored video on football. Winner's work, as well as subsequent analysis of it, points to two readings of technological determinism: one is soft determinism, the other hard determinism. As we have already noted, the soft perspective appears more fruitful because it recognizes human agency in shaping the use of technology. The deepfake phenomenon underlines that human agency: it reminds us that while people have used the same technologies for good, others have instead used them to create deepfakes. Clearly, technology in this case is not so much deterministic as shaped by its public engagement. As this chapter notes, the state and institutions may play a role in that public engagement with technology. We have chosen football as a test case in our discussion of deepfakes. This choice is deliberate: football hinges on fair play and objectivity, and those tenets can be directly and negatively impacted by deepfakes. Ultimately, in this chapter we have addressed not just the threat deepfakes pose to football's tenets of fair play and objectivity but have also pointed to ways this impending threat could be countered. Already, DARPA is providing leadership in researching ways to counter the scourge of deepfakes, and we have added other strategies that could be helpful.

References Akpan, U. (2020). Elite Local Leagues and Transnational Broadcast of European Football. In C. Onwumechili (Ed.), Africa’s Elite Football: Structure, Politics and Everyday Challenges (pp. 34–44). Routledge. Ba, S., Whinston, A. B., & Zhang, H. (2003). Building Trust in Online Auction Markets Through an Economic Incentive Mechanism. Decision Support Systems, 35(3), 273–286. Boyle, R. (2006). Sports Journalism: Context and Issues. Sage.


Boyle, R., & Haynes, R. (2014). Watching the Games. In V.  Girginov (Ed.), Handbook of the London 2012 Olympic and Paralympic Games (Vol. 2, pp. 84–95). Routledge. Bertrand, C.-J. (2000). Media Ethics and Accountability Systems. Transaction Publishers. Brennen, B., & Brown, R. (2016). Persecuting Alex Rodriguez: Race, Money and the Ethics of Reporting the Performance-enhancing Drug Scandal. Journalism Studies, 7(1), 21–38. Caporusso, N. (2021). Deepfakes for the Good: A Beneficial Application of Contentious Artificial Intelligence Technology. In T. Ahram (Ed.), Advances in artificial intelligence, software and systems engineering. Proceedings of the AFHE 2020 virtual conferences on software and systems engineering, and artificial intelligence and social computing, July 16–20, 2020 (pp. 235–241). Springer. Carstairs, C. (2003). The Wide World of Doping: Drug Scandals, Natural Bodies, and the Business of Sports Entertainment. Addiction Research & Theory, 11(4), 263–281. Chen, H., Ling, L., & Chen, Y. (2021). Explore Success Factors That Impact Artificial Intelligence Adoption in the Telecom Industry in China. Journal of Management Analytics, 8(1), 36–68. Chung, J., Lee, Y. H., & Kang, J.-h. (2016). Ex Ante and Ex Post Expectations of Outcome Uncertainty and Baseball Television Viewership. Journal of Sports Economics, 17(8), 790–812. Citron, D.  K., & Chesney, R. (2019). Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security. California Law Review, 107, 1753–1819. Chesney, R. & Citron, D. K. (2019). Deepfakes and the New Information War. Foreign Affairs, January/February, 147–155 Coates, D., Humphreys, B., & Zhou, L. (2014). Reference-Dependent Preferences, Loss Aversion, and Live Game Attendance. Economic Inquiry, 52(3), 959–973. Dharmaraj, V., & Vijayanand, C. (2018). Artificial Intelligence (AI) in Agriculture. International Journal of Current Microbiology and Applied Science, 7(12), 2122–2128. Diakopoulos, N., & Johnson, D. (2021). Anticipating and Addressing the Ethical Implications of Deepfakes in the Context of Elections. New Media & Society, 23(7), 1–27. eMarketer. (2008). Consumer Interactions: Social Shopping, Blogs and Reviews. Retrieved January 16, 2022, from www.emarketer.com/Reports/All/ Emarketer_2000482.aspx. Fallis, D. (2021). The Epistemic Threat of Deepfakes. Philosophy & Technology, 34, 623–643.


FIFA.com. (2018, December 21). More than half of the world watched record-­ breaking 2018 World Cup. Retrieved February 23, 2022, from https://www. fifa.com/tournaments/mens/worldcup/2018russia/media-­releases/more-­ than-­half-­the-­world-­watched-­record-­breaking-­2018-­world-­cup Fletcher, J. (2018). Deepfakes, Artificial Intelligence, and Some Kind of Dystopia: The New Faces of Online Post-fact Performance. Theatre Journal, 70, 455–471. Franks, A., & Waldman, A. E. (2019). Sex, Lies, and Videotapes: Deep Fakes and Free Speech Illusions. Maryland Law Review, 78(4), 892–898. Guldenpenning, I., Kunde, W., & Weigelt, M. (2017). How to Trick Your Opponent: A Review Article on Deceptive Actions in Interactive Sports. Frontiers in Psychology, 8, 1–12. https://doi.org/10.3389/fps.2017.0917 Hao, K., & Heaven, W. D. (2020, 12 24). The Year Deepfakes Went Mainstream. MIT Technology Review. Retrieved 04 25, 2021, from https://www.technologyreview.com/2020/12/24/1015380/best-­ai-­deepfakes-­of-­2020. Kassing, J. W., Billings, A. C., Brown, R. S., Halone, K. K., Harrison, K., Krizek, B., & Turman, P. D. (2004). Communication and the Community of Sport: The Process of Enacting, (re) producing, Consuming, and Organizing Sport. Communication Yearbook, 28, 373–408. Kietzmann, J., Lee, L. W., McCarthy, I. P., & Kietzmann, T. C. (2020). Deepfakes: Trick or Treat? Business Horizons, 63(2), 135–146. https://doi.org/10.1016/j. bushor.2019.11.006 Koehn, D. (1996). Should We Trust in Trust? American Business Law Journal, 34(2), 183–203. Lievrouw, L. A., & Livingstone, S. (2009). Introduction. In L. A. Lievrouw & S. Livingstone (Eds.), New Media. Sage Benchmarks in Communication. SAGE. Mahmud, B., & Sharmin, A. (2020). Deep Insights of Deepfake Technology: A Review. DUJASE, 5(1 & 2), 13–25. Malik, A., Kuribayashi, M., Abdullahi, S., & Khan, A. (2022). DeepFake detection for human face images and videos: A survey. IEEEAccess. Retrieved February 12, 2022, from https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=9712265. Maras, M.  H., & Alexandrou, A. (2019). Determining Authenticity of Video Evidence in the Age of Artificial Intelligence and in the Wake of Deepfake Videos. International Journal of Evidence & Proof, 23(3), 255–262. https:// doi.org/10.1177/1365712718807226 Mcintosh, E. (2019). We fact-checked a fake video of Raptors fans cheering for an injury. Canada’s National Observer. Democracy and Integrity Reporting Project. https://www.nationalobserver.com/special-­reports/democracy-and-integrityreporting-­project. McLuhan, M. (1964). Understanding Media: The Extensions of Man. McGraw-Hill. McQuail, D. (2003). Media Accountability and Freedom of Publication. Oxford University Press Oates.


Meskys, E., Liaudanskas, A., Kalpokiene, J., & Jurcys, P. (2020). Regulating Deep Fakes: Legal and Ethical Considerations. Journal of Intellectual Property Law & Practice, 15(1), 24–31. Öhman, C. (2020). Introducing the Pervert’s Dilemma: A Contribution to the Critique of Deepfake Pornography. Ethics and Information Technology, 22, 133–140. Panyatham, P. (2021, 03 10). Deepfake Technology In The Entertainment Industry: Potential Limitations And Protections. Arts Management & Technology Laboratory. Retrieved 25 January, 2022, from https://amt-­lab. org/blog/2020/3/deepfake-­t echnology-­i n-­t he-­e ntertainmentindustrypotential-­limitations-­and-­protections. Pariser, E. (2011). The Filter Bubble: What The Internet Is Hiding From You. The Penguin Press. Pérez, L., Puente, V., & Rodríguez, P. (2015). Are Broadcast Sporting Events of “General Interest”? A Regional Panel Data Analysis of TV Ratings for Spain’s La Liga. Journal of Media Economics, 28(1), 7–19. Ramon-Vegas, X., & Rojas, J.  L. (2017). Mapping Media Accountability Instruments in Sports Journalism. El Profesional de la Información, 26(2), 159–171. Salaga, S., & Tainsky, S. (2015). Betting Lines and College Football Television Ratings. Economics Letters, 132, 112–116. Silbey, J. M., & Hartzog, W. (2019). The Upside of Deep Fakes. Maryland Law Review, 78(4), 960–966. Spivak, R. (2019). ‘Deepfakes’: The Newest Way to Commit One of the Oldest Crimes. Georgetown Law Technology Review, 3(2), 339–400. Statista.com. (2021, October 1). Global sports market  – statistics and facts. Retrieved March 12, 2022, from https://www.statista.com/topics/8468/ global-­sports-­market/#dossierContents__outerWrapper. Sybert, S. (2021, September 16). DARPA launches new programs to detect falsified media. Govcio Media & Research. Retrieved March 10, 2022, from https://governmentciomedia.com/darpa-­launches-­new-­programs-­detectfalsified-­media. Taylor, B. (2021). Defending the State from Digital Deceit: The Reflexive Securitization of Deepfake. Critical Studies in Media Communications, 38(1), 1–17. Tencé, F., Buche, C., Loor, P. D., & Marc, O. (2010, 09 02). The Challenge of Believability in Video Games: Definitions, Agents Models and Imitation Learning. GAMEON-ASIA'2010. Retrieved 04 25, 2021, from http://arxiv. org/pdf/1009.0451v1. Thies, J., Zollhöfer, M., Stamminger, M., Theobalt, C., & Nießner, M. (2016). Face2Face: RealTime Face Capture and Reenactment of RGB Videos. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (pp.  2387-2395). Las Vegas: IEEE. https://doi.org/10.1109/ CVPR.2016.262.


Thomas, P., & Pauly, J. (2007). Sports Journalism as Moral and Ethical Discourse. Journal of Mass Media Ethics, 22(4), 332–347. Umeogu, B. (2012). Source Credibility: A Philosophical Analysis. Open Journal of Philosophy, 2(2), 112–115. https://doi.org/10.4236/ojpp.2012.22017. Veblen, T. (1919). The Place of Science in Modern Civilization and Other Essays. New York: Huebsch. Reprinted 1990, with a new introduction by W. Samuels. New Brunswick, NJ: Transaction Books. Wahl-Jorgensen, K., & Carlson, M. (2021). Conjecturing Fearful Futures: Journalistic Discourses on Deepfakes. Journalism Practice, 1–18. https://doi.org/10.1080/17512786.2021.1908838. Weber, F., & Schutte, R. (2019). State-of-the-Art and Adoption of Artificial Intelligence in Retailing. Digital Policy, Regulation and Governance, 21(3), 264–279. Westerlund, M. (2019). The Emergence of Deepfake Technology: A Review. Technology Innovation Management Review, 9(11), 39–52. Willingham, A. (2020, October 19). Is that video real? Fake images and videos are everywhere. Here's how to spot them. https://www.cnn.com/interactive/2020/10/us/manipulated-media-tech-fake-news-trnd/ (Accessed February 27, 2022). Winner, L. (1977). Autonomous Technology: Technics-out-of-Control as a Theme in Political Thought (p. 100). MIT Press.

CHAPTER 9

Examining the Role of 'KOT' in Reinforcing Organizations' Voices Against Misinformation in Kenya

Masaki Enock Magack

Introduction

Social Networking Sites (SNSs) have attracted huge interest in the analysis of audiences' online activities and of the many discourses carried out on them that directly or indirectly influence people's daily lives (Pratiti & Lisa, 2017). It is also alarming, however, that misinformation has penetrated the online public sphere, thereby shaking the social, economic and political stability of individuals and organizations (Ahmed et al., 2022). In fact, a report by Newman et al. (2020) shows that Kenyans are part of the 76% of respondents from the global south who are unsure how to distinguish fake from real information on the digital platforms they interact with daily. As the Swahili proverb says, 'penye moshi hapakosi moto' (where there is smoke there must be fire). This explains why organizations have already shifted their discourses to online platforms to raise their voices against inaccurate and misleading information in cyberspace.

M. E. Magack (*) Daystar University, Athi River, Kenya
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023
K. Langmia (ed.), Black Communication in the Age of Disinformation, https://doi.org/10.1007/978-3-031-27696-5_9


Digital communication, as an umbrella, has been beneficial in enhancing communication not just for individuals but for organizations generally, in Africa and the world at large (Onwumechili & Amulega, 2020). In examining human interactions on social media, Langmia (2016) posits that SNSs have indeed enhanced interactive communication, both synchronously and asynchronously. This shows that, for organizations to thrive, they have to engage with SNSs not just for advertising but to engage their audiences and to raise their voices against misinformation. Similarly, all SNSs are mediated environments that allow users to engage in discourse virtually (Liberman, 2016). This is why organizations need to drive engagement and ensure they communicate effectively with all stakeholders, with the intention of shedding light on any viral inaccurate information that populates their sites. This chapter therefore examines specific hashtags used by Kenyans on Twitter (KOT) to drive engagement about organizations, and their effects on decisions about policy change and development. At the same time, it explores how KOT has become innovative in changing narratives around misinformation and misrepresentation. The chapter further illustrates how, in the era of digital communication and amid misinformation and misrepresentation by online talkers and texters, Africans, and Kenyans in particular, have used Twitter hashtags to reinforce voices for or against organizations. It also examines how parallel hashtags are created to counter misinformation around the same narrative. For example, KOT created #kerochelimited while Keroche Limited created #threadstorm. Likewise, when a narrative portrayed Kenya as a '#HotBedOfTerror', KOT started #TheHotBedOfChampions, and #TheMagicalKenya was created by the Tourism Board of Kenya, among other hashtags mentioned later in the chapter (Al-Bulushi, 2019). In addition, organizations such as Africa Uncensored have created the hashtag #pigafirimbi (#blowthewhistle), an initiative that invites the public to help deal with misinformation on Twitter and thereby helps Kenyans authenticate sources and content shared online. Lovejoy and Saxton (2012) argue that, in the past, non-profit organizations were not able to utilize their websites to engage with their stakeholders; the emergence of SNSs such as Twitter, with interactive inbuilt affordances, has now generated new avenues for microblogging. In their article they identify a gap and encourage future researchers to investigate the reasons and connections that specifically prompt organizations to carry their discourse to social media.


Similarly, Jurgens and Ruths (2015) posit that organizations not only have a presence on social media but have rallied their activities there for various reasons, such as advertising and stakeholder engagement. However, they also note that little has been researched about the effects of such discourses when they arise from misinformation. Nothias and Cheruiyot (2019) show how activists in Kenya have likewise created hashtags that can give voice to organizations on Twitter. For example, the renowned Kenyan activist Boniface Mwangi started #OccupyPlayground in 2015, together with other hashtags created by KOT such as #ListOfShame and #ShuleYangu, cascading the trend and helping to shape the matter. It is in this regard that this study examines the role the public plays in reinforcing the voices of organizations in Kenya amidst chaotic and toxic responses that tend to shift from accurate to inaccurate messages or vice versa.

Twitter, Misinformation and Organizations' Online Engagement

The proliferation of organizational discourse on Twitter in Kenya is deeply rooted in the Westgate Mall attack of September 2013, when Twitter became one of the busiest SNSs used by the government of Kenya, the Kenyan people and emergency organizations. Since then it has become difficult for the public to trust online sources of information, as everyone with access to Twitter can post anything, and it can be shared widely without the source being authenticated. Images, texts, audio and videos have been shared indiscriminately. Goldberg et al. (2014) used TwitterMate to collect and analyse 67,849 tweets from selected organizations. They argue that social media indeed has affordances that push (mis)information to the publics through a two-way flow between the publics and different authorities. They also note that statistics showed Kenya came second in Africa, after South Africa, in Twitter usage, with 59% of users accessing it on their mobile phones. Additionally, they urge future scholars to analyse how organizations use Twitter in both normal and emergency discourses. This is a gap this study intends to bridge. Previous research by Wamuyu (2020) on online misinformation, disinformation and fake news found that 86.2% of Kenyan participants scrolled through false and inaccurate content and ultimately shared it.


This could be content about organizations or individuals that is designed to (mis)inform the user, and since it is not easy to tell what is fake and what is real, users share the same messages with family, friends and colleagues at work. In 2015, KOT ran a campaign in response to the Cable News Network (CNN) over a narrative they felt misrepresented and misinformed the world, tarnishing their country by labelling Kenya a 'hot bed of terror' (Al-Bulushi, 2019); they did so by creating #HotBedOfChampions and #SomeonetellCNN (120,000 tweets in less than a day). The tourism board of Kenya created #MagicalKenya, which went viral to the extent of being used by local media houses' official Twitter handles, such as Citizen TV, while disseminating news content. This hashtag campaign (1) further led to the cancellation of a US$1 million deal the Kenyan government had sealed with CNN for tourism purposes, which was later reinstated after this discourse on Twitter, and (2) forced CNN's managing director to make a trip to Kenya, explain the network's 'global misrepresentation and misinformation' of the country, and offer his apologies. In Africa, hashtags on Twitter have been used by different groups to raise their voices for and against organizations. In South Africa, for instance, #FeesMustFall saw students protest on Twitter against increased university fees across the country, fees bound up with racial conflict and socio-economic status, until it became a national debate (Bosch, 2016). The campaign went viral, and the author posits that hashtags can be used to foster activism that could lead to policy change. In Zimbabwe, Moyo et al. (2022) found that Twitter stands out among all forms of media in shaping the reputation of corporate organizations, hence the need for management to set goals that promote good relations around corporate behaviour. In Uganda, hashtags have been used in various sectors, with some going viral during elections, such as #UgandaDecides in the presidential elections (Nalwoga, 2017). In Nigeria, #BringbackourGirls went viral after members of Boko Haram, a terrorist group, kidnapped 273 secondary school girls (Olson, 2016). The hashtag was an organized movement by Nigerians who used the online community, in this case Twitter, to set the agenda for media houses on matters concerning girls and women in the world. In light of all the developments mentioned above and in the subsequent paragraphs, this research uses dialogic communication theory, whose five principles, as outlined by Kent and Taylor (1998), guide organizations' successful dialogic engagement on the World Wide Web (Wirtz & Zimbres, 2018).


Kent and Taylor not only offer guidelines for organizations' public relations but also practical suggestions that can improve their interaction by involving their staff with the publics in the online environment when responding to organizational events. Because the theory applies well to social media, this research adopts it in examining the role of KOT on Twitter in reinforcing organizations' voices. Based on the assumptions of this theory, the following research questions were considered: (A) How have Twitter's digital affordances facilitated the discourse on the hashtags? (B) What is the impact of the dialogue by Kenyans on Twitter using the hashtags? (C) To what extent was the (mis)information shared on the hashtags useful to Kenyans on Twitter towards policy change? and (D) How have KOT and organizations used Twitter to avert misinformation and misrepresentation?

Theoretical Grounding

Dialogic Communication Theory, as outlined by Kent and Taylor (1998), Taylor, Kent and White (2001) and Kent, Taylor and White (2003), investigates the potential of the World Wide Web for building relationships between publics and organizations, and it therefore serves as the theoretical framework for this study. The five principles, as highlighted in McAllister-Spooner's (2009) survey review, are: the use of a dialogic feedback loop, which allows two-way communication in which the public can question an organization as it responds; the usefulness of information, whereby organizations are urged to provide the public with all necessary information at all times; ease of use of the interface, whereby the platform used for this discourse must be easy to access and interact with; conservation of visitors, whereby sites ought to provide features and links that offer value for visitors' time; and the generation of return visits, whereby a relationship is created that encourages continued engagement. These principles have enabled interactive engagement and increased communication effectiveness across organizations, achieving rich relationships. First, this study seeks to understand the impact of the dialogue by Kenyans on Twitter using the hashtags (an explication of the principle of the dialogic loop—allowing feedback from audiences to be embedded in the public's tactic itself). Second, it shows the extent to which the (mis)information shared on the hashtags was useful to Kenyans on Twitter towards policy development or change (the principle of usefulness of information). Third, it shows how Twitter's digital affordances facilitate the discourse on the hashtags (the principle of ease of use of the interface) and, fourth, it examines how Kenyans and organizations are using Twitter to counter misrepresentation and misinformation.



Methodology

Data for this study were collected using Python, a general-purpose programming language widely used for harvesting data from SNSs through hashtags (Jeswara, 2022). SNS data are considered a powerful source of information because they can provide a near real-time outlook on both social processes, such as politics and current events, and natural processes, including weather events. The data elicited in this study are text messages, reviews, images and other data posted online about a product, event or organization (Balducci & Marinova, 2018). Twitter data also provide data from media, individuals, organizations, officials and other members of the public on these social issues (Earth Lab, 2018). The hashtags #ThreadStorm vs #KerocheBreweries, #HotBedOfTerror vs #HotBedOfChampions and #TheMagicalKenya, #SafaricomDisobeyscourt vs #TwendeTukiuke and #MissionToRescue, #PigaFirimbi, and #OccupyPlayground vs #ShuleYangu and #ListOfShame were used to collect the tweets. The selected hashtags were trending on misinformation in Kenya and were therefore suitable for the study. A rough sketch of this collection step is given below. The tweets were entered into an Excel spreadsheet and imported into NVivo, a computer-assisted qualitative data analysis software, for further analysis. The analysis sought to unearth the thematic issues addressed by the tweets, which were classified under the broadly recurring issues in the data as guided by the research questions (QSR International, n.d.; Parsons et al., 2015). Findings from these data provide important information, as will be seen below.
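As a rough illustration only, the sketch below shows how tweets for a list of hashtags might be harvested with Python and written to a CSV file for import into a spreadsheet or NVivo. It assumes access to Twitter's documented API v2 recent-search endpoint and a bearer token; the token placeholder, hashtag list and file name are illustrative, and this is not a reproduction of the exact script used in this study.

```python
# Illustrative hashtag harvesting via the Twitter API v2 recent-search endpoint.
# Requires a valid bearer token; access levels and rate limits may vary.
import csv
import requests

BEARER_TOKEN = "YOUR_BEARER_TOKEN"  # placeholder; a real token is required
SEARCH_URL = "https://api.twitter.com/2/tweets/search/recent"
HASHTAGS = ["#KerocheBreweries", "#PigaFirimbi", "#TheMagicalKenya", "#TwendeTukiuke"]

def fetch_tweets(hashtag, max_results=100):
    """Fetch up to `max_results` recent tweets for one hashtag, excluding retweets."""
    params = {
        "query": f"{hashtag} -is:retweet",
        "max_results": max_results,
        "tweet.fields": "created_at,lang,public_metrics",
    }
    headers = {"Authorization": f"Bearer {BEARER_TOKEN}"}
    response = requests.get(SEARCH_URL, params=params, headers=headers, timeout=30)
    response.raise_for_status()
    return response.json().get("data", [])

def export_to_csv(path="tweets.csv"):
    """Write the harvested tweets to a CSV file for import into Excel or NVivo."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["hashtag", "tweet_id", "created_at", "text"])
        for tag in HASHTAGS:
            for tweet in fetch_tweets(tag):
                writer.writerow([tag, tweet["id"], tweet.get("created_at", ""), tweet["text"]])

if __name__ == "__main__":
    export_to_csv()
```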

Findings and Discussions

The following are some of the general findings and discussions arising from the four research questions of this study. #KerocheLimited was created by KOT to 'stand' with the CEO of the company against what they considered malice and political influence intended to bring the organization to its knees. As later indicated, the CEO had also alluded to the fact that external competitors had 'manipulated' KRA and the government to harass her and her team. She also created #threadstorm to ensure accurate and informed updates about the matter whenever there was any official communication she was to make.


#SafaricomDisobeysCourt was a hashtag created by KOT to raise their voices against the construction of the Great Wall Gardens houses, which had blocked the sewer line and hence its functionality within that area. Safaricom network users tagged along with this initiative to highlight their disappointment with the network provider's brand. Safaricom countered the hashtag with #TwendeTukiuke and #MissionToRescue, launching products and marketing campaigns intended to answer users' calls and questions accurately. #PigaFirimbi is among the few hashtags that have been deliberately created to blow the whistle on behalf of Kenyans about misinforming happenings in society; it urges the public to report any online content that needs verification and authentication so that a team of investigative journalists can repost it with confirmation of whether it was accurately posted or misinformed the public. #OccupyPlayGround and #ShuleYangu were created by KOT and a Kenyan activist to express the matter of land 'grabbing' and to counter any misinformation from #ListOfShame, which dragged in other agendas. Also, #HotBedOfChampions and #TheMagicalKenya were created to counter the global misinformation and misrepresentation embodied in the claim that Kenya had become a #HotBedOfTerror.

RQ 1. Ease of Use of the Interface and Facilitation of Discourse

Twitter's digital affordances were reported to be user-friendly. They enabled users to tweet about the prevailing issues and to communicate their positions on the subject matter under discussion. #KerocheLimited, #PigaFirimbi, #HotBedofTerror, #HotBedOfChampions, #OccupyPlayGround, #ShuleYangu (#ourschool), #SafaricomDisobeyedCourt, #TwendeTukiuke and #MissionToRescue kept users busy as they trended with texts, images and videos, which received likes, retweets, retweets with comments and even downloads. Twitter allowed Keroche Company Limited to state what they were going through to a vast reach. 'Common' Kenyans (in this chapter, KOT) were also able to communicate their positions on the issues unfolding under the hashtags created. However, the openness of Twitter's interface, which enables users to discuss issues in the industry that they were initially unable to discuss in public for fear of punitive government measures, also allowed misinformation to be tagged onto the original tweets.


Some dragged the sitting government's name and family into the matter. There were other instances where KOT dragged the CEO's family feuds into the crisis and discussed them on Twitter, and many instances where KOT dragged unnecessary tweets into the 'serious professional and correct discourse'. These included the sale of items and people tweeting simply to ride along the trending hashtags, which created more traffic on the tweets while misrepresenting the matter at hand. Others discussed political issues in the tweets, even though these were unrelated to the matter under discussion. Examples include the following tweets:

"The Kenyatta's want to take over all the Greener looming investments, the fight against #Keroche wasn't just over tax issues. Let's wait and see that brewery industry succeeded. #omanyala #BreakingNews https://t.co/eQhQXdbd9z

In 2013, Tabitha was embroiled in a property row with her co-wife Jane Wambui Muigai, with the latter demanding a share of Keroche, the biggest privately owned brewery in Kenya. #TabithaKaranja #Keroche https://t.co/s105PYQyCr

Lavish 3 Bedroom All Ensuite apartments available for Rent in Ruiru | Rent KES 65K &; 70K per month. To let/ view call 07******** or WhatsApp +254 7****** Location Ruiru Kihunguro behind Plainview Hospital

The majority of these sentiments misinform and deviate from the matter between the parties involved; they may have been politically motivated, and one could assume, without any facts, that the government was involved, or rather that the Kenya Revenue Authority (KRA) was being used to coerce the organization.

RQ2. Impact of Dialogic Loops and Feedback

The second research question sought to examine the dialogic loops and feedback mechanisms. The impact that (mis)information brings after the discourse was considered either negative or positive. Analysis showed that tweets were either direct tweets or retweets. Others seemingly came from the same location, indicating that a user could have impersonated others and used the same account to retweet.


The main tweet about a specific issue was retweeted by several KOT users, indicating that a simple tweet has the capacity to be blown up into something far more widespread. Some tweets about the organizations were intended to celebrate the change brought about by the intensity of the discourse. Users of direct tweets came onto Twitter and tweeted directly about #KerocheLimited, #OccupyPlayGround, #HotBedOfChampions, #SafaricomDisobeysCourt and #TheMagicalKenya. Some tweets were original tweets, mostly from the organizations' official Twitter accounts. Other tweets, for example, came from the Keroche CEO's handle, thanking KOT for standing with her and the organization she represented. Still others discussed the negative actions that users claimed to 'have known' Keroche for, creating a loophole that brought out negative perceptions of companies. Some tweets showed solidarity and sought the contribution of fellow Kenyans; some showed the contribution of Kenyans to the dialogue; others were directed at showcasing the need for Kenyans to come out and argue the case for Keroche. Others began manipulating users not to buy products produced by companies owned by the first Kenyan president's family, allegedly associating them with the matter at hand. Equally, users expressing their frustrations with Safaricom urged Kenyans to boycott its services and move to competitors. Kenyans thus used Twitter to create room for discussing pertinent issues. In doing so, Kenyans, for example, became critical about the context of Keroche and KRA and questioned the case at hand. Some thought that Keroche was using the issue of employees as a scapegoat to avoid paying taxes. Others turned vulgar and expressed their frustrations by simply tweeting their opinions on these matters:

Sometimes we pretend to be intellectuals and yet our minds are just fully of s***t. How does it make sense to @KRACorporate @KRACare to close down #keroche while you can waiver tax with conditions.#NCBA got merger waiver. 'Buy Kenya build Kenya my a**' we all know who owns NCBA

Similarly, tweets on #OccupyPlayGround dragged other individuals into matters different from what was being discussed. If inaccurate, this could ruin an individual's career:

#AmKenyan I can't. I feel sorry for her. She did not think #OccupyPlayground was her business Chase has woken her up https://t.co/tMSR5AJGHj


This is to say that the majority of the tweets did not speak to the matter under discussion, which in turn affected the individuals and organizations that were mentioned and dragged into these hashtags. Some of the tweets analysed showed that the organizations mentioned had to create campaigns to rally behind what they stood for and what they represented. The fact that some tweets highlighted issues needing change did not mean that these were real issues that needed attention; some were competitor-motivated, and others simply went off the cuff.

RQ3. Usefulness of (Mis)information on Twitter to Policy Change

KOT contributed heavily to the discourses on all the Twitter hashtags used in this study, with different opinions and different goals, and the hashtags trended once they were created. In the Keroche case, for example, the company's CEO sent out communication on Twitter appreciating KOT for contributing, to a large extent, to the rescue journey. There were also various tweets carrying criticism by Kenyans of the government's actions, discussed broadly in terms of the economic and political conditions prevalent in Kenya. However, other KOT users misused the online space and misinformed the public. For example:

"Eti makosa ya mama wa Keroche ni kwa sababu yeye amekataa kuenda Azimio" (roughly, "Supposedly the Keroche lady's offence is that she has refused to join Azimio.") Speaking during a campaign rally in Kisii County, Deputy President William Ruto reveals why Keroche Breweries has been shut down again

The public went on to critique the government's actions on Keroche and to question existing policies. They also categorically questioned the various policies the government has put in place for business people; but alongside the critique, users misinformed the public that the government had planned to fight 'Greener looming investments', mentioning some Danish companies, and that this was all about a state capture commission. Kenyans appeared to look out for information that could implicate the government's actions, stating that they know everything in Kenya happens for a reason:

I have been looking for this tweet @MutahiNgunyi #keroche it is scripted. Someone also hinted the first family want to venture into alcohol manufacturing pale Mai-mahiu. https://t.co/UqStWegKRy


#OccupyPlayGround saw the activist Boniface Mwangi and KOT shape the matter with #ShuleYangu, countering #ListOfShame, which had deviated from and politicized the matter. This pushed Kenya's then interior minister, the late Joseph Ole Nkaissery, to visit the school and apologize to the pupils whom police had 'mishandled' by throwing teargas at them when they headed to their playground, which had been 'grabbed'. The entire Kenyan mainstream media, as well as The Guardian, published articles that arose out of these hashtags but carried accurate information about the matter. The majority of tweets from the hashtags expressed the feeling that Kenyans should always stand up to reclaim that which has been taken away from them or used to mislead them, provided they courageously stand up for those matters. #MagicalKenya as well as #HotBedOfChampions went viral, to the extent of being used by local media houses' official Twitter handles, such as Citizen TV, while disseminating authenticated news content about the matter. These hashtags further forced CNN's managing director to make a trip to Kenya, explain the network's 'global misrepresentation and misinformation' of the country, and apologize.

RQ4. KOT and Organizations' Use of Twitter to Avert Miscommunication and Misrepresentation

Kenyans also used Twitter to avert miscommunication, enabled by the creation of room for discussing the pertinent issues. Africa Uncensored started an initiative to analyse a trending piece of content, whether a meme, GIF, video, screenshot or picture carrying suspicious information, and expose it on its official Twitter account and website with a FALSE/TRUE/FAKE stamp. Five hundred and two misinformed tweets were harvested and analysed. The majority of them indicated that several accounts could be operated by one person to tweet and retweet several times. Other accounts carried the same images of renowned activists or personalities, thereby manipulating other users into retweeting without even checking the content being shared. Through the #PigaFirimbi hashtag, a particular tweet reported as trending and purporting misinformation and fake news can now be analysed, and the misinforming tweets exposed by reposting them. Africa Uncensored uses the hashtag to ensure that Kenyans share any tweet that looks malicious or inaccurate so that its investigative team can analyse it and post it back.


RT @AfUncensored: The Safaricom statement on Rigathi Gachagua that's going round is FAKE. Check out what our @pigafirimbi team found out! RT @pigafirimbi: The data on internet penetration is MOSTLY TRUE. #PigaFirimbi https://t.co/DRNGoBAiOi

Safaricom, too, noticed with #SafaricomDisobeysCourt that KOT had taken advantage of the situation to express their dissatisfaction with several products, and the majority of the tweets highlighted issues users were facing. The company therefore went ahead and started a campaign with #TwendeTukiuke and #TheRescueMission that saw its CSR programmes take centre stage in promoting talent. Regarding the crisis at Keroche, tweets reported the kind of damage the crisis had caused, with the misinformation including claims that 400 staff were being laid off, among other issues, due to the battle with KRA:

KEROCHE BREWERIES is set to lay off over 400 employees as it battles its recent closure by the Kenya Revenue Authority (KRA) over tax arrears. "The #staff at #Keroche will be joining millions of other #jobless #Kenyans as #economic #conditions continue to #worsen and push more Kenyans to #misery," she added.

The CEO, Tabitha, used this avenue to paint an actual picture of the kind of losses the organization was making and, instead of letting misinformed statistics go viral, gave the litres of beer in stock and what they were worth. Most of the content contained in organizations' countermeasures was devoted to stating facts. KOT and Keroche made use of Twitter to counter the misinformation going round on the platform. Tabitha used #ThreadStorm to explain to Kenyans what the actual standoff between Keroche and KRA was about. #HotBedOfTerror was largely countered by #HotBedOfChampions and #TheMagicalKenya, hashtags that saw Kenyans shape the matter by portraying their country as rich in wildlife and cultures, with clean cities and peaceful, tranquil adventures, as opposed to the purported claim that it was a region heavily hit by terrorists.

Conclusion

Kenyans on Twitter have taken advantage of digital communication platforms by talking and texting, sharing, tweeting and retweeting on Twitter through hashtags that have shown huge reach locally and internationally. However, there is a huge risk to organizations, created by online communities, that has not been sufficiently examined, and this is misinformation.


Campaigning against misrepresentation and misinformation while reinforcing organizations' voices is therefore a crucial element. That is why Kenyan organizations have also ridden on such initiatives to ensure they counter any kind of misinformation that arises. Organizations' responses to misinformation should be guided by the rise of cam jobs, inaccurate political claims, global misrepresentation, and social and economic-related issues, among others, as shown in the findings. This study contributes to the limited research in the global south on misinformation and deepfakes, showing the different measures that have been taken to avert them. It also contributes to communication as a discipline. I would recommend comparative studies of hashtags from different African countries, examining how organizations like Africa Uncensored have taken initiatives to counter misinformation with #PigaFirimbi, among others. This could give a bigger perspective on how misinformation is being countered amidst the exponential growth of social networking sites. One could also further study how misinformation is countered on social networking sites other than Twitter, on which this study was based.

References Ahmed, S., Madrid-Morales, D., & Tully, M. (2022). Social Media, Misinformation, and Age Inequality in Online Political Engagement. Journal of Information Technology & Politics, 1, 1–17. Al-Bulushi, S. (2019). # SomeoneTellCNN: Cosmopolitan militarism in the East African warscape. Cultural Dynamics, 31(4), 323–349. Balducci, B., & Marinova, D. (2018). Unstructured Data in Marketing. Journal of the Academy of Marketing Science, 46(4), 557–590. https://doi.org/10.1007/ s11747-­018-­0581-­x Bosch, T. (2016). Twitter and Participatory Citizenship: # FeesMustFall in South Africa. In Digital Activism in the Social Media Era (pp.  159–173). Palgrave Macmillan. Earth Lab. (2018, February 5). Use twitter social media data in python - an introduction. Earth Data Science  - Earth Lab. Retrieved August 6, 2022, from https://www.earthdatascience.org/courses/use-­data-­open-­source-­python/ intro-­to-­apis/social-­media-­text-­mining-­python/ Jeswara, S. (2022, January 16). Social media harvesting using Python. Medium. Retrieved August 6, 2022, from https://medium.com/analytics-­vidhya/ social-­media-­harvesting-­using-­python-­a94f3a4d7baa


Langmia, K. (2016). Social Media “Teleco-presence” theory of identity. The Journal of Social Media in Society, 5(1), 265–290. Liberman, C. J. (2016). The Application of Traditional Social Network Theory to Socially Interactive Technologies. Social Networking: Redefining Communication in the Digital Age, 25-44. Lovejoy, K., & Saxton, G. D. (2012). Information, Community, and Action: How Nonprofit Organizations Use Social Media. Journal of Computer-mediated Communication, 17(3), 337–353. McAllister-Spooner, S.  M. (2009). Fulfilling the Dialogic Promise: A Ten-year Reflective Survey on Dialogic Internet Principles. Public Relations Review, 35(3), 320–322. Moyo, T., Proches, C. G., Mutambara, E., & Singh, U. G. (2022). The Relationship Between Twitter and Top-Level Management in Improving Corporate Reputation Behavior in the Telecommunications Industry in Zimbabwe. Gender and Behavior, 20(1), 18852–18882. Nalwoga, L. (2017). Examining Agenda Setting Effects of Twitter Users during the 2016 Uganda Presidential Election. Newman, Nic, Fletcher, Richard, Schulz, Anne, Andi, Simge, & Nielsen, Rasmus Kleis. (2020). “Reuters Institute Digital News Report”. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2020-­06/DNR_2020_FINAL.pdf Nothias, T., & Cheruiyot, D. (2019). A “hotbed” of Digital Empowerment? Media Criticism in Kenya Between Playful Engagement and Co-option. International Journal of Communication, 13, 24. Onwumechili, C., & Amulega, S. (2020). Digital Communications: Colonization or Rationalization? In Digital Communications at Crossroads in Africa (pp. 23-40). , Cham. Parsons, S., Atkinson, P. M., Simperl, E., & Weal, M. (2015). Thematically analyzing social network content during disasters through the lens of the Disaster Management Lifecycle. Proceedings of the 24th International Conference on World Wide Web. https://doi.org/10.1145/2740908.2741721 QSR International. (n.d.). Approaches to analyzing Twitter data. NVivo 11 for Windows Help - Approaches to analyzing Twitter data. Retrieved August 6, 2022, from https://helpnv11.qsrinternational.com/desktop/concepts/ approaches_to_analyzing_twitter_data.htm Tully, M. (2022). Everyday News Use and Misinformation in Kenya. Digital Journalism, 10(1), 109–127. Wamuyu, P. K. (2020). The Kenyan Social Media Landscape: Trends and Emerging Narratives, 2020. Wirtz, J. G., & Zimbres, T. M. (2018). A Systematic Analysis of Research Applying ‘Principles of Dialogic Communication to Organizational Websites, Blogs, and Social Media: Implications for Theory and Practice. Journal of Public Relations Research, 30(1-2), 5–34.

CHAPTER 10

Akata Night Masquerade: A Semblance of Online Deepfakes in African Traditional Communication Systems

Unwana Samuel Akpan

Introduction

Some scholars have demonstrated and maintained the notion that mother Africa is the bedrock and origin of modern humanity, and the wellspring of ideas for many modern Western inventions and creations (Alagoa, 1966; Stover, 1984; Ugboajah, 1985; Wilson, 1988; Johanson & Edgar, 1996; Ansu-Kyeremeh, 2005; Gamble, 2007; Asante, 2018; Akpan, 2019). Within the framework of this chapter, the modern deepfake is indisputably one such creation whose roots and closest likeness can be traced to the West African artistic creation of the Akata night masquerade. The United Nations Educational, Scientific and Cultural Organization's (UNESCO) list of World Heritage sites shows that 79 of the 878 listed sites are domiciled in Africa, meaning that roughly 9% of all UNESCO World Heritage sites are native to the continent (UNESCO).

U. S. Akpan (*) University of Lagos, Lagos, Nigeria © The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 K. Langmia (ed.), Black Communication in the Age of Disinformation, https://doi.org/10.1007/978-3-031-27696-5_10


Moreover, a careful study of the Commission for Africa's (2005: 26) report reveals that many Western creations have their roots in Africa's rich cultural heritage. In a similar vein, Oliver Duff, editor of The Independent (UK), seeking to dispel the West's one-sided assessment of Africa's rich cultural heritage in a 2004 article entitled "The Rich Art of Africa Goes on Show to Dispel 'Caricature' of a Dark Continent", avers that no right-thinking scholar would dismiss the similarities between Western creations and Africa's age-long traditional creations. These observations show that the African continent can be viewed as a traditional and cultural melting point and the cradle of the modern man, and of humanity as a whole. For instance, before Edmund Burke in late eighteenth-century England advanced the idea that the media is the watchdog of society, monitoring the activities of government and exposing them to the public (Norris, 2014), traditional African society had already designed the Akata night masquerade to function as the society's watchdog, monitoring people's nefarious activities and naming and shaming them at night. Likewise, before Siebert, Peterson and Schramm propounded the Social Responsibility Theory of the Press in their 1956 book Four Theories of the Press, the Akata night masquerade had been formally designed by traditional African societies to assume the social responsibility of indigenous media in news gathering and dissemination. The social responsibility theory of the press and the watchdog concept of the media would thus have been a fitting theoretical frame for this chapter; however, African-based theoretical frameworks better serve its explanation. Despite these glaring facts, the African continent is the only continent that some Western scholars have relentlessly painted in their writing as epitomizing hopelessness, abject poverty, destitution and backwardness. For instance, The End of Poverty, a book authored by Jeffrey Sachs, portrays the African continent as artless in the documentation of her artifacts, and this explains why an important African media-like artistic creation such as the Akata night masquerade was never studied as an African creation that has similarities with, and predates, the modern deepfake; hence this chapter. Sadly, some scholars such as Semple (1911), Lele (1981), Narayan et al. (2000), Alagoa (1966) and Obadina (2004), among others, seem to agree with the Western scholars' portrayal of Africa as culturally backward.


They describe Africa as artless in artistic documentation, poor in cultural and traditional heritage, and as having only non-material traditions of oral history. Most often, attempts by early African scholars to trace the semblance and similarities of Western creations to early African creations are characterized by variegated experiences (Bianchi & Boniface, 2002; Broadman, 2007; Dieke, 2000; Easterly, 2006; Gamble, 2007), and according to some Western scholars these claims of Africans' conceptual prowess most often cannot be proven or substantiated for lack of written, documented evidence (Hyden, 1980; Mercer, 2003; Reid, 2008; Saith, 2001). These views are not only factually incorrect and theoretically misleading, but practically unacceptable. This is why a "single story" is very dangerous, if we go by the postulation of Chimamanda Ngozi Adichie's 2009 TED Talk, especially in a bid to correct historical errors. Against the backdrop of such Western postulations, research to prove these similarities or semblances often becomes cumbersome, herculean, complex and complicated. That is why the veracity of African creations that bear striking similarities and semblance to Western creations should not be verified through the lenses of a Western ontology, axiology or epistemology, but through the eyewitness accounts of creators who are still alive, the beneficiaries of such inventions or art creations, and African theoretical perspectives. Therefore, this chapter explores the similarities between the invention and scientificity of the traditional African Akata night masquerade and the modern deepfake, showing that the masquerade is historically phenomenal and typical of the activities of today's deepfake. Quite a number of Western scholars across disciplines have ignorantly claimed at one time or another that Africans do not have documented histories. Some Western historians (Austin, 2012; Decker, 2010; Lorenz, 2011; McNeill, 1986; van den Bersselaar, 2004), Western sociologists (Balandier, 1963; Gamble, 2007; Munro, 1979; Ritzer, 2000) and Western anthropologists (Asad, 1973; Boahen, 1987; Horton, 1971; Mitchell, 1991) have all made this factual error in their postulations about documented historical facts about Africa, especially when it comes to African inventions that have similarities or semblance with Western culture. Some early Western scholars claim that certain Western creations are the first of their kind (Comaroff & Comaroff, 1992; Comaroff & Comaroff, 1997; Rabinow, 1989), never found elsewhere in the world, but a closer look at the annals of African societies reveals a great deal concerning semblances and technological similarities with most Western sociological creations. Early African scholars such as p'Bitek (1970), Wilson (1981) and Mudimbe (1988), and more recently Yager et al. (2007), Reid (2008) and Akpan (2019), have shown that most African artistic creations predate Western creations.


Therefore, the purpose of this chapter is threefold. First, it explores how the ancient indigenous communication system employed by the traditional Akata masquerade in Africa is similar to what obtains in the Western deepfake, and briefly shows how the two communication systems resemble each other. Second, it shows how, even in a media environment saturated with Western information technologies, traditional media still have a role to play in contemporary Africa. Third, it demonstrates the importance of combining the African and Western media systems, with the Akata masquerade serving as a watchdog in the society. Using this threefold analysis, the author dismisses the impression often created that there is no historical, traditional deepfake in Africa. Though some contemporary mass-mediated forms of communication are not native to Africa, traditional African societies once had practical, institutionalized and structured systems of communication used for expressing ideas among Africans, and these bore striking similarities to Western media systems. Proponents of African indigenous communication systems (Ansah, 1988; Ansu-Kyeremeh, 2005; Ascroft, 1969; Ginsburg, 1991; Hachten, 1971; Moemeka, 1984; Opubor, 1981; Ugboajah, 1985; Ugboajah, 1986; Wilson, 1987) have all argued that in what the West calls modern mass media systems, one can find a trace or semblance of what traditional African societies had already used as forms of communication. A good example is the Akata masquerade, which was a form of indigenous media. In the study and research of Africa, the indigenous communication dimensions of how Africans used structured communication systems to check ills in the society are often ignored and marginalized. Indigenous here, according to Wilson (2015), means the autochthonous social and cultural attributes and practices that characterized pre-colonial societies and that are still, in many cases, features of the social and cultural conditions of people across the continent. Wilson (1988) also avers that "in the field of communication, in particular, and more so in mass communication, indigenous forms and dimensions of communication are often dismissed as inconsequential, or only casually mentioned in the mainly eurocentric mainstream research", and that is why, until now, no scholar had deciphered that the deepfake, a threat in the modern information system, has an indigenous African antecedent.


Also, according to Wilson (2015), "the subject of indigenous forms of communication may not seem that significant in an age of complex, fast and sophisticated technologically mediated communication systems", but to a large extent it is, because the Akata masquerade reveals the deepfake to be its semblance.

Theoretical Perspectives

This article adopts and explores ubuntu, communalism, the Maatic theory and Afrokology, which are philosophical, epistemological and methodological approaches to theorizing communication from African perspectives, as they relate to the Akata night masquerade's communal activities. These serve as the theoretical framework and guide for this chapter. Conventional communication and media theories tend to be conceptualized, constructed and woven around Western ideology, or, better put, "white supremacy" and the assumption that "anything white is superior", to the detriment of conceptualizing communication and media theories from African perspectives. This notion has given birth to radical new kids on the block: a new crop of African scholars living in the United States and Europe who are now giving serious thought to theorizing, prioritizing and reconstructing the concept of African communication. They are beginning to return to the elements of African communication that the West discarded as obsolete or archaic, and to theorize from an Afrocentric angle. In fact, they have started recording striking paradigm, philosophical and thematic shifts in conceptualizing African-based and African-inspired communication theories, hinged on African concepts. In their latest work, "Rethinking African Communication Scholarship: Critical Perspectives on Research and Theory", Eddah M. Mutua, Bala A. Musa and Charles Okigbo note that there are growing interests among African scholars in theorizing from African perspectives: In Kehbuma Langmia's edited collection Black/Africana Communication Theory, multiple contributors provide original perspectives on African communication theorizing. For example, some theories developed include the Igbo ethnic communication style; Ujamaa communication theory; the Afro-cultural Mulatto theory of communication; and venerative speech code theory. (Mutua et al., 2022)

These theorizations based on African perspectives have sparked renewed interest in revisiting concepts such as the Western deepfake and tracing its semblance to the African Akata night masquerade.


Mutua et al. (2022) admonish that "ultimately, efforts to theorize African communication must welcome the larger African project to challenge Western practices about how the African story is told", and this article brings to the fore an untold African story, written by an African who was born and grew up in a rural area of West Africa and witnessed the Akata night masquerade first-hand, and who writes to show that before the West invented deepfakes, the African continent already had its own version of them. Again, if Africans must tell their stories, we have to adhere to the call of Mutua et al. (2022) that "African Communication scholars must preserve the integrity of telling African stories". Therefore, this article digests, explores and sifts the semblance between the Akata night masquerade and the deepfake through the theoretical postulation of Mutua et al.'s (2022) analysis of four recent African communication theories and paradigms: African Communication scholarship has recorded paradigms, philosophies, and worldviews that serve as foundations for exploring African communication practices. Four such paradigms, philosophies, and worldviews are Ubuntu ("humanity", "humanness" and "humaneness"); communalism (the sanctity of authority, utility of the individual, supremacy of the community, respect for elders, and religion as a way of life); the African philosophy of Maat ("truth, righteousness, justice, order, balance, harmony, and reciprocity"); and Afrokology (a transdisciplinary approach to reimagine media and communication studies and to unite its practices with philosophical roots in Africa).

First, ubuntu is a South African sociological philosophy of "I live because you live". Ubuntu depicts "humanity", "humanness" and "humaneness", all aspects of the Akata night masquerade. As earlier stated, the Akata masquerade is displayed by humans, which speaks to "humanness", humans who feel the sense and need of "humaneness", namely that justice should be carried out in the community. Second, communalism, as postulated by Andrew Moemeka, portrays the spirit of brotherhood, placing the continual existence of the community above everything and everyone else, and the sanctity of the authority Akata enjoys in carrying out its functions unchallenged in the community. Third, the Maatic theory of communication, grounded by Molefi Kete Asante in the classical African idea of ethics and morality in ancient Egypt, depicts "truth", which the Akata night masquerade stands for in gathering and disseminating secret information to members of the community.


This is what fosters believability among the people. Lastly, "Afrokology", made popular by the works of contemporary scholars such as Dani Wadada Nabudere, Winston Mano and Viola Milton, postulates the need for the achievements of Africans to be recognized. This corroborates the view expressed in this chapter that a traditional deepfake was created by Africans long before the AI deepfake; there is therefore a need to recognize and document this fact. All of this shows how Akata, as a form of rural media, uses communication native to the rural people in discharging its duties, in line with the African theoretical perspectives of ubuntu, communalism, Maat and Afrokology.

Defining African Communication Systems/Indigenous Media

Research in African communication scholarship has produced many shades of meaning, from Africanfuturists, Afrofuturists, Afrokologists and African scholars, of what the concept of African communication is. Early research on indigenous modes of communication in Africa by scholars such as Sonia Restrepo-Estrada, Charlayne Hunter-Gault, Leonard William Doob, Colin Cherry, Ronald Escarpit, William A. Hachten, Dhyana Ziegler, Kwasi Ansu-Kyeremeh, Molefi Kete Asante, Des Wilson, Faye Ginsburg, Paul Ansah, Joseph Ascroft, Francis B. Nyamnjoh, Louise M. Bourgault, Alfred Opubor and Frank Ugboajah, among others, opened the floodgates of concepts and ideation for contemporary scholars of African communication systems such as Eno Akpabio, Winston Mano, Abigail Ogwezzy, Kehbuma Langmia, Viola C. Milton, Unwana Samuel Akpan and others. The pioneering efforts of scholars in African traditional communication systems have emboldened and lit the path of these modern scholars. These scholars have postulated a plethora of meanings of African communication, starting with Charlayne Hunter-Gault's (2018) assigning of signs and meaning, to Leonard William Doob's (1961) ways in which Africans shared meanings. For Colin Cherry and Sonia Restrepo-Estrada (Cherry, 1978), it is hinged on tradition, while Ronald Escarpit (1968) avers that it is the traditional symbols of communication that never lose relevance. William A. Hachten's (1971) familiar sound of meaning agrees with Dhyana Ziegler and Asante's (1992) definition of a shared meaning rooted in traditional elements and symbols.


Kwasi Ansu-Kyeremeh (2005) sees it as venue-oriented communication, which is akin to what Stover (1984: 68) calls community media. Molefi Kete Asante (2018) sees it as a structured historical system of communication, while Des Wilson (1988) looks at it as a taxonomical, institutionalized communication based on traditional values and meaning. Faye Ginsburg (1991) argues that it demonstrates congruence with the structures of the local culture for meaning's sake, and Paul Ansah (1988) avers that it is based on African normative values. Joseph Ascroft's (1969) notion of the ideation of indigenous meanings synchronizes with Francis B. Nyamnjoh's (2005) definition of indigenous dimensions in speech and symbols. Louise M. Bourgault (1990) looks at it simply as indigenous forms of communication, and Alfred Opubor (1981) says it is the characterization of meaning that is native to the people. Akpan (2019) says African indigenous communication is traditionally transactional in native signs, symbols and artefacts for shared meaning. Milton and Mano (2021) simply believe that communication at this level hinges on culture and value; on the other hand, Frank Ugboajah (1985) sees it as oramedia, communication based on communal values and artefacts. For Akpabio (2003), it is communication based on common traditional ideologies. In the light of all these definitions, whether classified under African communication systems, indigenous communication, indigenous media, folk media or oramedia, the Akata night masquerade is clearly a customized and clannish form of communication. This explains why Gillies (1989) posits that: It is a scholarly theory and a social axiom to say that communication is culture, and that knowing a culture requires knowing its systems and media of communication. Culture is defined by communication systems [and] since culture is a system of shared beliefs, attitudes, values and behaviour, then clearly there is a communication medium involved. (p. 3)

This is obviously what happens in the case of the Akata night masquerade as a traditional medium: the people use it as a recognized, institutionalized medium for conveying and revealing secret activities and information carried out in the dark. However, the Akata medium was later abused through blackmail and slander, as in the case of the modern deepfake. Gillies (1989), citing Ferguson and Ferguson (1980: 4), goes further to stress that "conversely, where there is a communication system, the users will in some measure constitute a


culture", just as the African people use created cultural traditions such as the Akata night masquerade for communication purposes. Similarly, Wilson (2015) holds the same notion: Communication in a pre-literate society can be seen in line as communication in a traditional system and cannot be discussed in isolation from the culture of that society. Rural communication among the people of Southwest and Northwest provinces of Cameroon has some semblance with what obtains in other ethnic groups. Communication, be it traditional, rural or modern involves sending a message (encoder) and receiving a response (decoder) in a particular way (feedback). What constitutes traditional communication is today a hybrid from other fields of study such as religion, anthropology, mythology and an amalgam of multifarious cultural practices, which have become standard. (Wilson, 2015: 282)

In another of his texts, Wilson (2008) maintains that “traditional communication is a mixture of social conventions and practices”, such as the Akata masquerade which has “become sharpened and blended into veritable communication modes and systems”.

Deepfake as a Modern Concept

The popular term "deepfakes" refers to deep- and hyper-learning-based techniques that can produce fake images and videos by swapping someone's real face with the face of another person. According to Google Trends, the term "deepfake" first appeared in searches in early 2018; it generated public discourse and has since gained public attention around the world (Wahl-Jorgensen & Carlson, 2021, p. 1). The term is reported to have been coined by an anonymous Reddit user who went by the name "deepfakes" (Hao & Heaven, 2020). The user deployed Artificial Intelligence to graft the well-known faces of famous actresses onto pornographic videos and was blocked from the platform as a result of this practice. On realizing he had been blocked, he posted a detailed graphic guideline on another platform on how to create such videos. As a result, the technique quickly gained the attention of more than 25,000 Reddit users (Panyatham, 2021). This single technological creation has given birth to the millions of doctored videos circulating online today. The technology thrives on a face-swapping process in which a source video or picture is used to manipulate a target video, a deeply disturbing cultural and social phenomenon that has marred the ethics, standards and credibility of journalism practice and damaged the reputations of its victims (Kietzmann et al., 2020, p. 137).


The key goal of those who create deepfakes is to make their manipulated pictures and videos as acceptable and believable as possible to the generality of the people (Thies et al., 2016, p. 2387). Recently, the deepfake has emerged as a new digital cultural threat in several sectors, garnering attention from researchers, policymakers and media consumers. Nowadays it takes no special expertise to alter original media content with the aid of digital technology, often to the point where the alteration is unrecognizable. With advanced software, pictures, sound and moving images can be manipulated or even created completely from scratch. Face-swapping has always been the trick deepfake creators use to win the attention, belief and acceptance of millions of media consumers around the world. Though the manipulation of media content has existed for as long as the media themselves, recent advances in Artificial Intelligence (AI) have widened the range of tampering techniques available to deepfake creators. With the press of a button, pictures, sound and moving images can now be altered or even generated entirely by computation. Those whose stock-in-trade is deepfakes have the technological skill to produce audio and video clips that realistically look and sound like people doing or saying things they never did or said, offering unprecedented opportunities for deception. The viral circulation of altered media content in the form of fake videos, audio and images has roughly doubled in the last couple of years, thanks to advanced digital manipulation tools and techniques that make it far easier to generate fake content and post it on social media platforms. Those who engage in deepfakes might see it as fun, but the practice raises ethical and credibility concerns, since it can be used for blackmail, intimidation, sabotage, ideological influencing and inciting violence. It therefore has broader implications for trust and accountability.
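To make the face-swapping idea described above more concrete, the sketch below illustrates, in deliberately simplified form, the shared-encoder, dual-decoder autoencoder design commonly associated with early face-swap deepfakes: one encoder learns a common facial representation, each decoder learns to reconstruct a single identity, and the "fake" is produced by routing one identity's encoding through the other identity's decoder. This is a minimal sketch, not the implementation of any particular tool; the image size, layer sizes and helper names are assumptions chosen only for readability.

```python
# Minimal sketch (assumptions: 64x64 RGB face crops, toy layer sizes) of the
# shared-encoder / per-identity-decoder idea behind early face-swap deepfakes.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, 256),                            # shared latent code
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder, decoder_a, decoder_b = Encoder(), Decoder(), Decoder()

def training_step(faces_a, faces_b, optimizer):
    """One joint reconstruction step: each decoder learns to rebuild its own identity."""
    loss = (nn.functional.mse_loss(decoder_a(encoder(faces_a)), faces_a) +
            nn.functional.mse_loss(decoder_b(encoder(faces_b)), faces_b))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def swap_a_to_b(faces_a):
    """The 'fake': encode identity A, then decode with identity B's decoder."""
    with torch.no_grad():
        return decoder_b(encoder(faces_a))
```

In practice, production-grade pipelines add face detection and alignment, adversarial or perceptual losses, and blending of the swapped face back into the original frame, but the routing trick above is the conceptual core of the face-swapping the chapter describes.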

Systems of Indigenous Communication in Africa

In Africa, indigenous media are structured under traditional institutions, and they played critical social and political roles in rural Nigerian communities prior to colonialism and Western democracy.


Today, most of the roles once played by traditional institutions have been taken over by the government, so that rural people can feel the presence of government. But according to Wilson (2008), "traditional practices still find fertile grounds in both rural and urban dwellers who, even though exposed to the realities of new religions and western civilization, still hold tenaciously to those aspects of culture that serve their inner identities." This is all the more reason why even highly educated Africans understand that, regardless of the space and time technology has bridged, reliance on traditional forms of communication such as the Akata night masquerade for information gathering cannot be underestimated. These educated Africans have come to appreciate the important position of the Akata night masquerade in secret information gathering and dissemination, because its operations span time and are situated in institutionalized tenets. The plethora of concepts Africans use in understanding their communication space and time underscores the prominent position of certain institutionalized elements. One such example is the Akata night masquerade, which played a key role in African society in those days. The systems of traditional media and communication in Africa were highly institutionalized and structured, and built on respect for traditional values. The believability of the medium was sacrosanct to its survival and to society's functionality. For instance, in almost every rural community in Ibibio, Annang and Efik lands there were different instruments and elements used in communicating. In Yorubaland, the talking drum is an instrument of communication, while in the Annang, Ibibio and Efik lands, the fresh leaf of a palm tree, popularly called "eyei", is an element used in communication. When eyei is tied to a vehicle, one automatically knows that the vehicle is conveying a corpse, and when it is tied around a piece of land, it means the land is under dispute and all parties are restrained from entering it. Moemeka, in Akpabio (2003: 5), categorized indigenous media in Ibibio land into two forms, traditional and modern, with examples such as social forums, the town crier, the village market, the village school, the newspaper and radio. Also, Wilson (2015) tabularized the modes used in indigenous communication in Ibibio, Annang and Efik lands, to which the Akata masquerade belongs (Table 10.1).


Table 10.1  Modes of indigenous Nigerian communication media

S/NO  Mode of communication  Media/Channels used
1.    Instrumental           Idiophones – wooden drum, metal gong, ritual rattle, woodblock, etc.; Membranophones – skin drum; Aerophones – whistle, ivory horn, reed pipes, etc.; Symbolography – bamboo rind, nsibidi, tattoo, chalk marks.
2.    Demonstrative          Music – songs, choral/entertainment music, griots, dirge, elegy, ballad, pop, rap, spiritual; Signal – canon shots, gunshots, whistle call, campfire, drum.
3.    Iconographic           Objectics – charcoal, kolanut, white clay, egg, beads, flag; Floral – fresh palm frond, plantain stems, boundary trees.
4.    Extramundane           Incantatory – ritual, libation, vision, prayer; Graphic – obituary, in memoriam.
5.    Visual                 Colour – white cloth, red cloth, yellow, etc.; Appearance – dressing, hairstyle, body language.
6.    Institutional          Social – marriage, chieftaincy, festival; Spiritual – shrine, masquerade.

(Adapted from Wilson, 2015)

It is clear from the table that the Akata masquerade is an institutionalized and categorized form of traditional media in the Efik, Annang and Ibibio lands, one that performed various social and political functions in these Nigerian communities prior to colonialism and the arrival of Western media. The functions and roles it shares with the deepfake are enumerated shortly. Even today, the Akata masquerade plays a critical and functional role in society, and its semblance with the deepfake is quite striking.
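For readers who prefer to think of Wilson's classification computationally, it can be treated as a simple lookup structure mapping each mode to its media groups. The sketch below is purely illustrative and follows the groupings in Table 10.1; the dictionary `INDIGENOUS_MODES` and the helper `mode_of` are hypothetical names introduced here, not part of any existing framework.

```python
# A minimal sketch of Table 10.1 as a lookup structure. The groupings follow
# the table above; the names INDIGENOUS_MODES and mode_of are hypothetical.
INDIGENOUS_MODES = {
    "Instrumental": ["idiophones", "membranophones", "aerophones", "symbolography"],
    "Demonstrative": ["music", "signal"],
    "Iconographic": ["objectics", "floral"],
    "Extramundane": ["incantatory", "graphic"],
    "Visual": ["colour", "appearance"],
    "Institutional": ["social", "spiritual"],
}

def mode_of(media_group: str) -> str:
    """Return the mode a media group belongs to, or 'Unclassified' if unknown."""
    for mode, groups in INDIGENOUS_MODES.items():
        if media_group.lower() in groups:
            return mode
    return "Unclassified"

# The Akata masquerade sits under the spiritual media group, i.e. the Institutional mode.
print(mode_of("spiritual"))  # -> Institutional
```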

Origin of Akata Night Masquerade

Early scholars of traditional masquerades traced the origin of the Akata night masquerade to the Ejagham-speaking peoples of Cross River State (Onor, 1994; Talbot, 1912). According to Essien and Oqua (2020), "one popular song tends to support this opinion: 'Ekpri Akata oto Ekoi - This small Akata comes from Ekoi.'" It is now widely accepted that the Akata night masquerade is native to the Ejagham people. The traditional arts and skills of the masquerade were admired by artisans from the Annang, Efik and Ibibio lands, who, upon their return, transported the Akata art to their homelands.


Due to their admiration of the night masquerade, they offered themselves for initiation into the Akata cult. For the Ejagham and the Efik people, the Akata night masquerade was both a cult and an art, and only the initiates knew one another. But for the Annang and the Ibibio people, the Akata night masquerade was not a secret cult; it was a strictly secretive masquerade whose art is known only by its members.

Akata as a Form of Indigenous Media

No doubt, one can find volumes of research by erudite anthropologists, dramatists, and communication and media specialists on Annang, Ibibio and Efik masquerades. For instance, scholars such as Amoury Talbot (1912), Edet Ekpo (1968), Oyin Ogunba (1978), Des Wilson (1981), Inih Ebong (1989) and Conrad Phillip Kottack (2004), among others, have carried out pioneering research on masquerades native to the Annang, Ibibio and Efik lands, into which the Akata masquerade falls. In those days, Africans used the things around them in communicating, which corresponds with Ogwezzy's (2008) observation that communication in Africa is iconographic, while Wilson believes it is institutionalized. The Akata masquerade was not a mere masquerade; its existence hinged on societal functionality. The people in the rural areas of Annang, Ibibio and Efik lands depended on the Akata masquerade to furnish them with hidden information about the things people did in the dark. The Akata night masquerade is a form of what Ugboajah describes as oramedia. Forerunners of indigenous communication, such as Ugboajah and Hachten, conceptualized their notions of indigenous media: for Ugboajah it is "grounded on indigenous culture produced and consumed by members of a group" (cited in Akpabio, 2003: 3), while Hachten (1971) conceptualized it as "informal channels of communication". For Edeani (1995), oramedia denotes media represented by a "diffusion network of lower chiefs, age groups, the marketplace, women's organizations, traditional priests, stall heads, village heads and the indomitable village town crier". Furthermore, Soola (2006) avers that "oramedia" is an age-long form of communication which predates the contemporary or modern media of communication; in essence, it refers to the "indigenous means of communication".


The Akata masquerade of the Ibibio, Annang and Efik is a traditional form of indigenous media that serves the community in secret information gathering and dissemination, and its semblance with the deepfake is quite striking. According to Akwang (2010), "the most visible and the most ubiquitous forms of Ibibio traditional theatre are embodied in its robust masquerade tradition." Ebong, in Akwang (2010), gives a graphic and theatrical make-up of Akata: A massive, pyramidal, wooden enclosure covered with thick, bright clothes, raffia, and palm fronds known as Akata. It has a carved wooden mask usually placed at the peak of the triangle. It also has a guttural speech. Its diminutive version is known as Ekpri (meaning small or junior) Akata. It employs carved or moulded animated wooden puppets that use gestures and twisted speech to tell stories of atrocities that people committed in the dark.

The "twisted speech" (Akwang, 2010) used by the Akata night masquerade was a kind of traditional technological innovation which the initiates used in those days to disseminate information to members of the public at night so that their real identities would not be revealed. This, in essence, is what the modern deepfake does by tampering with someone's voice, or by swapping someone's real face or voice with that of another person. Essien and Oqua (2020) capture Akata's altered, theatrically mimicked voice thus: Akata has a 'mother' masquerade, Eka-Akata, a magnificent soloist, presented in day time and in full view of everybody. Its musical accompaniment is very inviting and attracts many, both old and young, who often accompany it about. The dance steps involve simple side movements and strutting – reminiscent of a peacock.

Akata is likened to today's news broadcast on radio or television: the rural people would stay awake late at night to hear the cans of worms the Akata masquerade would roll out about people in the village. The Akata masquerade is a revelatory and satirical means of communication, used to bring to light the atrocities and social misdemeanours of individuals and groups within a society. Akata's reports are always investigative and damaging to people's reputations, and because the medium is institutionalized, people tend to believe them.


As a young man who grew up in the rural areas of Ibibio, Annang and Efik lands, I observed that Akata displays its performance at night between about 1:00 a.m. and 4:00 a.m. In the course of the display, the masquerade dances and dishes out revelatory songs about the evil deeds of unscrupulous individuals or groups in the community. During the display, the stem of a plantain tree is pitched in the offender's compound. The plantain stem is a form of iconographic communication from the Akata masquerade to the offender, or to those involved in the act, to desist from heinous acts or be ready to be disgraced publicly. The rural West African people regarded Akata as the eyes and mouthpiece of the society, exposing the evil doings of unscrupulous elements in the community.

The Purposes Akata Media Served in the African Community

The basic traditional functions of mass communication have always been to inform, educate and entertain. However, scholars such as Harold Lasswell, in his 1948 paper "The Structure and Function of Communication in Society", and Charles Wright (1964) expanded the functions of mass communication to include surveillance of the environment, transmission and dissemination of information, correlation of the parts of society and transmission of culture, functions which in turn shape or alter the attitudes and opinions of information recipients. The Akata night masquerade in those days essentially engaged in surveillance of the environment (looking out for deeds done in secret), correlation of the parts of society (putting the pieces of information together) and transmission of this information to members of the public. The Akata night masquerade served its community three distinct functions: to entertain, to inform and to reveal secret deeds in the community. Essien and Oqua (2020) opine that: Angbo-Akata is a masquerade performed by the Qua, Efut and Efik lingual groups in Calabar – the Capital of Cross River State, Nigeria (plate 1). As part of a sacred cult, the hallmark of Akata (as it is popularly called) lies in the entertainment it provides and the secret detective characteristics it possesses. When performed in the afternoon and while touring the communities, it exchanges pleasantries with prominent individuals. The voice is


disguised to avoid the masker’s recognition. Intermittently, it raises songs that captivate its hearers.

However, as it is a night masquerade that reveals secrets, people were scared that Akata could be used against them for things they did not commit. As it was a secret informer and a watchdog in the society, there were fears that people could be defamed. Again, Essien and Oqua (2020) aver that: Another exciting but feared aspect of Akata is its secret service report. It is exciting when secret escapades of members of the community are brought to the fore and lampooned. However, people still nurse fears because their own dealings could equally become a subject matter. In the course of lampooning, some members of the communities are actually named and their evil acts disclosed. Sometimes, however, the identities of the individuals concerned are not explicitly disclosed. But Akata has a way of describing the fellow(s) or giving other social pointers and indicators that, unmistakably, lead to the identity of the personality in question. Acts often tackled include illicit love affairs (fornication and adultery), theft, drunkenness, diabolism, etc. Before presenting such issues, Akata pretends to be consulting an oracle  – to give the impression that whatsoever it says is at the instance of the gods.

One might ask how the Akata night masquerade manages to conceal its identity in the community while the display is ongoing, looking at the fact that some people might not be asleep that time. Again, Essien and Oqua (2020) shed more light: Akata is highly averse to illumination. In the course of its display (especially in the rural areas), it beckons on people to put lit lanterns out. In an urban area like Calabar, forerunners have the task of putting out such lights. This phobia informs the presentation of Akata in times of the year when the moon has hidden itself from the earth. In an interview with Chief Ukpanyang Otu, the village head of Obubit Isong, OkurikangOkoyong in Odukpani Local Government Area of Cross River State, he disclosed that this aversion on the part of Akata is as a result of its desire to hide its identity (Personal Interview). To further conceal its identity, the masker wears an object that distorts the nose, which is responsible for Akata’s changed voice.


As we know, the activities of deepfake creators are shielded in secrecy, and so are the activities of the Akata masquerade. The Akata disguises its face and voice to defeat recognition, just as deepfake creators tamper with faces and voices.

Why Deepfake Is Not New in the African Society

Modernity comes with a powerful, colourful display of features and replaces older systems of operation, which makes its inventions and ideations very attractive and alluring. Some traditionally minded schools of thought can be parochial in their defence of preserving traditional ways of doing things in the society and of allowing the traditional status quo to remain. However, even the die-hard, dogmatic guardians and apostles of traditional preservation admit, in their sober moments, not only the existential reality of the dominance of modern technologies, but also that many of their cultures' primordial practices cannot match the reach and acceptability of modern technology. Researchers of deepfakes cannot shy away from the obvious fact that the new technologies that gave birth to the deepfake were not the first of their kind; rather, they should be viewed from the primeval perspective that a cultural semblance of the deepfake existed long ago in some African communities. This is the thrust of this chapter. In those days, each cultural setting had its own elements of communication that helped its people share meaning among themselves. They also created culturally based instrumentalities of social control that helped them administer their communities. Most writings from the West have traced the concept of the deepfake to the West, and this is factually incorrect. Whatever position anyone might hold about the so-called Eurocentric arguments that the concept of the deepfake emanates from the West does not automatically dismiss the fact that the concept can be traced to the Annang, Ibibio and Efik lands of the West African region. It is in line with this thought that this article critically appraises the semblance of the Akata night masquerade with the modern-day deepfake.

The Akata Night Masquerade as a Deepfake

The argument has always been that the Akata night masquerade is fair and impartial. That is why, once again, Essien and Oqua extensively assert that:


Angbo-Akata serves the purpose of restraining people from engaging in activities that are at variance with the society’s accepted norms and values. Akata makes secret investigations into the activities of individuals within the society. Its satirical inclination exudes social auras that serve to curtail deviant attitudes and make citizens governable. Angbo’s satires often sink deep into social marrows, and can be damaging especially in a rural setting, where everyone knows everyone else. Such lampooning often leaves indelible psychological marks that can stigmatize and confine people/families to subtle forms of isolation. An institution like marriage may then be contracted with strains, as members of the lampooned family may be subtly disdained. Akata does not dwell on the destabilization of society. Its ultimate aim is stabilization, social cohesion and progress. The masquerade serves as the community’s motivational force. Angbo-Akata is believed to possess the third eye and, as such, sees beyond the normal perception of humans. Apart from this, the Angbo is believed to have sojourned and experienced the decorum that exists in other communities. If it then pronounces a particular act inimical and calls for redress, the people are moved to action in order to weed-off social anomaly.

But a personal interview with ten (10) elderly respondents who were active members of the Akata night masquerade was an eye opener. One of the respondents boldly revealed: My son, as a young person then, we as members of Akata took pleasure in investigating peoples’ secret acts in the community, and revealed it at night. But much later, some of our members started collecting money from some people in the community who had chieftaincy title tussle, and promised to falsely accuse the innocent in acts these people did not commit, so as to tarnish their reputation before the people and the council of elders. This way, the person whose image has been smeared by Akata would withdraw from the race. It was a terrible situation. As a result, some of us left the group because its traditional tenets were compromised because of money.

All the respondents agreed that the Akata night masquerade deviated from its tenets and began attacking the innocent in its traditional night news reportage. As noted by the respondent above, the principal targets of Akata as a deepfake were famous personalities in the society, influencers, celebrities and politicians, just as it is such figures whose faces are transposed onto others without their approval in the Artificial Intelligence (AI) era (Pantserev, 2020). Deepfakes, just like the Akata night masquerade then, thrive on two key psychological elements: believability and acceptability.


Akata's believability and acceptability were hinged on tradition. This is further corroborated by Kietzmann and his fellow scholars in their studies (Kietzmann et al., 2020, p. 136). Deepfake technology has gained prominence, and millions of doctored videos, pictures and audio materials have flooded the internet in the last couple of years. Any social media user will vividly recall Obama calling Trump a "total and complete dipshit" (Peretti & Sosa, 2018); John F. Kennedy discussing the famous cartoon show Rick and Morty in a state address (Ekian, 2020); millions of TikTok users laughing at the Hollywood actor Tom Cruise stumbling in a TikTok video (Ume, 2021); or being shocked and wondering since when famous actresses like Scarlett Johansson, Maisie Williams or Gal Gadot have been actively contributing to the porn industry (Cole, 2017). In her 2020 Christmas video, Her Majesty Queen Elizabeth II performed a surprisingly agile dance routine, and that same year the North Korean dictator Kim Jong Un warned the world's population in a recorded speech that "democracy is a fragile thing, more fragile than you want to believe". In reality, none of the above-mentioned media content is real. These videos fall into a new band of "synthetic media" (Meckel & Steinacker, 2021), also called "deepfakes", a portmanteau of the AI technique of "deep learning" and the inauthenticity marker "fake". Creators of deepfakes use AI applications that merge, combine, replace and superimpose images and video clips to create fake videos that appear almost, and sometimes entirely, authentic (Maras & Alexandrou, 2019). It is no surprise that the faces of the mentioned politicians and celebrities were either manipulated or swapped onto unknown actors, thereby creating a new form of make-believe media that is now generally regarded as the deepfake. In the real sense of it, when the Akata night masquerade became adulterated, its members engaged in the character assassination of those they deemed enemies. Just as AI technology started with good intentions, Akata started with good intentions but got adulterated along the way, its operations coming to resemble today's deepfake. The foregoing are classic examples showing that the Akata night masquerade has a semblance of the deepfake. In a related development, five young respondents who studied at the University of Calabar, Cross River State, and the University of Uyo, Akwa Ibom State, in Nigeria recounted in 2022 how some students would display the Akata night masquerade at night on these campuses and use it to name and shame female students having illicit sexual relationships with their lecturers. The Akata group on campus would also name and shame male and female students who are members of secret cults on campus.


Therefore, just as the activities of deepfakes have grown in breadth and depth in recent times, making it herculean for innocent and naive social media consumers to distinguish deepfakes from real content, contemporary Akata night masquerade members on campus are using this traditional art to attack the personalities of people in Nigerian universities. With this in view, can one boldly say today that the Akata masquerade has not taken on the semblance of the deepfake in defaming people's character?
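Because the chapter notes how hard it has become for ordinary consumers to tell doctored clips from real ones, the sketch below illustrates one common research-style approach: sampling frames from a video and scoring each with a binary real-versus-fake image classifier. It is a simplified, hypothetical pipeline rather than a production detector; the tiny network, the sampling rate and the file names are assumptions made for illustration, and real systems depend on far larger models and curated training data.

```python
# Hypothetical frame-level deepfake screening sketch: sample frames with
# OpenCV, score each with a small binary CNN (real vs. fake), average scores.
# The classifier here is a toy stand-in; production detectors are far larger.
import cv2
import torch
import torch.nn as nn

class TinyFakeClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),   # 128 -> 64
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 64 -> 32
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)  # single logit: higher means "looks fake"

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def score_video(path, model, every_n_frames=30, size=128):
    """Return the mean 'fake' probability over sampled frames of a video file."""
    capture = cv2.VideoCapture(path)
    probs, index = [], 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % every_n_frames == 0:
            frame = cv2.cvtColor(cv2.resize(frame, (size, size)), cv2.COLOR_BGR2RGB)
            tensor = torch.from_numpy(frame).float().permute(2, 0, 1).unsqueeze(0) / 255.0
            with torch.no_grad():
                probs.append(torch.sigmoid(model(tensor)).item())
        index += 1
    capture.release()
    return sum(probs) / len(probs) if probs else None

# Usage (placeholder weights and file name, assumed for illustration only):
# model = TinyFakeClassifier(); model.load_state_dict(torch.load("detector_weights.pt"))
# print(score_video("suspect_clip.mp4", model.eval()))
```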

Conclusion and Recommendation

This chapter has revealed striking semblances and similarities between the African Akata night masquerade and the modern deepfake in their operations. The Akata night masquerade existed before the Western, AI-driven invention of the deepfake; AI only improves on it through technology. Ronald Escarpit (1968) and Colin Cherry (1978) have severally given empirical evidence that no newly introduced mode of communication, or combination of new modes, wholly replaces or supplants the traditional ones; new modes supplement the old ones or take over some of their functions, but never all of them, as in the case of the Akata night masquerade and the deepfake. The Akata night masquerade mirrors the traditional realities and similarities that the contemporary deepfake possesses. As Mutua et al. (2022) succinctly put it, "we now know that local cultures remain relevant in modern communication practices and in providing directions for culturally relevant research and theoretical orientations". African scholars should learn to research and tell their own stories because, presently, the African continent is a victim of Chimamanda Adichie's axiom of a single story. In closing, there is much that Africans created, conceived, invented and used in their societies that bears similarities with Western creations and inventions. Only research can help dig these out.

References

Adichie, C. N. (2009). The Danger of a Single Story. TED Talk. YouTube. https://www.youtube.com/watch?v=D9Ihs241zeg&t=979s Akpabio, E. (2003). African Communication Systems: An Introductory Text. B Prints Publications.


Akpan, U. (2019). Traditional African communication: Representations in modernity. Public lecture at the Cathy Hughes School of Communications, Department of Communication, Culture and Media Studies on October 22, 2019 4:10.p.m - 6:30.p.m., @ C.B. Powell RM 118. Akwang, E. (2010). Culture in a Dynamic Society: A Dialogic Assessment of Ibibio Masquerades in the Age of Technology. Ejomas: Ekpoma Journal of Theatre and Media Arts, 3, 1–2. Alagoa, E.  J. (1966). Oral Tradition Among the Ijaw of the Niger Delta. The Journal of African History, 7(3), 405–419. Ansah, P. (1988). In Search of a Role for the African Media in the Democratic Process. Africa Media Review, 2(2), 1–16. Ansu-Kyeremeh, K ed. (2005). Indigenous Communication in Africa: A Conceptual Framework, In Indigenous Communication in Africa: Concept, Application, and Prospects. Accra: Ghana Universities Press. Asad, T. (Ed.). (1973). Anthropology and the Colonial Encounter. Ithaca Press. Asante, M. (2018). The Classical African Concept of Maat and Human Communication. In K. Langmia (Ed.), Black/Africana Communication Theory (pp. 11–23). Palgrave Macmillan. Ascroft, J. (1969). Modernization and Communication: Controlling Environmental Change, Ph.D. dissertation, Michigan State University. Austin, G. (2012). History, Archives and Development Policy in Africa. Paper presented at the Using History to Inform Development Policy: The Role of Archives, Washington DC. Balandier, G. (Ed.). (1963). Sociologie Actitelle de I’Afrique Noire. Press. Univ. France. van den Bersselaar, D. (2004). Establishing the Facts: P.A. Talbot and the 1921 Census of Nigeria. History in Africa, 31, 69–102. Bianchi, R., & Boniface, P. (2002). Editorial: The Politics of World Heritage. International Journal of Heritage Studies, 8(2), 79–80. Boahen, A. (1987). African Perspectives on Colonialism. Johns Hopkins Univ. Press. Bourgault, L. (1990). Talking to People in the Oral Tradition: Ethnographic Research for Development Communication. A paper delivered at the Annual Meeting of the African Studies Association. Baltimore, Maryland, USA. Broadman, H.  G. (Ed.). (2007). Africa’s Silk Road: China and India’s New Economic Frontier. The World Bank. Cherry, C. (1978). World Communications; Threat or Promise. MIT. Press. Cole, S. (2017). AI-Assisted Fake Porn Is Here and We’re All Fucked. VICE.  Retrieved 04 25, 2021, from https://www.vice.com/en/article/ gydydm/gal-gadot-fake-ai-porn Comaroff, J., & Comaroff, J. (1992). Ethnography and the Historical Imagination. Univ. Chicago Press.


Comaroff, J., & Comaroff, J. (1997). Of Revelation and Revolution (Vol. 2). Univ. Chicago Press. Commission for Africa. (2005). Our Common Interest: Report of the Commission for Africa. Commission for Africa. Decker, S. (2010). Postcolonial Transitions in Africa: Decolonization in West Africa and Present Day South Africa. Journal of Management Studies, 47(5), 791–813. https://doi.org/10.1111/j.1467-6486.2010.00924.x Dieke, P. U. C. (Ed.). (2000). The Political Economy of Tourism Development in Africa. Cognizant Communication. Doob, W. (1961). Communication in Africa: A Search for Boundaries. Yale University Press. Easterly, W. (2006). The White Man's Burden: Why the West's Efforts to Aid the Rest Have Done So Much Ill and So Little Good. Penguin. Ebong, I. A. (1989). Drama and Theatre Among the Ibibio: Aesthetics and the Semantic Bi-polarity of the Mask Idiom. A paper presented at the Ninth International Conference on African Literature and the English Language, University of Calabar, May 2–5, 1–49. Edeani, D. (1995). Role of Africa Media Review in the Sustainable Development of African Communication Research. Africa Media Review, 9(1), 24–52. Ekian, M. (2020). [DEEP FAKE] President Kennedy Discusses Rick and Morty [Motion Picture]. Retrieved 04 24, 2021, from https://www.youtube.com/watch?v=GlrrvAYklRA Ekpo, E. (1968). The Quas – A Historical Perspective and Beliefs System. Bacon Publishers. Escarpit, R. (1968). The Book Revolution. George Harrap/UNESCO. Essien, E., & Oqua, K. (2020). Angbo-Akata: The Social Dimensions of a Night Masquerade. Theatre and Media Studies Journal of Contemporary Research, 17, 124–135. Ferguson, S., & Ferguson, S. D. (1980). Cultural Policy Priorities Scaling. Paper given at the Annual Conference of the Canadian Communication Association. Gamble, C. (2007). Origins and Revolutions: Human Identity in Earliest Prehistory. Cambridge University Press. Gillies, D. (1989). Technological Determinism in Canadian Telecommunications: Telidon Technology, Industry and Government. Canadian Journal of Communication, 15(2). Ginsburg, F. (1991). Indigenous Media: Faustian Contract or Global Village? Cultural Anthropology, 6(1), 93–112. Hachten, W. (1971). Muffled Drums: The News Media in Africa. The Iowa State University Press. Hao, K., & Heaven, W. D. (2020). The Year Deepfakes Went Mainstream. MIT Technology Review. Retrieved 04 25, 2021, from https://www.technologyreview.com/2020/12/24/1015380/best-ai-deepfakes-of-2020/


Horton, R. (1971). African Conversion. Africa, 41, 85–108. Hunter-Gault, C. (2018). New News out of Africa: Uncovering Africa's Renaissance. Oxford University Press. Hyden, G. (1980). Beyond Ujamaa in Tanzania: Underdevelopment and an Uncaptured Peasantry. University of California Press. Johanson, D., & Edgar, B. (1996). From Lucy to Language. Simon and Schuster. Kietzmann, J., Lee, L. W., McCarthy, I. P., & Kietzmann, T. C. (2020). Deepfakes: Trick or Treat? Business Horizons, 63(2), 135–146. https://doi.org/10.1016/j.bushor.2019.11.006 Kottack, C. P. (2004). Cultural Anthropology (10th ed.). McGraw Hill. Lele, U. (1981). Co-operatives and the Poor: A Comparative Perspective. World Development, 9, 55–72. Lorenz, C. (2011). History and Theory. In A. Schneider & D. Woolf (Eds.), The Oxford History of Historical Writing Vol. 5: Historical Writing Since 1945 (pp. 13–35). Oxford University Press. Maras, M. H., & Alexandrou, A. (2019). Determining Authenticity of Video Evidence in the Age of Artificial Intelligence and in the Wake of Deepfake Videos. International Journal of Evidence & Proof, 23(3), 255–262. https://doi.org/10.1177/1365712718807226 McNeill, W. H. (1986). Mythistory, or Truth, Myth, History, and Historians. The American Historical Review, 91(1), 1–10. Meckel, M., & Steinacker, L. (2021). Hybrid Reality: The Rise of Deepfakes and Diverging Truths. https://scholar.google.com/citations?user=22CmQHQAAAAJ&hl=en&oi=sra Mercer, C. (2003). Towards a Critical Political Geography of African Development. Geoforum, 34(4), 419–436. Milton, V., & Mano, W. (2021). Afrokology as a Transdisciplinary Approach to Media and Communication Studies. In W. Mano & V. C. Milton (Eds.), Routledge Handbook of African Media and Communication Studies (p. 270). Routledge. Mitchell, T. (1991). Colonizing Egypt (2nd ed.). Univ. Calif. Press. Moemeka, A. (1984). Socio-Cultural Environment of Communication in Traditional/Rural Nigeria: An Ethnographic Exploration. Communicatio Socialis Yearbook, III, 41–56. Mudimbe, V. (1988). The Invention of Africa. Indiana Univ. Press. Munro, D. (1979). Rotter Internal-External Scale in an African Context: Problems of Construct and Equivalence. South African Journal of Psychology, 9(1-2), 61–66. Mutua, M. E., Musa, B. A., & Okigbo, C. (2022). (Re)visiting African Communication Scholarship: Critical Perspectives on Research and Theory. Review of Communication, 22(1), 76–92.

Narayan, D., Patel, R., Schafft, K., Rademacher, A., & Koch-Schulte, A. (2000). Voices of the Poor: Can Anyone Hear Us? Oxford University Press for the World Bank.
Norris, P. (2014). Watchdog Journalism. In M. Bovens, R. Goodin, & T. Schillemans (Eds.), The Oxford Handbook of Public Accountability (pp. 525–541). Oxford University Press.
Nyamnjoh, F. (2005). Africa's Media, Democracy and the Politics of Belonging. Zed Books.
Obadina, T. (2004). Getting a Measure of African Poverty. Africa Economic Analysis. http://www.africaeconomicanalysis.org/articles/gen/povertymeasurehtm.html
Ogunba, O. (1978). Traditional African Festival Drama. In O. Ogunba (Ed.), Theatre in Africa. University of Ibadan Press.
Ogwezzy, A. (2008). A Functional Approach to African Communication Systems. Concept Publications Limited.
Onor, S. (1994). The Ejagham Nation in the Cross River Region of Nigeria. Kraft Books.
Opubor, A. (1981). Intercommunication: Creating the Global Black Community. Présence Africaine, 1, 117–118.
p'Bitek, O. (1970). African Religions in Western Scholarship. Kenya Literature Bureau.
Pantserev, K. A. (2020). The Malicious Use of AI-Based Deepfake Technology as the New Threat to Psychological Security and Political Stability. In Cyber Defence in the Age of AI, Smart Societies and Augmented Humanity (pp. 37–55). Springer.
Panyatham, P. (2021, March 10). Deepfake Technology in the Entertainment Industry: Potential Limitations and Protections. Arts Management & Technology Laboratory. Retrieved April 25, 2021, from https://amt-lab.org/blog/2020/3/deepfake-technology-in-the-entertainment-industry-potential-limitations-and-protections
Peretti, J., & Sosa, J. (2018). You Won't Believe What Obama Says in This Video! [Motion Picture]. Retrieved April 25, 2021, from https://www.youtube.com/watch?v=cQ54GDm1eL0
Rabinow, P. (1989). French Modern: Norms and Forms of the Social Environment. MIT Press.
Reid, R. (2008). A History of Modern Africa. Blackwell.
Ritzer, G. (2000). Classical Sociological Theory (3rd ed.). McGraw-Hill.
Sachs, J. (2005). The End of Poverty: How We Can Make It Happen in Our Lifetime. Penguin Books.
Saith, R. (2001). Social Exclusion: The Concept and Application to Developing Countries (QEH Working Paper Series No. 72). Queen Elizabeth House.
Semple, E. C. (1911). Influences of Geographic Environment on the Basis of Ratzel's System of Anthropo-Geography. Henry Holt and Company.

Soola, E. O. (2006). Communication and Educational Approach and Strategies for Forest Management: A Nigerian Perspective. Paper presented at a conference.
Stover, W. (1984). African Music Performed. In P. Martin & P. O'Meara (Eds.), Africa (2nd ed., pp. 233–248). Indiana University Press.
Talbot, P. A. (1912). In the Shadows of the Bush. NUP.
Thies, J., Zollhöfer, M., Stamminger, M., Theobalt, C., & Nießner, M. (2016). Face2Face: Real-Time Face Capture and Reenactment of RGB Videos. In 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (pp. 2387–2395). IEEE. https://doi.org/10.1109/CVPR.2016.262
Ugboajah, F. (1985). Oramedia in Africa. In Mass Communication, Culture and Society in West Africa (pp. 165–186). Hans Zell.
Ugboajah, F. (1986). Communication as Technology in African Rural Development. Africa Media Review, 1(1), 1–9.
Ume, C. (2021). The Chronicles of DeepTomCruise [Motion Picture]. Retrieved April 25, 2021, from https://youtu.be/nwOywe7xLhs?t=37
Wahl-Jorgensen, K., & Carlson, M. (2021). Conjecturing Fearful Futures: Journalistic Discourses on Deepfakes. Journalism Practice, 1, 1–18. https://doi.org/10.1080/17512786.2021.1908838
Wilson, D. (1981). From the Gong to Electronics: A Survey of Mass Communication in Old Calabar. MA Dissertation, Department of Communication and Language Arts, University of Ibadan.
Wilson, D. (1987). Traditional Systems of Communication in Modern African Development: An Analytical Viewpoint. African Media Review, 1(2), 87–104.
Wilson, D. (1988). A Survey of Traditional-Modern Communication Systems in Old Calabar (1846–1986). PhD Thesis, University of Ibadan.
Wilson, D. (2008). Towards Integrating Traditional and Modern Communication Systems. In R. Akinfeleye (Ed.), Contemporary Issues in Mass Media for Development and National Security. Malthouse Press Limited.
Wilson, D. (2015). Ethnocommunicology, Trado-Modern Communication and Mediamorphosis in Nigeria. 44th Inaugural Lecture of the University of Uyo. University of Uyo Press Ltd.
Wright, C. R. (1964). Functional Analysis and Mass Media. In L. A. Dexter & D. M. White (Eds.), People, Society and Mass Communications (pp. 91–109). The Free Press.
Yager, T. R., Bermúdez-Lugo, O., Mobbs, P. M., Newman, H. R., & Wilburn, D. R. (2007). The Mineral Industries of Africa. In USGS, 2005 Minerals Yearbook, Africa. US Department of the Interior, US Geological Survey.
Ziegler, D., & Asante, M. K. (1992). Thunder and Silence: The Mass Media in Africa. Africa World Press.

Index

NUMBERS AND SYMBOLS
#TheHotBedOfChampions, 166, 168, 171, 173, 175, 176
#HotBedOfTerror, 166, 170, 171, 176
#OccupyPlayGround, 167, 170, 171, 173, 175
#RhodesMustFall, 34
#UgandanDecides, 168
#VaccineEquity, 117

A
Adichie, Chimamanda Ngozi, 181
Africa-focused disinformation, 42
African Diasporic digital media, 6
Africa News, 17
African folklore, 105, 106, 111, 123
Africans and miscegenation, 62
Afro-cultural Mulatto communication theory, 183
Afrokology, 183–185
Afrophobic, 46
Akata night masquerade, 179–198

Algorithm, 52, 59, 60, 67
Artificial intelligence (AI), 147, 150–156
Asante, Molefi Kete, 179, 184–186
Atlantic Council's Digital Forensic Research Lab (DFRLab), 42
Audio deepfake, 152

B
Barstool Sports, 155, 156, 158
Baudrillard, J., 132
Beckham, David, 81
Black communications
  TikTok, 3, 5, 8
  WhatsApp, 3, 5, 8, 10
Black Communication Theory, 183
Boda boda, 122

C
Center for Disease Control, 116
Citizens TV, 6

Clinicians, 107, 108, 111–117, 122–126
  modern day, 125
CNN, 148, 158
Colorism, 51–68
COVID-19, 75–76, 79, 83
Cyberbullies, 17
Cybercrimes, 46

D
Deep fakes
  deep fake election tampering, 57
  Deeptrace, 56
  filters, 57–61
DeepNude, 98
Defense Advanced Research Projects Agency (DARPA), 156–159
Digital immigrants, 3
Doctored videos, 148, 154, 155, 158, 159

E
Ebola communication, 139, 140
e-journalism, 7–8

F
FaceApp, 58
Face swapping, 187, 188
FactCheckHub, 43
FakeApp, 53
Fake injuries, 154
Fake videos, 52, 54, 56, 60
Fandom, 73–85
Fanship
  digital, 79–81
  sports, 78, 84
Fifa.com, 149
FIFA World Cup, 149

Florida Dali Museum, 90
Folk devils, 134

G
GateKeeper, 85
Gen X, 8
Gen Z, 15, 16
GIF, 175
Githinji Gitahi (GG), 107–109, 114–119, 121, 123, 124, 126, 127
Global North, 21, 47
Government-controlled media, 36
Griot, 105–127
Grounded theory, 107, 108, 115, 122

H
Hackers and hoaxsters, 17
Health communication, 133, 138
Hyperrealities, 10, 105–127

I
Ideal images, 95
Image dissatisfaction, 95
Indigenous media, 180, 182, 185–189, 191–193
Infocalyptic, 132
Infodemic, 2
Information Communication Technologies (ICTs)
  bloggers, 37
  e-Government, 36, 37
  Information and Communications Technologies in Africa (CATIA), 37
  internet connectivity, 36
  mobile phones, 37

International Business Machines Corporation (IBM), 52
International Telecommunication Union (ITU), 4, 6

J
James, LeBron, 76, 80

K
Kaepernick, Colin, 84
Kenyan Revenue Authority (KRA), 170, 172, 173, 176
Kenyans on Twitter (KOT), 165–177
Kenyan women, 93
KTN, 6

L
Lock downs, 138

M
Maatic, 183–185
Magufuli, John (President), 40
Malaria Must Die Initiative, 90
Manipulation, 53, 55
McLuhan, Marshall, 131, 132
Memes, 84
Micromanage, 98
Misinformation, 15–47
Moral panic, 134
Morrison, Jim, 7

N
Nollywood, 27
NVIVO, 170
Nyamnjoh, Francis, 185, 186

O
Ogre, 105–127
Oral traditions, 112
Oramedia, 186, 191
Orange Money, 23, 25
Ordinarization, 116

P
Photo deepfake, 152
Photoshop, 52, 59
Postmodern technologies, 131, 132

R
Racial Democracy Effect Theory, 65, 66
Reddit, 148
Ruto, William, 92

S
Safaricom, 171, 173, 176
Screenshot image, 175
Sexism, 89–100
Sex video, 98
Skin-whitening, 61, 62
Smoke-screens, 107, 114, 125, 126

Social media algorithms, 60
Synthetic media, 89–91, 93, 95

T
Technological determinism, 132
Telechurch, 57
Telemedical, 57
Teleworking, 57
TikTok, 3, 5, 8
Transracialism, 59
TwitterMate, 167
Twitter suspension, 37

U
Ubuntu, 183–185

Ujamaa communication theory, 183
UNESCO, 47
User-generated content, 80

V
Vaccine Apartheid, 117
Video assistant referee (VAR), 150, 152
Video deepfake, 152, 153

W
Western technology, 26
WhatsApp, 3, 5, 8, 10
World Bank, 21, 22