Scholarly Crimes and Misdemeanors: Violations of Fairness and Trust in the Academic World (ISBN 1138504122, 9781138504127)

This book explores the problem of scientific dishonesty and misconduct – a problem that affects all disciplines, yet whose extent remains largely unknown and for which established standards for reporting, prevention, and punishment are absent.

English | 170 pages | 2018


Scholarly Crimes and Misdemeanors

This book explores the problem of scientific dishonesty and misconduct – a problem that affects all disciplines, yet whose extent remains largely unknown and for which established standards for reporting, prevention, and punishment are absent. Presenting examples of research misconduct, the authors examine the reasons for its occurrence and address the experience of victimization that is involved, together with the perpetrators' reactions to being accused. With consideration of the role of witnesses and bystanders, such as book and journal editors and reviewers, students and professional organizations, the book covers the many forms of academic misconduct, offering a theorization of the phenomenon in criminological terms as a particular form of crime, before examining the possibilities that exist for the prevention and control of scholarly crime, as well as implications for further research. An accessible treatment of a problem that remains largely hidden, Scholarly Crimes and Misdemeanors will appeal to readers across disciplines, and particularly those in the social sciences with interests in academic life, research ethics and criminology.

Mark S. Davis is a social scientist whose scholarship has appeared in such journals as the Journal of Research on Adolescence, Social Psychiatry and Psychiatric Epidemiology, Journal of Criminal Justice, and Science and Engineering Ethics. He is the author of The Concise Dictionary of Crime and Justice, 2nd Edition and The Role of State Agencies in Translational Criminology: Connecting Research to Policy. Mark holds a PhD in sociology from Ohio State University, where he maintains an affiliation with the Criminal Justice Research Center.

Bonnie Berry is the Director of the Social Problems Research Group, USA, and formerly university faculty. Her areas of research interest include appearance bias, animal rights, academic misconduct and ethical violations, and all measures of social inequality. She is the author of Social Rage: Emotion and Cultural Conflict, Beauty Bias: Discrimination and Social Power, The Power of Looks: Social Stratification of Physical Appearance, and numerous research articles on a range of social problems topics. Her most recent work, Physical Appearance and Crime: How Appearance Affects Crime and the Crime Control Process, is forthcoming in 2018. She is the recipient of the Inconvenient Woman of the Year Award (Division of Women and Crime, American Society of Criminology), the Herbert Bloch Award, and the Mentor of Mentors Award (the latter two from the American Society of Criminology).

Scholarly Crimes and Misdemeanors

Violations of Fairness and Trust in the Academic World

Mark S. Davis and Bonnie Berry

First published 2018 by Routledge
2 Park Square, Milton Park, Abingdon, Oxon OX14 4RN
and by Routledge
711 Third Avenue, New York, NY 10017

Routledge is an imprint of the Taylor & Francis Group, an informa business

© 2018 Mark S. Davis and Bonnie Berry

The right of Mark S. Davis and Bonnie Berry to be identified as authors of this work has been asserted by them in accordance with sections 77 and 78 of the Copyright, Designs and Patents Act 1988.

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.

Trademark notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library

Library of Congress Cataloging-in-Publication Data
Names: Davis, Mark S. (Mark Stephen), 1952– author. | Berry, Bonnie, author.
Title: Scholarly crimes and misdemeanors : violations of fairness and trust in the academic world / Mark S. Davis and Bonnie Berry.
Description: Abingdon, Oxon ; New York, NY : Routledge, 2018. | Includes bibliographical references and index.
Identifiers: LCCN 2017058001 | ISBN 9781138504127 (hbk) | ISBN 9781315145822 (ebk)
Subjects: LCSH: Learning and scholarship—Moral and ethical aspects. | Research—Moral and ethical aspects. | Scholars—Professional ethics. | Professional ethics. | Criminology.
Classification: LCC AZ101 .D38 2018 | DDC 001.2—dc23
LC record available at https://lccn.loc.gov/2017058001

ISBN: 978-1-138-50412-7 (hbk)
ISBN: 978-1-315-14582-2 (ebk)

Typeset in Times New Roman by Apex CoVantage, LLC

Contents

List of boxes and figure  vi
List of abbreviations  vii
Acknowledgments  viii
Preface: Help! My brainchild's been kidnapped!  x
1 Intellectual misconduct: Backwards, forward, and sideways  1
2 The world of scholarship: Rituals and rewards, norms and departures  18
3 Structural and organizational causes of scholarly misconduct  44
4 Cultural causes of scholarly misconduct  55
5 Individual and situational causes of scholarly misconduct  67
6 Scholarly misconduct as crime  73
7 Criminological theory and scholarly crime  94
8 Implications for theory and research  106
9 Preventing and controlling scholarly crime  119
Afterword: Against all odds, a code is born  143
Appendix: Others' stories of scholarly misconduct  147
Index  152

Boxes and figure

Boxes
2.1 In the beginning (as in the end), it's all about money  20
2.2 Legitimate, predatory, or something else?  24
2.3 Administrative sanctions for scholarly misconduct  39
4.1 Scholarly credentials for sale  56
5.1 Structural pressure or individual pathology?  68

Figure
3.1 Levels of explanation for scholarly misconduct  46

Abbreviations

CWBs  Counterproductive work behaviors
DRPs  Detrimental research practices
EAP  Employee Assistance Program
F&A  Facilities and administrative
FF&P  Fabrication, falsification, and plagiarism
IF  Impact Factor
I/O  Industrial/organizational
IP  Intellectual property
NIH  National Institutes of Health
NIMH  National Institute of Mental Health
OCBs  Organizational citizenship behaviors
ORI  Office of Research Integrity
OSI  Office of Scientific Integrity
P.I. Program  Washington University in St. Louis's program designed to promote professionalism and integrity in research
QRPs  Questionable research practices
RCR  Responsible conduct of research
RIO  Research Integrity Officer

Acknowledgments

First of all, I have to thank my co-author, Bonnie Berry, because without her having shared her story, this book would not have been written. Although our friendship goes back decades to graduate school, this was our first collaboration. She proved herself to be a knowledgeable, patient and supportive writing partner.

Many years ago, I had the opportunity to become acquainted with several staff members and consultants of the U.S. Office of Research Integrity: Chris Pascal, Alan Price, Larry Rhoades, Mary Scheetz, Nick Steneck and Peter Yeager. In my opinion, this group did far more to advance research on research integrity than they have been given credit for. Their enthusiasm, encouragement and support reenergized me to continue my work on research misconduct, and for that I belatedly thank them.

There are two people in particular whose unique perspectives have informed my recent thinking in noteworthy ways. Donald Kornfeld, professor emeritus of psychiatry at Columbia University, took the time to read and comment on the draft manuscript. He raised important points Bonnie and I hadn't considered and suggested sources to consult, and I am grateful that he was willing to share his wisdom. Likewise, Cristy McGoff, research integrity officer at Harvard University, offered important insights that only come from someone who has worked on the front lines of research misconduct. She, too, read the draft and made a number of helpful suggestions.

I must also thank Stephanie Milan of Red Pen Prose, whose editorial expertise uncovered numerous opportunities for improving the manuscript's clarity and readability; but since she's also my daughter, I must declare an emotional conflict of interest. These individuals bear no responsibility for any remaining errors or omissions.

I owe the staff at the Criminal Justice Research Center at Ohio State many thanks for allowing me to periodically hang out and absorb their energy. Current CJRC director Dana Haynie and former director Ruth Peterson, in particular, have offered financial, intellectual and emotional support over the years. I value their friendship. I also would like to thank Neil Jordan and Alice Salt of Routledge: Neil for seeing merit in our proposed book, and Alice for shepherding us through the submission and production process.

Last of all, but perhaps most importantly, I have to thank my wife, Jan, who for decades now has prodded me to get my thoughts onto paper or into a computer, even when there were no career pressures or incentives to do so. She has indulged occasional absences, questionable expenditures, disagreeable moods, and other inconveniences caused by my lifelong obsession with ideas and their expression. I've been lucky to have her in my corner and in my life.

Mark S. Davis

* * *

When things go bad, I adhere to my usual philosophy: "Never let a good nightmare go to waste." Through analysis, we may gain understanding and clarity, learn from the damage, and, under the best of circumstances, share our interpretations with an eye toward preventing bad experiences from happening to others. I could not have successfully managed the experience described herein without the help of many, many people to whom I am deeply indebted.

Among these are my colleagues: Bob Agnew, Joanne Belknap, Donald Black, Brenda Blackwell, Becky Block, Denise Paquette Boots, Brenda Chaney, Ellen Cohn, Candace Kruttschnitt, Stephen Muzzatti, Ruth Peterson, Claire Renzetti, Elicka Peterson Sparks, and Brent Teasdale. Overlapping with my staunch academic buddies are those who served with me on the American Society of Criminology's ad hoc ethics committee and who were absolutely indispensable in getting a code of ethics passed: the late Bob Bursik, Mark Cooney, Mark Davis, Joanne Kaufman, Nathan Pino, Paul Stretesky, and Marjorie Zatz.

I have also had the privilege of becoming acquainted with Dave Hudson, Research Integrity Officer and Dean of Research at the University of Virginia. Dave shared his knowledge and experience as a RIO, including hair-raising stories about RIOs' lives being threatened by suspect researchers as well as incredible tales about the lengths to which some researchers will go to hide their misconduct.

Three social science organizations were immensely helpful: the American Sociological Association, the Society for the Study of Social Problems, and the American Society of Criminology. I am grateful to the many anonymous and identified informants who passed along their accounts of scholarly misconduct. Of course, this project would not have been successful without the guidance of my good friend and co-author, Mark S. Davis. I am forever grateful for his expertise and rock-solid support. Allow me to also thank Neil Jordan and Alice Salt, commissioning editor and editorial assistant for Routledge/Taylor and Francis. They made it happen and they made it easy.

Finally, words fail me in adequately expressing my gratitude to my best friend, Pete Lara, the captain of my life. He had my back, still does, and I was not expecting that.

Bonnie Berry

Preface
Help! My brainchild's been kidnapped!1
Bonnie Berry

In the summer of 2010, I saw a review in the New York Times Book Review of a book with the same title as one of my books published in 2007. I was pleased to think that the New York Times Book Review had reviewed my book, even though the review appeared three years after publication. Then I noticed that the title of the 2010 book differed from that of my 2007 book only in the added article "The." And then I noticed that the author's name was not mine. I read the review and discovered that the 2010 book appeared to cover the same ground as my 2007 book.

I told myself to forget it. After all, two books with remarkably similar titles, covering the same material, relying on many of the same citations, offering the same policy recommendations, and arriving at the same conclusions could be a coincidence. I told myself that it doesn't matter: If the topic is a good one and is researched well and presented well, it doesn't matter if the later-published work borrowed from the previously published work. If the message is an important one and reaches a wider audience, it doesn't matter who relays the message. I reminded myself that my own book had done well, had been adopted for classroom use, and had been translated into at least one other language, and thus I should not be so territorial and should let someone else have a crack at the same topic.

This strategy worked until Sociologica, the European review journal, asked me to review the 2010 book. The Sociologica editor emailed me and requested the review because, as the editor said, the 2010 book "has the same title of one of your books!"2 With reservations, I agreed to review the book. Upon reading it, I counted at least 64 overlaps between topics covered in the 2010 book and topics previously discussed in my 2007 book; most of the same legal cases were cited; and approximately 50 citations from my book were repeated in the 2010 book. Of these citations, many are sociological – even though the other author is not a sociologist – and some of them are vintage and exotic, and thus not readily available. In short, as to the literature sources, many of those copied in the 2010 book are found in my personal library, the library of a sociologist of over 30 years' standing who has collected books on arcane topics from used and new bookstores and from sociology journals.

It took two days to read the 2010 book, and at the end of those two days I felt as though I would vomit. After three days of drafting the review, I did vomit.

There are no instances in which the wording is identical across the two publications. The other author does cite my work in her chapter notes, which indicates she was aware of my book. But my work does not appear in the index or in the body of the text. It is, of course, possible that two scholars, working independently on the same topic, find and use the same sources and arrive at the same conclusions. After all, some of the most exciting innovations were invented around the same time in different parts of the world.3 Yet it remains puzzling how a later-publishing author who knows of the previously published work can publish a substantially similar work as though it were original.

Originality is one of the hallmarks of scientific research.4 Replication has its place, to be sure, but replication is useful only under very constrained circumstances, such as retesting hypotheses with different samples or differing methodologies. Aside from these exceptions, work that is repeated not only lacks parsimony, it is redundant and therefore unworthy of publication. We can applaud all work that is well-founded, well-written, and distributed widely, with the purpose of our work, scientifically and socially, being to advance understanding and to educate. One of our main tasks is to advance knowledge and to arrive at new discoveries. It is not our task to re-examine phenomena that have already been thoroughly examined and reported. As scientists, we are trained to adhere to the principled search for scientific truths. It is a maxim of all sciences that redundancies in research are unnecessary; the redundancies themselves negate the utility of the redundant work.

Similar stories

For me, dismay eventually turned to curiosity. I needed to know how common this experience is, what other people do when it happens to them, and other basic facts that, until now, were beyond understanding. That curiosity is the catalyst for this book. Since this practice of copied work seems to occur in all scholarly disciplines, it would be helpful to have the phenomenon better understood and information about its occurrence disseminated to academic audiences.

First, I asked the memberships of three social science organizations – the American Sociological Association, the Society for the Study of Social Problems, and the American Society of Criminology – for their stories of same work/different author experiences. An essay was published in these three outlets, detailing what had happened and promising anonymity for similar stories.5 The second thing I did was consult the first author of this book, fellow sociologist Mark Davis, known for his decades-long analysis of questionable scholarly behavior. Third, a rousing session ensued at the annual meetings of the American Society of Criminology, during which there was great interest in the topic on the part of the audience, which included scholars from abroad.6

Thus began a journey that led down many paths, using varying sources: librarians, professional organizations, copyright and intellectual property attorneys, ethics boards, government entities that investigate research misconduct, databases of misconduct cases, scientific literature on the topic, news articles, interviews with victims, quantitative and qualitative analytic techniques, and theoretical interpretations of this behavior from the best minds in scholarly research.

What I discovered from my solicitation of the memberships of the American Sociological Association, the Society for the Study of Social Problems, and the American Society of Criminology were stories from prominent people such as respected professors at prestigious universities as well as from students and untenured assistant professors, those with power and those with little power (see Appendix A for the stories). I learned of serial offending and serial victimization. Among the cases of the multiply victimized, I noted repeatedly that the victims in these stories are in fear of reprisal from the offenders, while the offenders are often unscathed or even rewarded. Perhaps the fact that offenders are often unchallenged partly explains why they continue in their behavior.

Theft of scholarship is only one form of academic misconduct, as the reader already knows or will discover in the course of reading this book. Other significant forms of misconduct include data fabrication and data falsification. One can say that fabrication and falsification are far more harmful to scientific discovery and to society at large than plagiarism. But back to the subject of theft: the methods by which scientific theft is accomplished vary, from offenders copying and presenting as their own the ideas of those whose work they review for journal publication and funding proposals, to the outright theft of students' and colleagues' papers. Commonly, the offenders defy the victims to confront them. And more often than not, the victims back down.

What the law says

Contrary to the usual thinking, Mark Davis and I propose that these scholarly offenses violate the same fundamental norms as do many crimes. We challenge the ambiguous definitions that scholars have so far used to loosely describe scholarly misconduct, and we discuss the reasons for scholarly resistance to tightening the definitions of such offenses. At this point, however, we have a restricted legal definition of scholarly misconduct as well as restricted recourse to restorative justice. The same is true of the academic response to academic violations; the definitions of academic misconduct offered by professional organizations such as the American Sociological Association are vague, if they exist at all. The American Society of Criminology, for instance, is one of the few professional organizations that, as of 2016, had no formal code of ethics and thus no definitions of academic misconduct. Whether codes of ethics exist or not, there is a startling dearth of channels by which victims can pursue remedial justice, as well as little guidance on how to prevent such offenses from occurring in the first place.

Our aim in this book is to lay out a plan for clarifying definitions and to offer workable solutions for prevention and control.

At present, the law on copying others' work without appropriate credit is vague. Not content with the confusion surrounding what constitutes academic misconduct and the seeming helplessness of victims to correct the wrongs done to them, I consulted three intellectual property (IP) and copyright attorneys to see what avenues, if any, are open to those who have experienced this particular offense. One is in private practice, another is a member of a large and prestigious law firm, and one is a university law professor, all in Seattle. All three, having heard and read my story, described a copyright case against the author of the 2010 book as an uphill battle, not because of a lack of evidence but because of the financial costs involved, the relative power of the other author and of her publisher compared to me, the victim, and the absence of verbatim copying. Helpfully, one of them remarked that, within academe, bringing infringement to the forefront of public awareness or to a court of law would serve a regulatory function, resulting, one hopes, in shame or damage to the offender's reputation.

What occurred in my case, and what occurs much of the time in such instances, is a violation of social norms, not a violation of law. Now, this is an enlightening perspective in itself, one that will recur throughout this book. In the case of academics, much of our social power relies upon the public's view of academics as selfless seekers of the truth. As such, we are expected to be above petty offensive behaviors such as insufficiently crediting others' work. Of course that is not an accurate portrayal of all of us, as has been amply documented in news stories as well as in the primer on the topic, William Broad and Nicholas Wade's Betrayers of the Truth.7 It is interesting that the attorneys I consulted also recognized informal control, which will be discussed in detail later in this text, as a method of prevention and punishment.

Getting down to legalistic basics, one of the primary principles of copyright law is that ideas are not protectable; only the expressions of those ideas are protected, so long as they are fixed in a tangible medium such as a book or journal article. This principle stems from 17 U.S.C. Sec. 102(b), which states that in no case does copyright protection for an original work of authorship extend to any idea, procedure, process, system, method of operation, concept, principle, or discovery, regardless of the form in which it is described, explained, illustrated, or embodied in such work.8 Copying portions of a book verbatim is a clear case of copyright infringement. Appropriating topics, ideas, organizational structure, and references makes the case a bit more complicated. Whether the latter behavior would be an infringement depends on a legal doctrine called the substantial similarity test. Essentially, if a lay reader, given the two books, would recognize the original author's material within the second, that is probably a case of substantial similarity, and the second author should have gotten the permission of the original author to use the material.

An author can paraphrase another author's words and ideas, a practice called managed copying, and those who engage in it are probably safe from legal action by the original author. Rephrased work can only be tested as a substantial similarity case, which is much harder to prove in court than blatant plagiarism. However, there is much disagreement and confusion over the precise meaning of plagiarism. As we will see in this book, some scholars say that plagiarism is not limited to verbatim copying; rather, plagiarism can include the theft of ideas. There are definite substantial similarities between my 2007 book and the 2010 book, as determined by the attorneys who read both books.

However, these tidy legal issues aside, power and privilege are often key determinants of case outcome. All three of the attorneys reminded me that the relative power (status, economic power, and other resources) of the offender compared to that of a far less powerful victim greatly influences the chances of successfully pursuing a copyright case, no matter the rightness or wrongness of the claim. Had the "demand" letter (the cease-and-desist letter), which one of the attorneys sent, been received by a lesser publisher than one of the oldest and most respected publishers in the business, or a lesser author than an esteemed professor at a prestigious university, the outcome might have been different. As it was, the publisher of the 2010 book responded to the cease-and-desist letter, which requested that the book be removed from sale, with aggressive disdain. The publisher informed my attorney that I was incorrect in my concerns about this duplicate publication, refused to halt sales and marketing, and warned me not to publish any online remarks about the experience. I dropped the case, but my curiosity remained.

Conclusions

As unethical as it may seem, it is legal to paraphrase a work, even a book-length work. One of the primary foci of this text is to understand the variance among ethicality, legality, and morality. One would think that ethicality and morality are subjective whereas legality is not. However, that does not seem to be the case; the legality of academic misconduct is not consensually understood or well-defined. Moreover, as we will discuss, ethicality, morality, and legality fluctuate over time, as they probably should. All are subject to re-evaluation and re-definition and, as such, they evolve in the direction of greater clarity.

As noted earlier, Mark Davis was one of the colleagues I consulted about my victimization. Upon reading both books, he agreed that the similarity seemed too close to be coincidental. Mark had been studying research misconduct since graduate school and for some time had wanted to write a book that provided a criminological analysis of these phenomena. With the closure of realistic legal options in my case, a collaborative work seemed like a logical next step for us; hence this book.

The scope of this work ranges far beyond the topic of this preface. Scholarly misconduct occurs in all scientific disciplines, which is hardly a surprise, but also in a number of fields including the humanities and arts and in the composition of instructional manuals such as recipe books.9

Beyond a comparison to the arts, we can also compare powerful academic individuals and institutions to powerful corporate actors and corporations along an important dimension: invulnerability.10 Some corporate offenders, as with academic ones, are or seem to be untouchable. Their victims are fearful of confronting them, with good reason. The main concentration in this text, though, will be on the sciences, with cases in computer science, the physical sciences, medicine, and the natural sciences offering useful comparisons to the social sciences. With analyses such as we hope to offer, scholarly misconduct, suspected or substantiated, can be better understood and, equally significantly, prevented and controlled.

What follows is an examination of types of scholarly offenses, their forms and functions; a description of the relationship, though usually distant and always conflicted, between the victims and the offenders; an investigation of the avenues by which to prosecute scholarly crime; an overview of selected theoretical explanations of scholarly violations; and a presentation of the concrete and practical ways to prevent and control such offenses. In a way, this is a story of great expansiveness as to the harm inflicted on whole societies, as well as a story of deep personal pain inflicted by one individual upon another. The common thread is the betrayal of trust and behavior that is unfair.

As we wrote this book, one of the more disturbing components we dealt with was resistance to accepting that scholarly misconduct occurs on a largely unknown but likely grand scale. More perniciously, we encountered resistance to doing anything about it. Social change doesn't come about without conflict. The same is true for the small but important component of social views of scientific offenses, in other words, the recognition and understanding of scholarly misconduct committed by respected scientists. To pinpoint the topic of resistance even more, a proposal to redefine scholarly misconduct as crime has generated staunch refusal to consider such a change in perspective. The reason for this refusal is obvious, since we as scientists want to retain our hold on esoteric expertise as well as the respect that our work and social standing engender. A change in perspective on scholarly misconduct doesn't count as broad social change, but it is a notable change in the scholarly world, a shift in perspective, practice, and policy. And the impact of this shift in understanding and controlling scholarly misconduct can be massive, indeed global, since science is an intensely integral part of our world.

The offenses we are discussing pose more social and personal harm than most would think. They create victims who, unlike those of more traditional crimes, get little sympathy and have few options for relief or support. Most of us give little thought to such violations and brush them off as insignificant to our daily lives. Believe it or not, and we will demonstrate this, offenses of this type do affect us, sometimes badly and permanently. One example is biomedical research gone awry. There is no cause for despair, however. We need to understand this phenomenon, which, to date, we do not. We can comprehend it, and we are perfectly capable of reducing the occurrence of these offenses, keeping in mind the goal of improving the scientific world as well as society, which is deeply dependent upon science. Our analyses have implications far beyond the world of scholarship and science.
They force us to ask questions that concern not only science and scientists themselves, but also legislators, policymakers, social control entities, and ordinary citizens. How should scholarly offenses be defined? If this behavior is not illegal, what norms is it violating, and why does it bother victims so much? The answers lie not in obscure norms of science, but in ordinary standards of fairness that govern all social behavior. Once we agree on what is and what is not misconduct, how best to control it, and by whom? And what do these behaviors say about human nature and social exchange? We hope this will be a mind-expanding journey, enjoyable at times, perhaps disturbing at others. We warmly invite the reader to come along.

Notes

1 The title of this chapter was borrowed – with permission – from an unpublished paper presented by Mark Davis at the 1982 annual meetings of the American Society of Criminology in Toronto, Canada. Given that the paper discussed firsthand accounts of alleged thefts of ideas and written work, the title seemed apropos.
2 Ghigi, R. (2010). Personal communication, July 20.
3 Merton, Robert K. (1961). "Singletons and Multiples in Scientific Discovery: A Chapter in the Sociology of Science." Proceedings of the American Philosophical Society, 105: 470–486.
4 Merton, Robert K. (1942/1973). "The normative structure of science." In Norman W. Storer (ed.) The Sociology of Science: Theoretical and Empirical Investigations, pp. 267–278. Chicago, IL: University of Chicago Press.
5 Berry, Bonnie. (2011). "Same work, different authors: An invitation for your stories." A variation of this essay was published in three separate newsletters: The Criminologist, 36(1): 20 (American Society of Criminology); Footnotes, 39(9) (American Sociological Association, available online at: www.asanet.org); and Forum, 42(1): 7 (Society for the Study of Social Problems).
6 Berry, Bonnie (2012). "Very wrong but not illegal: Law, ethics, and the theft of scholarly work." Paper presented at the annual meetings of the American Society of Criminology, Chicago, IL.
7 Broad, William & Wade, Nicholas (1982). Betrayers of the Truth: Fraud and Deceit in the Halls of Science. New York, NY: Simon and Schuster.
8 U.S. Copyright Office. (1999). Available online at: www.copyright.gov
9 Rips, M. (2012). "Fair use, art, Swiss cheese and me." New York Times, June 17, p. 5.
10 Uhlmann, D. M. (2013). "Prosecution deferred, justice denied." New York Times, December 14, p. 21.

1 Intellectual misconduct: Backwards, forward, and sideways

This chapter will offer a brief overview of intellectual misconduct, its history as well as its contemporary forms, followed by a description of misconduct as it takes place across multiple professions, scientific and otherwise. The reason for viewing misconduct in this way is to illustrate that ethics violations, in whatever occupation they take place, have an important element in common: the violation of trust.

In August of 2015, we read in the New York Times that many social science studies, and the publications based on them, cannot be confirmed as valid. High-profile social psychologists and political scientists were found to have fabricated data, resulting in retractions from the top journals where the studies were published. It was discovered through "a painstaking yearlong effort to reproduce 100 studies in three leading psychology journals that more than half of the findings did not hold up when retested."1 There are a number of unfortunate aspects to this phenomenon, one of them being that scholarly work provides the core knowledge of not only what we understand about science but, in many cases, how we use that knowledge. That is, we as scholars rely on these vetted studies as the foundation and framework upon which to build more scientific findings. We assume that all studies published in reputable journals are valid and reliable; in other words, that they measure what they say they measure and can be retested to show the same findings. We also rely on these findings to guide decisions about important problems, to train therapists, to educate students, to aid the needy, and to cure illnesses. So the discussion we raise in this chapter is not limited to the world of science and scholarship but reaches far beyond, to the public and to entire societies. If we as a global society are relying on false data that appear to "prove" something, we will make continuous mistakes on a wide array of concerns, be they medical, political, psychological, environmental, familial, and so on. Indeed, while the problem with fabricated data was larger than originally thought, it is thought to be even greater in certain fields such as cell biology, economics, neuroscience, clinical medicine, and animal research.2

Is this problem with reproducibility new? Yes, and no. Contemporary science is characterized by a hypercompetitive culture that "favors novel, sexy results and provides little incentive for researchers to replicate the findings of others, or for journals to publish studies that fail to find a splashy result."3

Yet as history shows, scholars of yore also sought recognition, even to the point of committing fraud and sabotaging other scholars' work. Long ago, there were no scientific journals and the work of scientists was done largely on a solitary basis. Investigators performed their work at least partly, if not entirely, for the sheer pleasure of discovery. Now, we have teams of investigators competing for scarce research dollars and hoping for, at minimum, survival in the academic world, if not recognition and greatness.

This is not to say that modern scholars no longer experience the thirst for knowledge and the joy of discovery. Many, hopefully most, are excited by new questions and the answers to those questions. We are even thrilled at trying new methodologies to get to the answers. However, two changes, alluded to above, have taken place that negate, or at least get in the way of, science for the sake of science. One change is the oft-discussed influence of the Internet, which allows instant "knowledge" that isn't knowledge at all but rather a conduit for appearing to know a great deal about a topic when one has only superficial knowledge and fingers that can move over a keyboard. The second is the replacement of the thirst for knowledge with the intense need for survival in the scholarly world. It is as though, in the terms of sociologist Robert K. Merton, we have placed a far greater emphasis on the ends, such as publications, funding and promotion, than on the means to achieve those ends. The scientific pursuit itself has been forgotten in many instances. We could say that the ends have eclipsed all other concerns.4 We might also consider this empty activity to be anomic, an alienating function of what was supposed to be scholarly activity. The ends have overshadowed the means, and the manifest function of discovery takes second place to the dominant latent function of evidence of success in the scholarly world. Here we refer to nineteenth-century sociologist Emile Durkheim's anomie theory to explain deviance. Where social norms are weak, conflicting or absent, Durkheim argued, a state of normlessness can exist. This condition of normative confusion occurs where rules and values have little impact, as seems to be the case in scholarly misconduct.5 We will revisit this theory as we home in on the specific norms violated by scholarly misconduct.

For now, we will present cases from centuries ago, followed by a description of more contemporary cases. We will compare the older and current cases of scholarly misconduct in terms of what is different and what is similar. Specifically, we will use for comparison: the methods of research, such as the advent of the Internet; the reasons, for instance the pressure to publish, to obtain funding, and to be "the first" to discover a new phenomenon; the outcomes of discovered misconduct; the topics of study, such as natural science, physical science, and social science; and the effect of power differentials between scholars. Some things never change in the field of scholarly misconduct. And yet some aspects remain fluid. Scholars are human, and some of us are selfish, entitled egotists. These "scholars" will do what they can to advance themselves regardless of the consequences for their victims. Most of us are not of this ilk.


A whirlwind history of scholarly misconduct

Who can we rely upon, if not scholars, for the truth? William Broad and Nicholas Wade's 1982 book Betrayers of the Truth6 lays out an impressive and startling history of scientific misconduct through the ages, completing their analysis with more up-to-date and horrifying cases from the twentieth century. Given that their book was published in 1982, we try to fill in the blanks with this present text. But first, let's consider the history.

Broad and Wade offer a handy list and brief chapter on "the greats" in science who have been found to have engaged in fraudulent behavior. To our shock, we learn that the likes of Ptolemy, Galileo, Isaac Newton, John Dalton, Gregor Mendel, Charles Darwin, and the Nobel prize winner Robert Millikan all committed forms of scientific misconduct. Darwin, for example, was accused of failing to acknowledge the previous work of other scientists upon which his theory of evolution was based. A supporter of Darwin's theory remarked: "You have no idea of the intrigues that go on in this blessed world of science. Science is, I fear, no purer than any other region of human activity, though it should be."7 Throughout this book we will show how accurate this sentence is.

The misuses and abuses of science received international attention in the twentieth century with studies such as the Tuskegee syphilis experiment, in which hundreds of African American men with syphilis were studied, and effective treatment withheld, without their informed consent, a tragic episode in American history. These and other human abuses in the name of scientific research gave rise to a concerted international effort to curb abuses of human subjects in the course of medical and scientific research. As a result, university researchers now undergo training on the responsible conduct of research (RCR), a topic we will revisit throughout this book.8

While the above is a very brief sketch of scholarly misconduct from earlier times, and many more cases could be cited, it suffices to illustrate that certain dimensions of such misconduct have been with us since recorded time. Human nature and human behavior have not changed. There have always been people who are dishonest, lazy, unscrupulous, and narcissistic. Others of us, hopefully a majority, are honest and above reproach. This observation makes us ask what has changed over time in the picture of scholarly misconduct. One answer is technology. Technological advancements have eased the ability and increased the opportunities to engage in intellectual misconduct, as we will see below.

Contemporary intellectual misconduct

In the present day we have new developments that make intellectual misconduct easier. One, of course, is the Internet. Another facilitator is Google and other search engines.

Most of us cannot imagine life without these conveniences, but as we will see, they have given rise to collateral consequences, some of which are important for understanding intellectual misconduct.

First, let us consider the Internet. Plagiarism has been much in the news in the context of high school and college students lifting research, in part or in whole, from various online and other sources without crediting the source. One of the dimensions of these news stories is the astonishing fact that students do not view their behavior as wrongful, which may mean that they require better education earlier in their careers on matters of ethical misconduct. What defies explanation even more is that students are aware that teachers and professors use plagiarism-detection software such as Turnitin. But certainly nonstudents, academic researchers, and professors know better; they realize that extensive use of another's work while representing it as their own is well understood to be unscholarly and dishonest. When the authors were undergraduate students in the 1970s, there were numerous term paper "mills" which, for a fee, would produce college term papers. This was years before the advent of the Internet.

Writer Suzanne Choney reports on the increase in plagiarism in college and points to the role of technology in facilitating it.9 She cites a survey conducted by the Pew Research Center, in conjunction with The Chronicle of Higher Education, of 1055 college presidents from private and public two- to four-year schools. She found:

More than half of those top officials said they've seen an increase in plagiarism in the past 10 years. Nearly all of them say computers and the Internet have played a major role in the rise of stealing others' work and claiming it as their own.

Textbooks have gone digital, there are cell phones and computers in the classroom, and courses are online, all of which may influence intellectual property theft. We would argue that free and accessible sources of other people's work, via Wikipedia and Google, enable the theft of scholarly work. We wonder if students even know that it is wrong to use other people's work without crediting the original authors.

In David Segal's piece in the New York Times, he comes to the aid of a writer who claimed that her entire e-book had been copied and sold on Google Play under a different cover and a different author's name.10 The same images, same layout, same interviews, and so on were all copied and re-presented as a book under another author's name. As a result, the original author's royalties plummeted. The author and her publisher approached Google about this offense, but no action was taken. So "the haggler" (as Segal is known) contacted Google, confronted them, and was told that Google was aware of the problem with e-books being copied and misrepresented as original work to be sold on Google. Google, though aware of the problem, has "responded slowly to complaints from authors and publishers and sometimes did not respond at all. As bad, when the company acted [in response to an author saying that this copying phenomenon was "rampant" on Google], it would often remove pirated e-books but allow e-book pirates to remain on the site."11

Google Play was selling pirated publisher-published books as well as self-published e-books. Indeed, "pirated books were being uploaded by people using Google Play through its self-publishing channel." People were opening accounts, ostensibly to publish their own work, and then selling digital copies of popular, and not so popular, e-books they had not written. Google "stopped enrolling any new self-publishing authors"12 after many complaints from authors and legitimate publishers.

And then there are questionable open-access journals that promise to publish academic work speedily and for a fee.13 The titles of these journals look legitimate enough, such as International Journal of Advances in Case Reports, and they cover an enormous range of disciplines. There are several serious problems with this means of publication. An obvious one is that it is impossible to have a team of scholars adequately review a scientific paper in a day's time; finding qualified reviewers alone takes longer than that. Another problem is that, if the author has to pay for publication, she or he is virtually guaranteed that the paper will be published. Third, there does not appear to be any check on whether the author has submitted the paper to more than one outlet. At legitimate journals, authors may have a paper under review only at the journal to which it is submitted. So, with these questionable journals, a paper conceivably may appear multiple times in multiple outlets.

The origin of this unprecedented increase in false findings is laid at the doorstep of the hypercompetitive culture across the sciences. An alternative or additional explanation for the increase is the advanced technology that allows extensive and intensive review of journal articles and can quickly find unreported errors. In other words, we have increased transparency. No discipline is immune from this behavior. Psychology was exposed for its elevated chicanery. But the problem may be far worse in fields such as cell biology, economics, neuroscience, clinical medicine, and animal research.14 Retractions are increasingly occurring, which is a good sign, since it means that poorly conducted research and false findings are brought to light. On the down side, though, an increase in retractions means that more bad research is being published. As Benedict Carey has reported:

The crimes and misdemeanors of science used to be handled mostly in-house, with a private word at the faculty club, barbed questions at a conference, maybe a quiet dismissal. On the rare occasion when a journal publicly retracted a study, it typically did so in a cryptic footnote. Few were the wiser; many retracted studies have been cited as legitimate evidence by others years after the fact. But that gentlemen's world has all but evaporated.15

In the academic world, there is a range of errors that academics may make and poor behavior in which they may engage. We assume, rightly or wrongly, that academics, like other workers, are honest and take their work seriously. We assume that academics would not go out of their way to hurt others. Yet some do, via their research, their teaching, and, in a spillover category, via, say, sexual harassment of students and fellow workers.

In 2015, a renowned astronomy professor was forced to resign his post at the University of California at Berkeley after being found guilty of sexual harassment of students.16 That story is from October 2015. By February 2016, we learn of a molecular biology professor at the University of Chicago being investigated for sexual misconduct against his female students.17 This prominent professor, Jason Lieb, had also been relieved of duty for the same reasons at Princeton and the University of North Carolina. Cases like these, those publicly acknowledged, may be the tip of the iceberg. Without better reporting, it is impossible to know how much sexual abuse occurs in academic settings.18

Such behavior is more common than we might wish, but there is one important distinction we want to make about academic misconduct: Fellow scholars react badly to sex offenses committed by their academic co-workers, but not so much to co-workers who engage in offenses related to scholarship. We'll say more about this in Chapter 2. And, sometimes, it is difficult to differentiate between sex offenses in academe and scholarly misconduct. Erich Goode, a sociologist well-known for his studies of deviant behavior, is said to unapologetically engage in sexual relations with his research subjects;19 for instance, he has admitted that he has had sex with members of fat-acceptance organizations and weight-watchers' groups he has studied. The point is that scholars and scientists are not beyond reproach, but they should be further beyond reproach for strictly scientific violations.

In early 2017, the New York Times published an article that highlighted suspicions about a prominent cancer researcher, Carlo Croce, of The Ohio State University.20 At the time the piece was written, Croce, a physician-scientist, had brought in over $86 million in research grants. But he was alleged to have engaged in misconduct in the preparation of grant proposals and published articles. The allegations were not the first time Croce had been accused of wrongful scientific behavior. In the New York Times article, it was noted that the university benefitted financially from Croce's successful grantsmanship, a fact that could compromise the objectivity of internal investigations of these matters. In March 2017, after the publication of the Times piece, Professor Croce became the recipient of the Margaret Foti Award from the American Association for Cancer Research. This was the eleventh such award made by the AACR, and it was given for Croce's contributions to translating basic cancer research into clinically beneficial findings. The New York Times article prompted Croce to file suit for defamation. Croce's case, as well as those of other prominent researchers, suggests that allegations of scholarly misconduct do not always involve insecure, untenured faculty or those poorly acquainted with the norms of science.

This text will offer numerous examples of scholarly misconduct. We concentrate on research misconduct, but there are other forms of violations committed by scientists which require attention. Chapter 2 will describe an array of academic misdeeds, their definitions, and the means to more clearly define these acts. In summary: People are people.

The culture of competitiveness may always have been the same, except that now (a) the culture is larger and more complex, (b) the resources for conducting research, especially funding, are different, and (c) the technology is permissive.

A sideways glance: other forms of professional betrayal

Now let's move sideways in misconduct's directionality. Here we offer just a glimpse of the wide range of occupational misconduct, to give the reader an idea of the breadth of the problem. Professional misconduct occurs in just about every line of work imaginable: religious leaders, politicians, artists, locksmiths, glass repairers, lawyers, you name it. Some people engage in these and other forms of occupational misconduct while others do not. The exact methods vary, and the types of harm experienced vary. But all forms of occupational misconduct involve betrayals of trust.

In the twentieth and twenty-first centuries, occupational misconduct is catalogued in, among other places, an edited book by Nikos Passas and Neva Goodwin.21 In this collection, aptly entitled It's Legal but It Ain't Right, we learn of the harms committed by legitimate industries, including the tobacco, firearms, gambling, antiquities, pesticide, food, energy, and pharmaceutical industries. As the authors and editors in this collection point out, it is staggering that behaviors of these kinds, behaviors that cause grievous harm, are legal. We make the same argument in this text: that redefinitions are needed to reframe harmful acts as criminal, even if doing so defies tradition.

Through news stories we learn of unprofessional behavior as it takes place in the art world (paintings, photography, literature), cooking (recipe books), commercial aviation, the allied medical professions (including nurses, dentists, and medical doctors), politics, and religion (notably priests). There are two key points here: one is honesty and the other is trustworthiness. As to the latter, we hope and expect that people in charge of our wellbeing, such as airline pilots, medical professionals, and priests, will not harm us. Religious figures who commit sexual abuse of children are among the most startling and egregious offenders because they are supposed to be spiritual and kind-hearted. Yet the news is replete with stories of their misconduct,22 including the case of a Vatican cardinal who was charged with sexual abuse. But the question arises as to why they, more than airline pilots, doctors, and professors, cause more horror when we discover that they have harmed us. We trust them. And mostly our trust is warranted.

Of course, people who may or may not engage in work-related misconduct vary in important ways. Some are well-educated; some are stringently controlled by their unions, their employers, and their professional organizations; and some are freer to misbehave under artistic license and freedom from oversight. Some work closely with the public (priests, doctors, dentists, nurses) and some are more distant from the public (pilots, relatively speaking). But one thing we hope they all have in common is our societal need for honesty, a concern for the wellbeing of others, and our dependence on their trustworthiness. Yet we are sometimes betrayed, and this betrayal affects us in varying degrees. Below we offer a few illustrations of the many forms of intellectual property misconduct.

Art, music, and literature

The music world is not immune to unfair copying, as many cases have shown. To name three cases, we refer the reader to the Beach Boys' copying of Chuck Berry's 1958 hit "Sweet Little Sixteen"; the case of the song "Money (That's What I Want)," for which the writer of the song (Barrett Strong) went uncredited; and the reproduction without permission of Marvin Gaye's song "Got to Give It Up."23

In more detail, the well-known Beach Boys' song "Surfin' U.S.A." (1963) features lyrics by one of the Beach Boys, Brian Wilson, but the music was written by Chuck Berry. In a rare moment of justice, Berry's publisher, Arc Music, sued and won the copyright to Wilson's lyrics.24 Barrett Strong first recorded "Money" in 1959 and was originally listed as the writer of the song, which has made millions of dollars as recorded by other bands, notably the Beatles. But Strong never saw a penny of those profits. Strong was less troubled by the absence of profits than by the lack of credit he got for his work. His reaction is not unlike that of the computer science professor described below whose work was ripped off by Apple; he also wanted recognition more than the profits. In a more recent case of music plagiarism, Robin Thicke, Pharrell Williams, and T.I. recorded a song called "Blurred Lines" that drew heavily on Marvin Gaye's 1977 recording "Got to Give It Up," and in 2015 a jury found the copying unlawful. The Thicke et al. rendition earned $16 million in profits, to which Gaye's family objected, and the family was awarded $7.3 million.25 While it seems fair that Gaye's work was recognized as his, clearly the later, copied work paid off.

In May of 2017, late-night comedian Conan O'Brien learned that a lawsuit filed several years earlier alleging he had stolen jokes would proceed. Writer Robert Kaseberg, a prolific comedy writer whose work had been used by Jay Leno, had earlier claimed that O'Brien had used his original material, some of which was modified, but the setup and payoff were similar.26 Regardless of how such cases are resolved, they illustrate yet another realm in which the originality of material and its alleged theft make national news.

Richard Prince was found to have illegally used photographs from a 2000 book, "Yes Rasta" by Patrick Cariou, to create a series of collages and paintings. Prince used dozens of Cariou's photos to create a series of dystopian works, exhibited in a gallery, which generated over $10 million in sales. But that ruling was overturned when Prince argued that his use of Cariou's photographs was protected by "fair use" exceptions. The appeals court found that the re-use of the photos was "permissible under fair use because they 'have a different character' from Mr. Cariou's work."27 This sounds like the "substantial similarity" argument remarked upon in the Preface to this text: so long as the two works are distinct enough from each other, one cannot say that a copyright was violated. Left unsettled is the question of how distinct works like books or photographs must be, given that one work is derivative of another.

The experts, who quite literally wrote the books on their subject, were accused of falsifying the provenances of pieces they handled and sold. One of the accused asserted that their methods had saved priceless, irreplaceable pieces from the ravages of war and time, one of several types of rationalization commonly used by criminals, as we will discuss in Chapter 7. Scientists don't have a monopoly on faking evidence to achieve their ends, or on offering up weak excuses when caught.

Journalist Jonah Lehrer was regarded as a gifted writer. His articles were widely read and positioned him to author best-selling books. In 2012, however, it was discovered that he had plagiarized some of the material appearing in his blog posts.29 Upon closer examination, analysts found that he had fabricated quotes by Bob Dylan in his book. The books in question were pulled from the shelves and Lehrer lost his job at The New Yorker. Less than two years after his misconduct was exposed, Simon and Schuster gave him a contract to write a new book. If that weren't insult enough, the Knight Foundation paid Lehrer $20,000 to give a speech in which he told attendees he had made mistakes and had adopted a new system of organization that should eliminate future problems.30 Lehrer had abused his position, betrayed the trust of many, taken advantage of his employer and his readers, and failed to show genuine contrition for his misdeeds. Had Lehrer committed these very same acts as a scientist working under the support of a federal grant, he could have been charged with a felony.

One of the most disappointing findings is that Truman Capote's work In Cold Blood was at least partly fiction. Capote's book was presented as nonfiction, made into a movie, and used as background evidence against the two murderers of a farm family in Kansas. Much of the work was an invention, although Capote declared that every word of his book was true.31 This is an issue that will continue to rear its head as creative nonfiction becomes a more popular genre of literature.

In 2003, Doubleday published the memoir A Million Little Pieces by writer James Frey. The story purportedly chronicled Frey's own struggle with substance abuse. After Oprah Winfrey featured Frey and his story on her syndicated TV program, the book became a bestseller. Some observers were skeptical of aspects of the story, and some digging suggested the events had never happened. Oprah herself confronted Frey on a subsequent program, during which he admitted that portions of his story had been exaggerated.32 In spite of this betrayal of the public trust, in 2007 Simon & Schuster awarded Frey a three-book, seven-figure contract. This reinforces our point that offenders not only often escape justice, they are sometimes rewarded in the wake of the scandals they create.

In their book on writing serious nonfiction, literary agents Susan Rabiner and Alfred Fortunado33 lamented the blurring of lines between nonfiction and fiction. According to these agents, many academic writers who could turn their facts into a fascinating narrative resist the urge for fear of crossing the line between fiction and nonfiction. They note that other writers, like Ronald Reagan's biographer, have invented characters and even concocted false footnotes to make their cases more convincing. Rabiner and Fortunado warn their readers that this trend started some time ago, and that what we have now is a slippery slope upon which bending the borders between fact and fiction becomes easier and easier.

They admonish their readers: "You cannot introduce made-up facts or dialogue, no matter how trivial. You cannot put your words into someone else's mouth or attribute motive absent solid factual evidence of motive." While the focus of these literary agents is popular nonfiction, the concerns they raise carry over into the world of scholarship and science. Michiko Kakutani of the New York Times asserts that this blurring of the truth is part of a contemporary society in which there is less concern for facts than for entertainment.34

Business

In a clear-cut case of business misconduct, and of the problems with regulation and enforcement, a piracy claim was brought against Apple by David Gelernter. Gelernter, of Yale University, pioneered computer software that was misappropriated by Apple. Intellectual property theft seemed obvious, and a jury initially ruled in Gelernter's favor, yet the verdict did not stand and no punishment ensued. Why? Though Gelernter is powerful within his discipline, his power pales next to that of Steve Jobs. Among the evidence was an "internal Apple email from Steve Jobs that left one patent law expert not affiliated with the case saying, simply, 'Wow'." In the end, however, it was held that Gelernter's and his company's "patents were valid, but his company (Mirror Worlds) had not proved that Apple had infringed them."35 Apple had introduced new versions of its software, using new technologies that resembled and behaved more than a little like Professor Gelernter's invention. Among the documents obtained from Apple was an email from Mr. Jobs sent in 2001 to his employees after he saw an article in the New York Times about Gelernter's brainchild, Scopeware. Jobs wrote, "Please check out this software ASAP. … It may be something for our future, and we may want to secure a license ASAP." According to a patent law expert, the email from Mr. Jobs was "as close as it gets to a smoking gun." Gelernter "clearly had a case."

Gelernter appealed the decision against his claim. He said the money mattered to him, but it was more than that. He wanted the intellectual credit, on record, for his role in the creation of Scopeware. He knew he had been ripped off: "I know my ideas . . . when I see them on a screen," he said. He wanted vindication. There is something to that need for exposure and vindication. For scientists, the main point of our work is the discovery of something new, a discovery that takes years of hard work, effort, and creative genius. Most of us do not go into academics for the money; so, in a sense, our work products are all we have and all we want.

What may seem to some of us clear cases of intellectual property theft are muddied by the specific norms of various cultures. A mutual friend of the authors, a successful programmer and consultant, describes a common practice in his field:

A few weeks ago I decided I wanted to create a function to measure password strength. Could I have written it from scratch? Probably. Did I? Nope! That

Intellectual misconduct  11 would be a needless waste of time. I used the interwebs and had a choice of 3 or 4 perfectly good functions within about a minute. That’s how coding works today. And if you’re not making use of other people’s code you’re not doing it right. A very interesting paradox.36 In 2016 Mylan, the manufacturer of the EpiPen, a device on which allergic individuals rely to counter anaphylactic shock, came under fire for raising the price of the EpiPen beyond the means of ordinary citizens.37 What many observers don’t know is that Mylan’s CEO at one time claimed an MBA from West Virginia University (WVU). The university tried to award Heather Bresch the degree although she had not completed all the necessary requirements.38 Several insiders knew about the unearned degree, but were afraid to speak out for fear of reprisal. A panel convened to look into the matter concluded that she indeed had not earned the degree. Two university officials resigned as a result of the incident. It should be noted that Ms.  Bresch is the daughter of Joe Manchin, who at the time of the incident was West Virginia’s Governor. Moreover, Bresch had gone to high school with the president of WVU. What we see in this case is an attempt not only to claim unearned credentials, but to use clout and connections in an attempt to regularize the irregular. The faking of data is not solely a scholarly offense. Several years ago, it was discovered that there might be a serious problem with the functioning of automobile airbags. Motorists involved in minor auto accidents were getting killed or injured by pieces of metal shrapnel when their airbags deployed.39 All of the airbags in question had been manufactured by the Takata Corporation, one of the largest manufacturers of airbags in the world. As is the case with other auto defects that can pose a hazard, a recall was issued. What became known several years later was that senior officials at Takata had been made aware of the potential danger of the airbags, but had orchestrated the falsification of the data. Takata was forced to pay $1 billion in fines, victim compensation, and compensation to the auto manufacturers that had paid money for recall servicing. Three Takata executives faced federal charges in the U.S., and the company was forced to file for bankruptcy. This illustrates that the faking of data occurs outside academe, suggesting that scholars are not the only ones with the motivation and means to invent data out of whole cloth to achieve their ends. Given the reputation of corporate behavior, no one is surprised that corporations behave in ways that are detrimental to humans, nonhuman animals, and the planet, but beneficial to executives and shareholders. With that in mind, we present a few recent examples which are also discussed elsewhere in this text. But just to list a few corporate entities that engage in serious misconduct, think of pharmaceutical companies, banks and mortgage lenders, automobile parts manufacturers, and energy companies which include mining, oil extraction, and natural gas fracking. It is disheartening to know that corporations engage in practices that destroy lives and exploit the environment and to know that they go unpunished.40 At least in some cases, we have a clue as to how this happens when we examine the connection between corporations and bad science. Corporate cash can and does buy bad science in order to say what the corporations want to have said. To

name one example, climate change "reality" was influenced by a climate "scientist" who worked at the Harvard-Smithsonian Center for Astrophysics and whose work has been tied to corporate funding.41

Government and politics

We seem to trust our political figures less than we do other professionals, perhaps because politicians are not necessarily trained to be politicians, have no advisory boards to provide oversight, and in some cases have long histories of untrustworthiness. Here are some examples.

In 2016, the U.S. Department of Agriculture and the U.S. Department of Health and Human Services released the Dietary Guidelines for Americans, which list what the agencies consider to be good nutrition. Most of what the guidelines advise is not surprising: don't eat too much sugar, refined carbohydrates, or "bad" fats. But while the two departments knew that too much red meat in the diet was not good, they backed off from saying so because of pressure from the National Cattlemen's Beef Association, an industry trade group, and its supporters in the U.S. Congress. The nutrition "agencies' advisory committee initially sought a recommendation to cut back on such foods, which are associated with an increased risk of cancer as well as heart disease; this advice was deleted from the final guidelines after vehement protests" from the cattlemen's association and those beholden to them in Congress. In other words, the government agencies knew that eating too much red meat was harmful but refrained from saying so. This is dishonest, and it is harmful to public health.42

Also on the topic of public health, our second example takes place in Flint, Michigan, whose public water supply, switched to a new source in 2014, was found to be contaminated with lead.43 To save money, and at Governor Rick Snyder's request, the water supply was switched from the Detroit system, which was safe, to water from the Flint River, which was decidedly unsafe. As an important aside, Flint's population is primarily poor and Black, in short, people who have little power and who do not vote for a Republican governor. Within months of the switch, testing revealed E. coli bacteria in the water, and it was later discovered to contain dangerous levels of lead, which can cause irreversible brain damage in children. By 2015, the city of Flint had found high levels of trihalomethanes, a disinfection byproduct, in violation of the Safe Drinking Water Act. Yet the city website reported that this water was safe to drink. The Environmental Protection Agency sent memos stating that the city of Flint did not use chemicals to control the corrosion that causes lead to leach from water pipes into drinking water; these warning memos were not made public until the American Civil Liberties Union leaked them. After many tests by reputable parties, and many disclaimers by political figures hoping to protect themselves and the Governor, the water supply was switched back to the safe Detroit source in October 2015, but only after sustained demands from the public and from testing agencies. Indeed, in January 2016, it was revealed that three months earlier a state legislator had asked the state attorney general to launch a probe, yet

Intellectual misconduct  13 the request was rejected.44 As the case continues in the news, we learn that the officials were “dismissive” of complaints and that, as the problems grew in terms of reports about Legionnaire’s disease, lead poisoning, and so on, officials “belittled” the complaints.45 How else to deal with real human suffering than to say that this suffering does not exist, to dismiss and belittle the sufferers? This case led to the indictment of several state officials, but no amount of criminal processing could undo the harm caused by the negligence. During the 2016 Presidential campaign in the United States, Melania Trump, wife of then Republican nominee Donald J. Trump, gave an impassioned, articulate speech to the Republican National Convention in Cleveland, Ohio. During a process where every promise, denial and fact was being checked by the media, it was soon discovered that some of Melania’s text appeared to be borrowed from a speech given by Michelle Obama several years earlier.46 The Trump organization denied any wrongdoing, but a Turnitin analysis of both sets of text yielded irrefutable similarities. In time, a speechwriter confessed, saying she was so moved by Michelle Obama’s words that inspiration crept into her draft material. Although the speechwriter responsible for borrowing First Lady Obama’s words fell on her sword, it is difficult to understand how an organization responsible for promoting a Presidential candidate would not vet any and all material to be made public. Plagiarism, which is considered a serious scholarly offense, can be found anywhere we find the written word. Plagiarism again visited the Donald Trump camp during the post-election transition. As the President-Elect was in the process of making appointments, he selected Monica Crowley, a columnist and political commentator, for a post on the National Security Council. After it was discovered that Ms Crowley had plagiarized portions of her 2012 book about the Obama administration, What the (Bleep) Just Happened,47 she withdrew from further consideration. The electronic version of her book, which had sold at least 20,000 copies in hardcover, was pulled by her publisher, Broadside Books, an imprint of HarperCollins. It was not the first time Ms Crowley had been accused of misappropriating the words of others. The Trump team also demonstrated that the faking of factual information infects politics as much as academe. During the 2016 Presidential campaign, Donald Trump asserted that he witnessed thousands of Muslims cheering in New Jersey after the 9–11 attacks in 2001.48 No proof was ever offered for this assertion and when challenged, the controversial candidate doubled down on his claim. In order to explain his loss of the popular vote, Trump complained that millions of undocumented immigrants had voted illegally. Again, he offered no evidence for such an outlandish claim. What was labeled as “alternative facts” by the Trump team49 is akin to fabricated data in science and scholarship. We live in a world in which integrity and honesty have taken second place to entertainment and vanity. As we put the finishing touches on this manuscript in the summer of 2017, we, the authors, and the world, are stunned by the falsehoods that are passed off as “truth” by a presidential administration that seems unable to discern the difference between the two and, worse, does not care that they foment falsehoods so long as their desires are met.
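For readers curious about how a text-matching analysis like the one just mentioned reaches its conclusion, the sketch below is our own illustration, not Turnitin's proprietary method; the sample sentences, the four-word window, and the function names are invented for demonstration. Even a crude count of the word sequences two passages share is enough to flag suspicious similarity for human review.

# A minimal, purely illustrative sketch (ours, not Turnitin's algorithm):
# count the word n-grams two texts have in common.

def word_ngrams(text, n=4):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(text_a, text_b, n=4):
    """Fraction of text_a's n-grams that also appear in text_b."""
    grams_a, grams_b = word_ngrams(text_a, n), word_ngrams(text_b, n)
    return len(grams_a & grams_b) / len(grams_a) if grams_a else 0.0

text_a = "the quick brown fox jumps over the lazy dog near the quiet river bank"
text_b = "a quick brown fox jumps over the lazy dog near a quiet river bank today"

print(round(overlap_score(text_a, text_b), 2))  # 0.55 for these toy sentences

Real services add indexing of enormous document collections and smarter matching, but the underlying logic, shared strings of words too long to be coincidental, is the same in spirit.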

Even those reporting on government figures are susceptible to factual missteps. In June 2017, CNN published a story alleging that a U.S. Senate committee was investigating a Russian bank linked to one of President Trump's associates. CNN's three-part system for verifying facts had broken down.50 The network issued a rare retraction and the three reporters involved in the story were forced to resign. Retractions, one means of correcting the official record, are not solely the province of academe.

Although these examples are from the U.S., the pilfering of material for speeches and written work is not exclusively an American phenomenon. In 2016, it was alleged that Mexico's President Enrique Peña Nieto had plagiarized part of the thesis submitted for his law degree in the 1980s at Pan American University.51 Analysis showed that close to a third of the thesis was borrowed from the works of others without attribution. The Mexican president is one of a number of visible figures in the international community who have been accused of plagiarism in recent years. Why do prominent people hide the seamier aspects of their histories? Why do they think they can get away with it? Perhaps because they often do.

Conclusions

As we saw above in the discussion of scientific and other misconduct, technology enables misconduct. For example, we expect locksmiths to fix our locks for the price quoted, only to discover that some are unqualified or that they demand several times the agreed-upon price.52 Google, wittingly or not, abets the dishonesty by returning listings for an unlimited number of repair companies, some of which may not even exist; a search for glass repairers, for instance, turns up over 3,000 fake auto glass repair listings in the U.S. Among the platforms that unsuspecting users rely on to find these bogus providers are Google My Business, Google's version of the telephone Yellow Pages, and Mapmaker, Google's crowdsourced online map of the globe.53

Several common threads run throughout these disparate examples of misconduct. There are identifiable ends that serve as incentives for the protagonists. Melania Trump no doubt wanted to deliver a moving, memorable speech to win over the Cleveland crowd. Jonah Lehrer wanted to solidify his reputation as a science journalist with best-selling books. For Mylan's CEO, claiming an unearned MBA could provide credibility in the world of business. The Takata Corporation was more concerned with the bottom line than with consumer safety. The point is that the structure of modern society provides powerful incentives to engage in deviant behavior. As we will argue throughout this analysis, most if not all of these spheres of endeavor would collapse under the strain if most individuals gave in to the temptation to take shortcuts. Most important, intellectual misconduct is not confined to the academy.

The picture of occupational and professional deviance is too fast-moving and too complex for anyone, including the authors of this book, to keep up with. For that reason, we can offer only an overview of such misconduct. By the time this manuscript appears in print, many more occupations and many more acts of misconduct will be in the news. It is overwhelming.

Notes 1 Carey, Benedict (2015a). “Science Under Scrutiny.” New York Times, June 6, pp. D1, D3. 2 Ibid., p. 13. 3 Ibid., p. 13. 4 Merton, Robert K. (1938). “Social Structure and Anomie.” American Sociological Review, 3, 672–82 and Merton, Robert K. (1957). Social Theory and Social Structure. New York, NY: Free Press. 5 Durkheim, Emile (1951). Suicide: A Study in Sociology. New York, NY: Free Press. 6 Broad, William & Wade, Nicholas (1982). Betrayers of the Truth: Fraud and Deceit in the Halls of Science. New York, NY: Simon and Schuster. 7 Huxley, L. (1900). Life and Letters of Thomas Henry Huxley. MacMillan: London. 8 For a comprehensive overview of RCR, including its history, see Adil E. Shamoo & David B. Resnik, Responsible Conduct of Research, 3rd Edition. New York: Oxford University Press. 9 Choney, S. (2011). “Steal this Report: College Plagiarism Up, Says Pew Report.” MSNBC, August 30. Available online at: http://digitallife.today.com 10 Segal, David (2015). “Rousting the Book Pirates from Google.” New York Times, August 30, p. 3 of Business Section. 11 Ibid., p. 3. 12 Ibid., p. 3 13 Kolata, Gina (2017). “A Scholarly Sting Operation Shines a Light on ‘Predatory’ Journals.” New York Times, March 22. Available online at: www.nytimes.com/2017/03/22/ science/open-access-journals.html?_r=0 (accessed 7-10-2017). 14 Carey, Benedict (2015b). “Psychology’s Fears Confirmed: Rechecked Studies Don’t Hold Up.” New York Times, August 28, pp. A1, A13. 15 Carey (2015a). p. D1. 16 Overbye, Dennis (2015). “Embattled Astronomer Is Leaving Berkeley.” New York Times, October 15, p. A21. 17 Harmon, Amy (2016). “Chicago Professor Resigns amid Sexual Misconduct Investigation.” New York Times, February 3, p. A11. 18 See also Hartocollis, Anemona (2016). “Professors’ Group says Efforts to Halt Sexual Harassment Have Stifled Speech.” New York Times, March 24, p. A21. 19 Bursik, Bob (2015). Personal communication, November  18. Available online at: https://en.wikipedia.org/wiki/Erich_Goode 20 Glanz, James & Armendariz, Augustin (2017). “Years of Ethics Charges, but a Star Cancer Researcher Gets a Pass.” New York Times, March 8. Available online at: www. nytimes.com/2017/03/08/science/cancer-carlo-croce.html (accessed 7-10-2017). 21 Passas, Nikos & Goodwin, Neva (eds.) (2007). It’s Legal but it Ain’t Right: Harmful Social Consequences of Legal Industries. Ann Arbor, MI: University of Michigan. 22 See, for example, Seattle Times Editorial Board (2016). “Open the Secret Files on Clergy Sexual Abuse of Minors in Western Washington.” February 3. Available online at: www.seattletimes.com/opinion/editorials/open-the-secret-files-on-clergy-sexualabuse-of-minors-in-western-washington 23 Rohter, Larry (2013). “For a Classic Motown Song About Money, Credit Is What He Wants.” New York Times, September 1, pp. A1, A4; Sisario, Ben (2015). “Side Issues Intrude in ‘Blurred Lines’ Case.” New York Times, March 2, pp. B1, B3; Sisario, Ben & Smith, Noah (2015). “Hit Single Plagiarized 1977 Song, Jury Rules.” New York Times, March 11, pp. B1, B2.

16  Intellectual misconduct 24 Studwell, William Emmett & Lonergan, David F. (1999). The Classic Rock and Roll Reader: Rock Music From Its Beginning to the mid-1970s. Psychology Press, p. 81. Available online at: https://en.wikipedia.org/wiki/Sweet_Little_Sixteen 25 Sisario & Smith (2015). 26 Deb, Sopan (2017). “Conan O’Brien to Face Joke-Theft Allegations in Court.” New York Times, May  15. Available online at: www.nytimes.com/2017/05/15/arts/television/conan-obrien-joke-theft-allegations.html (accessed 7-10-2017). 27 Kennedy, Randy (2013). “Court Rules in Artist’s Favor.” New York Times, April 26, pp. C23, C30. 28 Blumenthal, Ralph  & Tom Mashberg (2017) “Expert Opinion or Elaborate Ruse? Scrutiny for Scholars’ Role in Art Sales.” New York Times, March 30. Available online at: www.nytimes.com/2017/03/30/arts/design/expert-opinion-or-elaborate-ruse-scrutiny-for-scholars-role-in-art-sales.html (accessed 7-10-2017). 29 Bosman, Julie (2012). “Jonah Lehrer Resigns from The New Yorker after Making Up Dylan Quotes for His Book.” New York Times, July 30. Available online at: https://mediadecoder.blogs.nytimes.com/2012/07/30/jonah-lehrer-resigns-from-newyorker-after-making-up-dylan-quotes-for-his-book/?action=click&contentCollectio n=Media&module=RelatedCoverage®ion=Marginalia&pgtype=article (accessed 7-11-2017). 30 Schuessler, J. (2013). “Plagiarism pays: Jonah Lehrer gets $20,000 for speech.” New York Times, February  12. Available online at: http://artsbeat.blogs.nytimes.com/2013/02/12/ plagiarism-pays-jonah-lehrer-gets-20000-for-speech/ (accessed 8-15-2016). 31 Hood, Michael (1999). “True Crime Doesn’t Pay: A Conversation with Jack Olsen.” Point No Point Winter 1998–99. Available online at: https://en.wikipedia.org/wiki/ Truman_Capote#Veracity_of_In_Cold_Blood_and_other_Nonfiction 32 Kakutani, Michiko (2006). “Bending the Truth in a Million Little Ways.” New York Times, January 17. Available online at: https://query.nytimes.com/gst/fullpage.html?re s=9C0CEEDA143FF934A25752C0A9609C8B63 (accessed 7-10-2017). 33 Rabiner, S. & Fortunado, A. (2002). How to Write Great Serious Nonfiction – and Get it Published. New York: W.W. Norton. 34 Kakutani (2006). 35 Schwartz, J. (2011). “Pursuing a Piracy Claim Against Apple.” New York Times, November 5, p. 12. 36 Rack, Phil (2014). Personal communication with Mark Davis. 37 Carroll, Aaron E. (2016). “The EpiPen, a Case Study in Health System Dysfunction.” New York Times, August 23. Available online at: www.nytimes.com/2016/08/24/ upshot/the-epipen-a-case-study-in-health-care-system-dysfunction.html (accessed 7-12-2017). 38 Sterbenz, Christina (2016). “The CEO of EpiPen maker Mylan once claimed she had an MBA that she never earned.” Business Insider. Available online at: www.businessinsider.com/mylan-ceo-heather-bresch-west-virginia-university-mba-scandal-2016-8 (accessed 7-9-2017). 39 Soble, Jonathan (2014). “Toyota Widens Recall of Cars with Takata Airbags.” New York Times, November  27. Available online at: www.nytimes.com/2014/11/28/ business/international/toyota-widens-recall-of-cars-with-takata-airbags.html (accessed 7-11-2017). 40 Uhlman, David M. (2013). “Prosecution Deferred, Justice Denied.” New York Times, December 14, p. 21; Uhlman, David M. (2015). “Justice Falls Short in G.M. Case.” New York Times, September 20, p. 5; Conniff, Richard (2015). “Revenge of the Jetta.” New York Times, September 27, p. 5. 41 Gillis, Justin & Schwartz, John (2015). “Deeper Ties to Corporate Cash for Doubtful Climate Researcher.” New York Times, February 22, pp. 
A1, A15. 42 Brody, Jane E. (2016). “Guidelines, if Not Clarity, for U.S. Diets.” New York Times, January 19, p. D5.

Intellectual misconduct  17 43 Lin, Jeremy C. F., Rutter, Jean & Park, Haeyoun (2016). “Events that Led to Flint’s Water Crisis.” New York Times, January 21. Available online at: www.nytimes.com/ interactive/2016/01/21/us/flint-lead-water-timeline.html?_r=0 (accessed 7-11-2017) 44 Rappleye, H., Seville, L. R. & Connor, T. (2016). “Bad Decisions, Broken Promises: A  Timeline of the Flint Water Crisis.” NBC News, January  19. Available online at: www.nbcnews.com; Bosman, Julie & Smith, Mitch (2016). “Governor Offers Apology to Flint for Water Crisis.” New York Times, January 20, pp. A1, A14. 45 Bosman, J., Davey, M. & Smith, M. (2016). “As Water Problems Grew, Officials Belittled Complaints from Flint.” New York Times, January 21, pp. A1, A17. 46 Haberman, M., Rappeport, A., Healy, P. & Martin, J. (2016). “Questions over Melania Trump’s Speech Set off Finger-Pointing.” New York Times, July 19. Available online at: www.nytimes.com/2016/07/20/us/politics/melania-trump-speech.html (accessed 7-11-2017). 47 Alter, Alexandra (2017). “HarperCollins Pulls Book by a Trump Pick after Plagiarism Report.” New York Times, January  10. Available online at: www.nytimes. com/2017/01/10/business/harpercollins-pulls-monica-crowley-book-for-plagiarism. html (accessed 7-9-2017). 48 Kessler, Glenn (2015). “Trump’s outrageous claim that ‘thousands’ of New Jersey Muslims celebrated the 9/11 Attacks.” Washington Post, November 22. Available online at: www.washingtonpost.com/news/fact-checker/wp/2015/11/22/donald-trumps-outrageous-claim-that-thousands-of-new-jersey-muslims-celebrated-the-911-attacks/?utm_ term=.001529844d41 (accessed 7-11-2017). 49 Rutenberg, Jim (2016). “ ‘Alternative facts’ and the Cost of Trump-Branded Reality.” New York Times, January 22. Available online at: www.nytimes.com/2017/01/22/business/media/alternative-facts-trump-brand.html (accessed 7-9-2017). 50 Ember, Sydney & Grynbaum, Michael. M. (2017). “At CNN, Retracted Story Leaves an Elite Reporting Team Bruised.” New York Times, September 5. Available online at: www.nytimes.com/2017/09/05/business/media/cnn-retraction-trump-scaramucci.html (accessed 12-31-2017). 51 Associated Press (2016). “University: Mexican President Copied Texts in Thesis.” 52 Segal, David (2016). “When Locksmiths Pick Pockets.” New York Times, January 31, pp. 1, 4, 5. 53 Ibid.

2 The world of scholarship: Rituals and rewards, norms and departures

Making sense of scholarly misconduct, whether it is regarded as criminal or not, requires an understanding of the social organization of scholarly work and the role that scholars play in society. The world in which scholars operate is complex, one that requires advanced training and places heavy demands on its members, particularly those early in their careers at research-centric institutions. In this chapter we discuss this world, how it continues to change, and its impact on scholarly misconduct.

Most people who identify as scholars work in colleges or universities. There are independent scholars,1 to be sure, and we do not mean to imply that they or their work are unimportant, or that they are incapable of engaging in misconduct. Our analyses focus on the majority, those who participate in the larger enterprise of what we refer to as organized scholarship, that is, research and writing that takes place within institutions such as colleges, universities, and other research facilities. The relevant actors include professors, research faculty who conduct research but don't teach, research assistants and technicians, postdoctoral trainees, and other personnel who contribute in one way or another to scholarship and science.

Those who aspire to a career as a scholar work toward a doctoral degree. This includes the Doctor of Philosophy (PhD) as well as other research doctorates such as the Doctor of Business Administration (DBA), Doctor of Public Health (DPH), Doctor of Science (ScD), Doctor of Nursing Science (DNS), and Doctor of Education (EdD). It also includes the Doctor of Medicine (MD) and Doctor of Osteopathy (DO), degrees most often associated with physicians and surgeons but which are also common credentials for researchers in the biomedical sciences. A research doctorate is the union card for academic scholars, and it is during doctoral training that most candidates gain exposure to the norms of academic work. In any of these forms, the doctorate is the recognized and necessary legitimation of scholarly and scientific work. There are some exceptions to the rule, such as those with a terminal master's degree (an MFA, for example) who make award-winning contributions to their respective fields. But in general, a PhD or other earned doctorate is necessary to gain entrée and respectability. As we will see in Chapter 4, there are countries and cultures in which having a doctorate is considered so desirable and prestigious that its importance extends well beyond academe.

Graduate and professional schools are the training grounds for would-be scholars. Students hope for acceptance into the most prestigious programs in their chosen field. There is a definite pecking order in graduate programs, and this order is expressed in such formal rankings as those listed in U.S. News & World Report.2 The better the program a person enters, the better the chances that person will be able to work under an eminent mentor, and the better the chances he or she will be considered for a coveted job upon graduation. We mention this because the process of getting into these programs is competitive, and this competition is a harbinger of the environment that will characterize scholarly and scientific careers.

It is during the years of graduate training for the doctorate that students become acquainted with the so-called norms of scholarship and science. We use the term "so-called" because the existence and forms of such norms are a subject of some disagreement. Sociologist Robert K. Merton, regarded as the father of the sociology of science,3 identified norms which he argued prevail in the world of science and scholarship. One type, technical norms, governs how science and scholarship are done. For example, there are norms that dictate how to set up an experiment, how to randomly assign subjects, how to analyze the data, and how to report the results. Each field of study has its own set of technical norms, and these are learned during the graduate or professional school experience, as well as during post-graduate training.

Merton's four moral norms, in contrast, govern the transparency and honesty with which scholarly work should be shared and communicated.4 Universalism suggests that scientific claims should be evaluated without regard to the location or characteristics of the scientist. Communalism suggests that scientific findings belong to the community of scientists; this moral norm argues against secrecy and for communicating scientific findings as widely as possible. Disinterestedness connotes the special objectivity with which all scientific work should be pursued. Lastly, organized skepticism expresses itself in such practices as peer review, wherein scientific findings are subjected to close scrutiny by the scientific community. These norms call for objectivity, openness, and honesty, and they are the ones violated by the behaviors under analysis in this book.5

We raise Mertonian norms here because it is during training in graduate or professional school that candidates not only take courses in their field of specialization but also become involved in research projects in which they gain hands-on experience with the nuts and bolts of research. It is through conducting research that professors and other mentors expose students to the various norms of scholarship. Implicit in this training process is the crucial role played by mentors. That is, if we accept that scholars in training are, at least to some extent, malleable pieces of clay that can be molded during this process, then the sculptor, the PhD advisor or other mentor, is an important influence. It is easy to see that these years in training are formative in developing the trainee's sense of how to conduct scholarly work and of which practices are acceptable and unacceptable. These norms regarding how research is conducted, both technically and ethically, are important, but as we will argue throughout this book, general societal norms

20  The world of scholarship that govern the behavior of scholars and non-scholars alike are far more important to the fair and trustworthy conduct of scholarly inquiry. One of the lessons graduate and professional students learn early is that publication of articles in refereed journals is necessary for success in traditional academic jobs. It’s fine for PhDs to excel as instructors in the classroom, but it is publication in peer-reviewed periodicals that translates into good job offers, and later, retention and promotion. Published papers are the coin of the scholarly realm. Students who start their publication records while still in training have a definite advantage over their counterparts who either procrastinate or who do little scholarly work at all. Prior to graduation with the doctorate, those interested in a teaching position apply for available openings. As one might expect, often there are more candidates than openings, and thus there is competition. Graduate students with the best credentials – those from the best programs with the best publication records who have studied under prominent scholars in their field – will fare the best on the job market.

2.1  In the beginning (as in the end), it's all about money

The following was part of a 2015 University of Maine job advertisement for a tenure-track faculty position in sociology/criminology.6 It illustrates the structural emphasis the academy now places on the acquisition of research funding by faculty. We should note that the University of Maine is not unique in imposing such a standard.

Required qualifications:

• Ph.D. in Sociology preferred (Ph.D. in Criminology/Criminal Justice with strong background in sociology also encouraged to apply)
• Demonstrated or potential excellence as an instructor of undergraduate sociology students
• Demonstrated or potential strength in published scholarship [italics ours]
• Strong potential for extramural funding [italics ours]
• Demonstrated commitment to public/engaged sociology
• Clear potential to work both independently and in collaboration with students, colleagues, administrators, community members, and other stakeholders

Preferred qualifications:

• Demonstrated success in obtaining extramural funding for research [italics ours]
• Experience collaborating on research projects with undergraduate students

Once new scholars land an academic position, their role involves several facets. One, of course, is teaching. Faculty have to prepare and deliver class lectures, give assignments, administer and grade exams, evaluate term papers, mentor graduate students, and supervise teaching assistants. This is the role people most often associate with being a professor, but, in the eyes of large research universities, it is not the most important role. Faculty members can be adequate or even substandard lecturers if they excel at research and publication.

A special distinction belongs to the research function. College and university faculty are expected to engage in research and publish their findings in scholarly journals or books. As we have noted, most learn how important this is during graduate or professional school, as well as while serving postgraduate fellowships, and those who do are best equipped to meet this obligation. In some cases, university scholars will apply for funding from federal, state, or private sources to support their research agendas. Regardless of how they approach this responsibility, it is the one they must take most seriously because it is their published scholarship upon which they will primarily be evaluated for tenure and promotion.

This research function takes different forms depending on discipline. In the sciences, researchers hypothesize a relationship between variables, which is tested through an experiment designed to control outside influences. The researcher collects data on both the experimental and the control subjects, whether they be human or nonhuman animals. In the social and behavioral sciences, the investigator may devise and administer a survey on some topic of interest. These data are entered into computers and analyzed using statistical software. Provided the results are worthy of publication, the researcher will prepare one or more manuscripts for submission to scientific or scholarly journals.

The research process seems straightforward, but in practice there are numerous decisions the researcher must make along the way. Which measures should I use to assess the phenomenon in question? Should I address missing survey data by plugging in mean values from other subjects?7 (A minimal illustration of this particular shortcut appears at the end of this section.) Although hypotheses should be proposed prior to any data analysis, is "moving the goalposts," that is, exploring the data for significant results and then generating compatible hypotheses after the fact, common and acceptable among researchers?8 Has a related study been conducted on this question, and what can my study add to the knowledge base?

Most institutions expect their scholars also to perform various types of service. This includes serving on departmental committees such as those charged with overseeing curricula or graduate student admissions. It also includes service to one's discipline in the form of active involvement with scholarly associations through presenting papers, abstracts, and posters at conferences. Some disciplines and institutions include service to the local community as part of this obligation. One common form of service scholars perform for their respective disciplines is to volunteer as reviewers for journals. When papers are submitted to scholarly journals, the editors review the submissions for suitability. If the papers pass that initial review, they are sent out to referees, scholars who are expert in the subject matter and who read and evaluate them.
Publishers of journals cannot afford to pay reviewers to evaluate manuscripts, so those who participate in this process

22  The world of scholarship do so voluntarily. It’s a thankless task and it can consume many uncompensated hours, but most reviewers do it because others do it for them. The role of referee, however, is a two-edged sword. On the one hand, referees have the opportunity to read the latest contributions of their peers, enabling them to keep abreast of the most recent developments in their field. Unfortunately, this process also gives unscrupulous reviewers access to ideas they might like to misappropriate. Related to the peer review of journal manuscripts is the review of research grant proposals. Funding organizations such as the National Institutes of Health and the National Science Foundation in the U.S. rely on outside expert reviewers to evaluate proposals submitted by researchers. Unlike scholarly journals, funding organizations compensate the experts they engage as peer reviewers. The stipend is modest – often less than the scholar could earn through other activities – but it represents an attempt to cover the reviewer’s time and effort. In some forms of grant review, reviewers are asked to meet with their fellow reviewers to participate in a face-to-face discussion of the proposals under review, and in such cases their travel expenses are also covered. Of all the activities that make up an academic career, those related to research and publishing are most important for meeting the competitive demands that are part and parcel of organized scholarship. As we will see, these demands have increased over time, creating an ever-rising bar for would-be scholars.
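To make concrete the missing-data question raised earlier in this section, here is a minimal sketch, written by us purely for illustration, of the mean-imputation shortcut. The variable names and survey values are invented and reflect no actual study.

# Invented survey data: two respondents skipped the question.
import pandas as pd

survey = pd.DataFrame({
    "respondent": [1, 2, 3, 4, 5],
    "hours_studied": [10.0, None, 7.0, None, 13.0],
})

# The convenient shortcut: fill each missing answer with the mean of the
# observed answers (here, the mean of 10, 7, and 13 is 10).
survey["hours_imputed"] = survey["hours_studied"].fillna(survey["hours_studied"].mean())
print(survey)

The shortcut is easy, but it quietly shrinks the variance of the measure and, when nonresponse is not random, can bias whatever estimates are built on it. The ethical question is less the technique itself than whether the choice is disclosed.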

A note on trust in scholarship and science Throughout this book we emphasize how scholarly misconduct violates not only various norms, but also the implicit trust that scholars must have in one another if they are to share their contributions with journal and book editors, reviewers, and others. Referred to as one of the more important elements of social glue,9 trust is critical for scholars and scientists inasmuch as research would grind to a halt without it.10 Opportunities for trust and its betrayal permeate the world of scholarship and science. Scholars submitting manuscripts to journals trust that the editor and referees will not only evaluate the submission competently and fairly, they also must trust that their ideas and the expression of those ideas will not be misappropriated. A  scholar in the U.S. doesn’t have to know a scholar in The Netherlands who will be reviewing the former’s manuscript for a journal. The trust is implicit by the assumption of the roles of scholar and journal referee. Likewise, those who submit grant proposals to funding organizations trust that their work will not be stolen by reviewers. Another form of trust is what we term personal trust. It is a form of trust that grows not from formal role expectations such as those that govern scholarly activities, but from knowing individuals of good reputation and intention who behave ethically, who don’t exploit, and who won’t tolerate unethical qualities in others. In some cases, we know such individuals personally, and in other cases these persons are friends or acquaintances of those we know well and whose judgments we

The world of scholarship  23 trust. We will revisit the importance of personal trust when we discuss informal social control in Chapter 9. One of the arguments we make in this book is that the so-called sacred trust in scholarship and science is an ideal to which scholars should aspire. But the reality of scholarly work, as evidenced by numerous verified instances of misconduct, should disabuse us of any naïveté about the pure intentions of this community. It is time to recognize that scholars are just as susceptible to the same pressures and temptations as priests, pharmacists, politicians, and others.

The changing environment of scholarship Now that we have an understanding of the nature of scholarly work, we turn to recent changes to the landscape. These are important to our understanding of the social and cultural forces that facilitate scholarly misconduct. One of the metrics in contemporary scholarship is the impact factor (IF), a score which reflects the extent to which a journal has had an impact on the field it represents. The higher the impact factor of a journal, the more prestigious it is to have one’s work appear in it. So important has this become that some scholars record in their curricula vitae (CVs) the IFs of the journals in which they publish. We mention IFs because they are another indicator of the growing competition within organized scholarship. So not only do scholars need to publish, at more prestigious institutions they are expected to publish in journals with higher IFs. If getting one’s work into print wasn’t difficult enough, scholars now must aim for a higher bar. The result is an increasingly competitive environment in which a large number of aspirants strive for space in prestigious publications. Related to the pressure to publish is the limited space in the high-impact journals. In the authors’ field – criminology – it is considered desirable to publish in Criminology, the flagship journal of the American Society of Criminology. Anyone can submit manuscripts for consideration to this journal, but it accepts something like 15% of all submissions. This tells us that scholars working at schools that expect faculty to publish their work in such journals face tremendous odds and tough competition. The same is true of other prestigious journals in any particular specialization. Earlier we mentioned the advent of questionable journals that publish the work of scholars for a fee. These are an outgrowth of the open journal, a journal whose publisher is committed to making the content freely available to a greater number of potential users. Legitimate open journals employ peer review of submissions. Permitting greater access to the scholarly periodical literature is a laudable goal, but there is a small catch. In order to offer this feature to scholars and to the public, open-access journals require that those submitting their work pay a publication fee. Scholars with research budgets from grants or other sources have less problem covering these fees than scholars without such funding sources. Those without these advantages may find such fees a hardship or impediment to submitting work to open-access journals.
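To put some arithmetic behind the impact factor described at the start of this section, here is a back-of-the-envelope sketch of its most common, two-year form. The counts are invented for illustration; actual impact factors are computed from Web of Science citation data for the Journal Citation Reports.

# Two-year impact factor: citations received this year to items the journal
# published in the previous two years, divided by the number of citable
# items it published in those two years. The numbers below are invented.

def two_year_impact_factor(cites_to_prior_two_years, citable_items_prior_two_years):
    return cites_to_prior_two_years / citable_items_prior_two_years

# A journal that published 150 articles over the two prior years and drew
# 450 citations to them this year would report an impact factor of 3.0.
print(two_year_impact_factor(450, 150))

The point for our purposes is not the formula but what it rewards: publication in venues whose recent articles are heavily cited, which is precisely the pressure described above.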

24  The world of scholarship There now are multiple publishers – and we use the term publisher loosely – that prey on scholars who are either naïve or desperate for publication. These predatory journals differ from their respectable open-access counterparts in that they fail to employ legitimate peer review of submissions. The review is perfunctory, expedited, or nonexistent, and acceptance of one’s manuscript, once payment is made, is guaranteed. There is a lack of transparency about the required fees for publication which can run in the hundreds of dollars. The result is sticker shock for unsuspecting scholars. As we noted earlier, Robert K. Merton, the same sociologist who explicated the norms of science, argued that when society places greater emphasis on socially desirable ends than the means to achieve those ends, we can expect at least some individuals to take deviant paths to get what they want.11 If universities make job security and promotion dependent upon published work without regard for how faculty achieve those goals, we should expect some to choose deviant means of getting there. This competition in scholarship has resulted in what is perhaps a predictable perversion of the open-access journal system. Realizing that scholars need to publish in order to survive and achieve, a spate of companies have come into existence that offer publication in their journals for a fee. It’s an easy, if expensive, way to pad one’s CV with published papers. This could prove attractive to scholars who are under pressure from superiors to publish in order to maintain their employment, but who work in an environment that places greater emphasis on the ends than the means. As a result, the U.S. Federal Trade Commission has moved against one of the more prominent publishers of these predatory journals, OMICS, for misrepresenting their services.12 The publishers of such journals are taking advantage of the structural pressure to publish and exploiting naïve or desperate scholars, not to mention cluttering the literature with shoddy scholarship that contributes little or nothing to the general body of knowledge.

2.2  Legitimate, predatory, or something else?

The following is from an email solicitation sent to Mark Davis. According to the journal's website, the Article Processing Charge ranges from $549 to $749. Their website advertises "Rapid peer-review" and "Publication in three weeks." Note the grammatical errors and solicitous compliment. We'll let the reader decide the legitimacy of this offer.

Dear Dr. Mark S Davis,

Thank you for taking time out to read our email. Well, we are running short of articles to release the Volume 1 – Issue 2 of our Biomedical Journal of Scientific & Technical Research on time and this was due to my health conditions I could not follow up the articles properly.

So now, I was pushed by my editorial team to publish a minimum number of articles to the sphere of this issue. I would like to open the door of our upcoming issue with your eminent research work. In this instance, I need at least a single article from your end and it can be either a 2 Page opinion/Mini Review/Short Communication/Proceeding etc. I hope you will definitely help me out in this instigate.

Await your article submission.

Regards,
[Editor's name omitted]

The socialization process for scholars and the recent developments in their world form a picture of organized scholarship as competitive, driven by pressure for limited prestigious positions, job security, promotion and awards. Universities wittingly and willingly participate in this process by institutionalizing this pressurized competition through their hiring, retention, and promotion practices, something we will discuss in the next chapter. We can argue the rightness or wrongness of such policies, but the result is a complex system that sets the stage for ethical and legal breaches. Over time the bar representing what is required to attain success in the world of scholarship has continued to rise. One way in which the productivity of scholars has been affected is by the extent to which they participate in teams. Whereas the multiple authorship of journal articles was rare 50 years ago, it is now commonplace. Pick up any journal in which criminologists publish and it is not uncommon for articles to have four, five, six or more authors. In the biomedical sciences, authors often number in the dozens. Multiple authorship offers some very practical benefits. First, the individual participating in such a team need only be responsible for a small segment of what is required to complete a manuscript. For some people, this might be conceiving of the original idea. Or it could be supplying the data upon which the analysis is based. It might be performing the actual statistical analysis of the data. It could be writing a draft of the paper or a certain section of the paper. Or it could be some combination of these. When a scholar has only to be responsible for specific tasks instead of the whole paper, she can be more productive. This means credit for a greater number of publications with less effort. There are scholars in criminology and criminal justice who average 15–20 published papers or more per year, when the average in the discipline is much less. A CV that grows at such a rate translates into guaranteed and perhaps early tenure and promotion for an untenured assistant professor, and earlier promotion to full professor for a tenured associate professor. This in turn leads to higher salaries, as well as increased opportunities to collaborate, participate in grant reviews,

26  The world of scholarship consulting, speaking engagements, invitations to lecture abroad, and other paid opportunities. We do not mean to imply that there is anything suspect or inappropriate in multiple scholarship. Dividing and conquering research projects makes good sense for busy scholars who must prep and teach courses, advise students, and perform university and community service. We mention it because it has changed the complexion of the scholarly role which, as we will argue, can be related to scholarly misconduct. Another noteworthy change is the emphasis that universities and other research centers place on the pursuit of research grants.13 A scholar in the social sciences can still carve out a career by analyzing existing data such as those archived at the Inter-university Consortium for Political and Social Research (ICPSR), which enable scholars to download and analyze data and produce publishable papers without the need to pursue grants. This is not the case, however, for researchers in the biomedical sciences in which studies yielding publishable results require hundreds of thousands if not millions of dollars of federal or industry support for the collection of new data. There is no question that those scholars who bring the most grant money into their host institutions fare the best financially. Not all scholars chase grant money; indeed, many choose a more laid-back academic existence dominated by teaching and engaging in forms of research and writing that do not require huge sums of grant money for data collection. But there are ambitious scholars who become quite adept at applying for and securing large research grants, and research institutions value these go-getters for a very important reason. Research grants such as those awarded by the National Institutes of Health (NIH) or the National Science Foundation (NSF) carry what are known as facilities and administrative (F&A) or indirect costs. These represent costs the university incurs to administer a grant that cannot be covered by direct project costs. Included in F&A costs are those represented by the institution’s investment in buildings, maintenance and libraries that cannot be billed to the grant as direct costs.14 Universities apply for and receive a federally-approved F&A rate which consists of a percentage of direct costs. At Ohio State University, for example, the federally approved F&A rate for the period July 2017 through June 2018 is 55%.15 This means that a federal grant of $100,000 in direct research costs will result in a total grant award of $155,000. So, for obvious reasons, these so-called “indirect” costs are attractive to research institutions as a source of revenue. Thus, those capable of bringing in such grants are rewarded for their success. The competition of contemporary scholarship has given rise to a number of metrics that purportedly substantiate the success and popularity of a scholar. One of these is the previously-mentioned Impact Factor. Impact is measured in larger society also, such as the number of Facebook “friends” or the number of Twitter “followers.” In an era in which self-promotion is commonplace, scholars and scientists now have their own social media platforms. One of the most popular is ResearchGate, a free platform whereby subscribers can list their publications and make available downloadable PDFs.16 One aspect of Research Gate we find noteworthy is the RG Score, a metric which tells others the extent to which a

The world of scholarship  27 subscriber’s body of research has been cited, incorporating the reputation of the citers. We mention this because throughout this analysis we will argue that the competitive nature of scholarly and scientific work inadvertently promotes both individual and social narcissism, particularly in those predisposed toward selfaggrandizement and exploitative behavior. One recent development that may have relevance to the responsible conduct of research and scholarship is the proliferation of non-residential, online doctoral programs. Such programs are aimed at students who cannot or prefer not to pursue a traditional residential degree program. An increasing number of highly-ranked, bricks-and-mortar schools are offering non-residential doctorates. The University of Southern California, for example, offers a doctorate of social work (DSW) degree via distance education.17 But many of these distance education programs are offered by for-profit corporations, some of which maintain an open-enrollment philosophy; that is, they admit any student to their graduate programs that holds a bachelor’s degree from an accredited school. Some of the more prominent nonresidential schools offering doctoral education include Capella University, Northcentral University, Strayer University, the University of Phoenix, and Walden University. These schools are accredited, and coursework is taught by qualified instructors in either a synchronous or asynchronous format. We are not arguing that they are substandard in any way. What is yet unknown, however, is how online doctoral education might affect the way in which graduate students become socialized in the norms of scholarship. Are students trained through online systems, having less face-to-face contact with mentors than those in traditional programs, able to learn to competently and ethically handle data collection, analysis and reporting? This different type of socialization into the world of scholarship and science could have implications for how the graduates approach their work and how they internalize the norms of scholarship. There is another aspect of modern science and scholarship that we consider noteworthy. The U.S. has long been considered a melting pot of different nationalities, and nowhere is this more evident than in the academy. Countless scholars leave their countries of origin to train elsewhere. Many return to their countries of origin, but others choose to stay in the U.S. in permanent positions. One need only to walk around the campus of a large research university to get a feel for this diversity. Despite periodic waves of xenophobia, such as that prominent during the 2016 U.S. Presidential campaign, the U.S. has long prided itself in being conglomerate of cultures and peoples. Graduate and professional students moving from their country of origin frequently come from cultures different from the West, not a problem in itself. However, the norms governing how scientific and scholarly work is performed in their native country may be somewhat different from the norms of their country of adoption. For example, in Western countries, it is common for authorship credit to be awarded based on contribution. In certain Eastern countries in which senior professors are accorded special deference, a graduate or professional student may feel obligated to give his or her professor first or even sole authorship. 
So, for example, when a foreign scholar moves to the U.K., does that scholar operate under the general and specific norms of his or her

28  The world of scholarship native country, or do they adopt the norms of the culture of their new home? In Chapter 4, we will discuss the role of culture in greater detail and argue that this diversity, as beneficial as it is for scholarship and for society in general, presents a unique set of challenges for the responsible conduct of research. In sum, those who choose the world of scholarship learn early that in order to survive and thrive they must publish their scholarship in prestigious journals. At some institutions, there is an added expectation of pursuing external funding to support their scholarship. These and other norms that prevail over organized scholarship are learned during early training. A thorough understanding of the scholar’s world and its norms will help make sense of departures from those norms.

The terminology of scholarly misconduct

Before we begin with definitions of scholarly misconduct, let us first point out two main themes about such definitions. The first is that there is much resistance to thinking about such violations as violations. This resistance is central to the issue of determining the occurrence of scholarly misconduct and, moreover, determining what to do about it. Second, we propose that, although scholarly misconduct is usually not considered a criminal act, it could be so conceived. While defining scholarly misconduct as crime would engender even more resistance than already exists, we have reasons, which we offer here, for claiming that at least a portion of scholarly misconduct closely resembles conventional crime. To clarify the terminology as it applies to scholarly misconduct, we will variously use phraseology such as scholarly misconduct, academic misconduct, and research misconduct since, in the context of this book, these terms refer to similar behavior. "Scholarly," "academic," "scientific," and "research" refer to the settings and people in those settings who are engaged in high-level intellectual and scientific endeavors. The settings can be universities or other research organizations of any size, private or public. The scholars involved in misconduct may occupy any stratum from undergraduate student to those at the very apex of the university or research organization. The emphasis on research misconduct separates it from two other forms of misconduct committed by scholars. Research misconduct is distinguishable from misconduct related to teaching or supervisory activities; for instance, professors who sexually harass students or staff are committing academic misconduct unrelated to research activities. This offense occurs in the academic setting and could possibly influence the victims' careers or, more broadly, their entire future, but is not directly associated with the research process. This point can get a little fuzzy when we consider scholars who commit offenses within the realm of the scientific setting, such as inappropriate teaching practices that inhibit the transfer of scientific knowledge. Another distinction that separates scholarly, academic, or research misconduct from other forms of misconduct committed by scholars is one of legal definition. Scholars, like all people, sometimes commit offenses not related to the practice

of research and teaching, in other words, misdemeanors and felonies. Scholars who become involved in the criminal justice system because of ordinary criminal activity, from drunk driving to higher-level offenses such as embezzlement, are not part of our analysis. Medical doctors, airline pilots, and many other professionals can get into legal trouble strictly of a criminal nature and lose their licenses to practice, but their crimes have nothing to do with the skills, training, or specific purpose of their work. Some scholarly offenses have all the attributes of criminal behavior but largely have not yet been specified as crimes. This has been true, in the past, of street crime and white-collar crime as well. It is a recent development, dating from the 1960s, that we have recognized white-collar crime, including governmental or corporate crime, as crime. Many garden-variety street crimes were not always crimes until they were labeled as such. For example, once it was legal to own human beings and mistreat them in any manner the owner chose. Once it was legal to harm nonhuman animals, but now there are statutes against it. Once it was legal for a husband to rape his wife, and now it is not. The list of acts that were not illegal until they were defined as illegal is lengthy. We argue that serious scholarly offenses may, at some stage of legal development, become crime, subject to criminal investigation and prosecution. To reflect for a moment, it is astonishing that many forms of scholarly misconduct have escaped scrutiny as legally-defined crimes. When our colleagues learn of scholarly offenses such as blatant plagiarism or the theft of ideas, they recoil in horror. They don't disbelieve that these things happen, but their reaction is a mix of anger, disgust, and how-can-this-be? We pause at this juncture to wonder: if the audience believes these true stories, and if they are negatively affected by the stories, how can they fail to view these actions as socially and personally harmful and therefore criminal? Harm came to innocent victims. Commonly, an event that does physical, emotional, or financial harm to innocent victims is criminal. Scholars may indeed view these actions as criminal but, without an official designation as criminal, these actions are not defined as criminal. This last statement does not make the action any less criminal in experience.

The ambiguous definition of scholarly misconduct

Plenty of scholars refuse to examine scholarly misconduct carefully, so much so that we lack firm and clear definitions of what constitutes it. Since we have no clear and firm definitions, we do not feel comfortable reporting what we suspect may be scholarly misconduct and, therefore, we lack adequate systems or procedures for responding to it. This lack of agreement, in turn, has allowed scholars to throw up their hands in seeming despair and admit defeat without even considering whether scholarly misconduct could be defined as wrongdoing or even crime. Given the lack of agreement about definitions and suitable sanctions, the scholarly community has been mostly forgiven for not doing anything about scholarly misconduct. The lack of clarity and the absence of a lexicon to describe scholarly misdeeds may or may not be deliberate.

However, these gaps in understanding, let alone discourse on the topic, may sustain resistance to definition and sanctioning. How do we know scholarly misconduct when we see it? This is the question asked by Rebecca Dresser, a legal expert on the topic of research misconduct.18 A clear answer remains elusive, but discussions about this question do offer the sources of and reasons for vague definitions of scholarly misconduct. For instance, Dresser examines definitions of scientific misconduct as laid out by six sources – the Public Health Service, the Office of Science and Technology Policy, the Department of Health and Human Services (DHHS), the National Science Foundation, the National Academy of Sciences, and the Federal Policy on Research Misconduct – all of which are so vague as to be almost useless. Terminology such as "honest error" fails to differentiate between scientific error and misconduct; this terminology further seems to excuse misconduct if it is the result of "honest error." To define scientific misconduct as "fabrication, falsification, plagiarism, or other practices" leaves a lot of room to speculate on what may comprise "other practices." Definitions of this type, if they can be called definitions, are so vague that much questionable behavior is allowed to slip through the cracks. The DHHS Commission on Research Integrity issued the general statement that a researcher's duty is to "be truthful and fair in the conduct of research and the dissemination of its results." This statement was intended to provide boundaries for misconduct when it does no such thing. Vague definitions such as these prevent meaningful governmental regulation and, we would argue, keep other regulatory bodies such as university disciplinary boards from defining and responding to misconduct. Since that is the case, the definitions of misconduct clearly require tightening.

Plagiarism, fabrication, and falsification

Fabrication, falsification, and plagiarism (FF&P) are the "big three" forms of scholarly misconduct discussed in the academic literature. We will describe each of them here and throughout, and while there are numerous forms of maladaptive scholarly behavior, these are the focus of our analysis.

Plagiarism

There is a lot of ambiguity and confusion as to what constitutes acts of scholarly misconduct. Even the experts who discuss it do not always agree on the simplest matters, such as the definition of plagiarism. Plagiarism is often considered a "significant violation of truthfulness and involves stealing intellectual property or taking credit for other individuals' work."19 Most scholars, legal and other, say that it is verbatim copying of another's work. Yes, that definition makes it easy to know it when you see it, but it does not encompass the theft of ideas. Mathieu Bouville, for example, contends that plagiarism is not merely the theft of words but can also refer to the theft of ideas.20 To that last remark, scholars, especially legal scholars,

The world of scholarship  31 would point out that ideas are not copyrightable and therefore cannot be proven to have been stolen nor can the victim take legal action against the offender who has appropriated the idea. We can point to some distinguishing features of plagiarism as a form of theft by placing the features on a continuum. At one end, we find plagiarism’s most extreme and obvious form, verbatim copying, with managed copying in the middle, and theft of ideas, citations, and framework at the other end. An example of the extreme is the outright copying of an entire book on human anatomy by surgeon and anatomist William Cowper in the late 1600s.21 Our point is that this continuum of intellectual wrongdoing exists regardless of the existence of ethical standards or laws prohibiting them. Plagiarism is the verbatim copying of another’s work. However, as we saw in the Preface, managed copying and substantial similarity are two legal definitions that negate the need to demonstrate word-for-word copying as plagiarism. Managed copying, as Bonnie noted in the Preface, is paraphrasing, with the copier using the work of the original but saying it in a slightly different way. The test for substantial similarity is one of having two or more people examine the works, the copy and the original, and if the examiner cannot tell the difference between the content of the two documents because the content looks basically the same and the overlap is substantial, there is sufficient overlap to say the original work was plagiarized. So plagiarism is not necessarily the common understanding of verbatim copying. Indeed, some writers on the topic go so far as to say that plagiarism is the theft of ideas, not just the verbatim theft of words. This brings up an interesting question about motivations of the offender versus the technological aspect of stealing others’ previously-published or unpublished work. In Mark Davis’s piece on defining the legal concept mala in se, he makes the argument that offenses that are wrong in and of themselves are equally wrong as those that are wrong simply because they are prohibited.22 In other words, the question is whether plagiarism, and perhaps other examples of scholarly misconduct, are mala in se – inherently wrong. If so, we may ask whether the legal definitions of plagiarism are deliberately vague so that it cannot be defined as mala prohibita (wrong because prohibited). In light of the present argument about plagiarism a form of theft, suffice it to say that our perspective “posits that humans are socialized to believe in fairness in their dealings with other humans.”23 Taking something that is not yours or your own creation is, of course, unfair, wrong and it calls for some brand of justice. Equity theory posits that offenses mala in se violate the norm of reciprocity by yielding for the perpetrator an outcome to which he or she is not entitled.… Theft, if classified as a mala prohibitum, is not fair because the thief has derived an outcome . . . without having made the appropriate inputs.24 Our point in bringing up equity theory and mala in se versus mala prohibita offenses, in reference to plagiarism, is to state that plagiarism is a form of

32  The world of scholarship theft – gaining an outcome, such as a publication, without appropriate input – that is just as wrong whether it is prohibited by law or not. This is a concept we will revisit throughout this analysis. Like most offenses, criminally defined or otherwise, plagiarism occurs because it pays off for the offender. The likelihood of payoff is increased because of the lack of clarity of definition and the restrictedness of definition. It is a form of “literary theft” that presents “as new and original an idea or product derived from an existing source.”25 It can cause distress among well-established victims as well as from “hard-working students who see fellow students willing to breach ethical codes only to be rewarded for their behavior.”26 Plagiarism overlaps with intellectual property theft and copyright violations when the original work is patented or published. We would argue that plagiarism includes the theft of ideas of work products even if unpublished; however, the patenting or copyrighting process does strengthen, even if minimally, the legal legs to stand on. As expected, scholars do rely on the work of others when publishing their own work. It is also expected that scholars, though relying on previous work, add new, creative, and original material. In short, we should adhere to rules of publishing etiquette and avoid fraudulent publication, duplicate publication, and plagiarism.27 Fraudulent publication can include plagiarism, fabrication, and falsification, discussed below. In our view, plagiarism and related forms of literary pilfering represent the taking of material one doesn’t have a right to take. Regardless of its status in the law, plagiarism is exploitation, that is, an unfair exchange in which the plagiarist benefits at the expense of the creator. When we refer to idea theft as the “kidnapping of one’s brainchild,” it is meant to convey the indescribable sense of creation and ownership one feels for an original idea or piece of work. Theft of one’s “baby,” therefore, constitutes an unrecoverable loss. We will come back to this notion when we argue that most people outside the worlds of science and scholarship can identify with this because we all subscribe to the same norms of fairness and justice. Fabrication and falsification Plagiarism can be viewed as a less serious offense compared to fabrication and falsification since plagiarism, though a scholarly offense, is theft of intellectual property which presumably is well-conducted and well-written and therefore has value. In short, it refers to republishing someone else’s original work and presenting it as one’s own work. If the original and the replicated work are both comprehensive and well-done, the harm is only in the theft. Fabricating data, by contrast, is making up nonexistent data from whole cloth and reporting it as though valid and existent. The harm here is worse than plagiarism because not only is the information misleading to the general and academic publics, it has no value whatsoever. Worse, since science builds on itself, scientists may rely on fabricated data as the basis for future work. If the foundation lacks structural integrity, then

whatever rests on it is doomed to collapse. Indeed, published studies that later have been retracted have been cited by numerous subsequent studies, resulting in a domino effect of damage to the scholarly record. A baffling and startling example of fabrication is the case of a Dutch social psychologist who made up entire studies and wrote the results centered around what he assumed the public had already concluded about human behavior.28 He and his students did conduct social psychology experiments, but to enhance his research findings, he concocted "sexy" results that scholarly journals would find attractive. "Sexy" is not a requirement of the scholarly world, but of the popular world in which we all live. Regardless, however sexy they might have been, the "results" were not truthful. Our fellow scientists react to such stories with disbelief, followed by anger, followed by a dismissal of the perpetrator as a "rotten egg in an otherwise-honest enterprise."29 On the face of it, it is hard to believe that respected scholars would engage in misconduct of such extremes. As for the "outlier" supposition, the comfortable assumption that such bad-apple scientists are rare is, unfortunately, untrue. They are "not outliers so much as one end on a continuum of dishonest behaviors" from cherry-picking data to support a particular hypothesis to outright fabrication.30 Falsifying data is also a serious breach since it refers to "massaging" data to represent something that was not actually found. Often, this form of misconduct occurs because the scientist wants to make a point that is not supported by the data. One common form of falsification is the modification of graphics designed to illustrate research findings in biomedical and other sciences. Retraction Watch reports on cases wherein results have been modified or duplicated from ones used in previously published papers. Astute observers uncover these attempts to falsify data, and the offenders are exposed. Fabrication and falsification can also express themselves in the presentation of a scholar's credentials. Pat Palmer, whose case is discussed later in this book, listed degrees on her CV that she hadn't earned. On a federal grant application, she listed herself as a coauthor of several academic publications in which she had played no role whatsoever. Again, those who misrepresent their credentials in order to gain a job, promotion, grant or other career objective are unfairly taking advantage of the system. While we focus a great deal of attention on the role of the individual in this analysis, our sociological training demands that we give proper due to the social contexts in which scholarship and its maladaptations take place. We know that some individuals participating in the peer review process deviate from established procedures for the objective and anonymous evaluation of research. Editors can also deviate from standard practice, as in a case that came to light involving apparent collusion among editors to ensure the citation of articles in the journals they edited. The practice involved suggesting authors cite certain works to improve the papers they had submitted. But in doing so, these editors put their own selfish interests – having a higher Impact Factor for their journal – above the fair and objective evaluation of submissions. This shows that the profit motive – often associated with white-collar and corporate crime – also infects the world of scholarship and science.

Questionable and not-so-questionable research practices

While the emphasis in our analysis is on FF&P, there are other, arguably less serious scholarly behaviors that have been labeled "questionable research practices" (QRPs).31 The term embraces a wide range of practices, many of which fall into a vast gray area between FF&P and acceptable research practice. They include authorship practices such as awarding "gift authorship" to coauthors whose contributions do not warrant authorship, as well as failing to extend authorship to those who have contributed substantively to a scholarly work. It is quite probable that QRPs occur far more often than FF&P, and they therefore deserve greater attention from the scholarly community. Brian Martinson, a social scientist who has conducted several studies of research misconduct, has argued that a number of the so-called questionable research practices are not really questionable.32 That is, there is more agreement about their wrongness and the harm they do than the term "questionable" implies. In recognition of the harm these lesser offenses do to the worlds of scholarship and science, the National Academy of Sciences suggested that the term "detrimental research practices" is more appropriate.33 There are gradations of seriousness among these behaviors, and thus there is not universal agreement about which ones warrant official sanctions. This ambiguity makes the adoption of universal standards much more difficult. One can make a strong case that detrimental research practices (DRPs) are far more prevalent than FF&P and therefore deserve greater attention. And we wouldn't argue with this. But in the case we make – that certain forms of scholarly misconduct can and should be treated as crime – we do not include most DRPs. Most of these truly are violations of Mertonian technical norms that vary with discipline. They are important, to be sure, but they are not part of our analysis, which focuses on serious deviations from more general social norms.

Official responses to scholarly misconduct

Cases of scientific misconduct have historically been handled administratively, primarily by the institutions in which they occur. In response to a number of notorious cases of research misconduct at some of the most prestigious institutions in the U.S., a number of recommendations were made to respond to the growing threat of research misconduct. Among these recommendations was that there be a federal agency whose role was to investigate instances of research misconduct and to promote the responsible conduct of research. In 1989, the then newly-created federal Office of Scientific Integrity (OSI) began to investigate alleged cases of scientific misconduct involving research sponsored by the Public Health Service, including the various National Institutes of Health. In 1992, the OSI was combined with the Office of Scientific Integrity Review to form the current Office of Research Integrity (ORI) within the Office for Public Health and Science, a subdivision of the Department of Health and Human Services. Since its inception, the ORI has, as part of its charge, the

The world of scholarship  35 investigation of certain alleged cases of research misconduct. The ORI employs a staff of scientists from fields that have included biochemistry, molecular biology, chemical engineering, experimental psychology and virology. These ORI scientists serve as detectives of sorts, sifting through scientific evidence, sometimes attempting to replicate questioned research findings, all in an effort to ascertain whether the respondent did in fact engage in research misconduct. It is easy to see how ORI and its predecessor, OSI, earned nicknames such as “data police” and “data FBI.”34 The ORI provides a number of services other than investigation and administrative action. ORI also promotes the responsible conduct of research through numerous educational and training initiatives, and sponsors research on research integrity. In the early 2000s, the ORI, with funding support from several other NIH agencies, embarked on a program of extramural research. We consider this development important in RCR history in that this program facilitated a number of empirical studies where previously the research record was dismayingly thin. In Chapter 8 we offer a number of suggestions for future research on these topics. The National Science Foundation, the other major sponsor of scientific research in the U.S., has an Office of Inspector General (OIG) which functions similarly to the ORI. The OIG “is responsible for promoting efficiency and effectiveness in agency programs and for preventing and detecting fraud, waste, and abuse.”35 In addition to investigating cases of alleged research misconduct, it recommends sanctions for those found guilty. Much less is known about NSF’s OIG, perhaps because it deals with fewer instances of alleged misconduct, many of which involve less money than biomedical research projects. The OIG is also less transparent in that it offers less information on its website on completed misconduct cases than does ORI. Other countries operate somewhat differently from the U.S. In the United Kingdom, for example, research misconduct by physician-scientists is governed by the General Medical Council. Those who are found guilty of serious research misconduct can be “struck off” the medical registry of licensed physicians. This eliminates the possibility of an alternate career in private medical practice for the doctor who seriously misbehaves in the role of scientist. There is a range of administrative consequences that face those found guilty of FF&P. Perhaps the best known of these is debarment, which precludes the respondent from applying for federal funds for a specified period of time, generally up to ten years. This is a severe sanction for researchers in the biomedical sciences because, as we have noted, research in these allied fields requires huge sums of money. If a scientist is no longer eligible to apply for necessary federal funding, his or her career is at minimum on hold, and in some cases, it may be over. Debarment is less an issue and therefore perhaps less worrisome to scholars in the social and behavioral sciences, as well as those in the humanities, many of whom can produce scholarship without the support of large federal grants. The ORI also frequently requires those found guilty of misconduct to work under the supervision of another scientist for a specified period of time. Respondents, the term ORI uses to refer to those accused of research misconduct, may

36  The world of scholarship also be prohibited from serving on federal grant review committees for a period of time. While these administrative sanctions may seem inconsequential to the lay person, they in fact can stall or bring to an end what otherwise might have been a promising career in scientific research. When an individual so sanctioned for scientific misconduct applies for future federal funds, they must indicate their status of being debarred. Universities, hospitals and other organizations in which research misconduct occurs can and do conduct their own investigations and impose their own punishments independent of any federal administrative actions. At Arizona State University, for example, Matthew Whitaker, a member of the history faculty found to have plagiarized part of a published book, was demoted by the university and forced to accept a pay cut. It turns out that Matthew Whitaker’s consequences for his misconduct went well beyond his university position. His consulting company had a $268,000 contract with the city of Phoenix, Arizona to provide cultural awareness training to police.36 Upon learning of his alleged plagiarism woes at ASU, the city cancelled the contract. Thus, betraying the trust as a scholar may well cause collateral damage to one’s life outside academe. Whitaker later agreed to resign from ASU. As serious as FF&P are, they are not always met with universities’ most severe responses. In the case of Rusi P. Taleyarkhan of Purdue University, the provost stripped the accused of a named professorship along with the $25,000 stipend associated with that distinction.37 Curiously, the university also stipulated that Taleyarkhan could not serve as a thesis adviser for at least three years. University of California, Riverside biochemist, Frank Sauer, is another case in point. Sauer was found to have engaged in either fabrication or falsification of images in a number of scientific papers. The university’s investigative committee recommended several forms of punishment including not being able to publish a paper for five years and not being eligible for merit increases for the same time period.38 It is not clear how these punishments fit the offenses. There were other recommended sanctions, but the committee did not recommend dismissal. What makes their official recommendations hard to understand is that Sauer had committed numerous scholarly offenses over a period of 16 years. In criminal terms, he was a serial offender. There have been cases wherein those found guilty of engaging in misconduct while in graduate or professional training have had their degrees revoked. Milena Penkowa, a neuroscientist who trained at the University of Copenhagen, was found to have attempted a coverup of irregularities related to her dissertation research.39 Not only was her degree revoked, she was given a suspended sentence in Copenhagen City Court. Revocations of degrees occur rarely, but it gives colleges and universities a unique stick to hold over scholars in training who consider taking ethical shortcuts. Indeed, this is far worse than debarment because without the doctorate, disgraced scholars are forced to find another line of work. In some cases, it is possible to salvage one’s academic career following research misconduct. John Smith was a professor in the medical school of a large research university.40 Two decades after receiving his highest degree, and after having been

promoted to full professor in his basic science department, Professor Smith was accused of committing plagiarism in a grant proposal. His university conducted an investigation and concluded that he had indeed committed plagiarism. Smith admitted to the conduct, even though he explained that it was not his intention to misappropriate the written work of another. As a punishment for his offense, Professor Smith was assessed a fine by his university, which he paid. He was permitted to retain his position and rank. After the finding of scientific misconduct, Dr. Smith continued to have a productive career in academic science. A MEDLINE search revealed that subsequent to the year of his misconduct finding, he was co-author of numerous papers in his field of specialization, and he and his team of scientists received subsequent NIH support. The case of Dr. Smith may not be typical, but it demonstrates that a remorseful, one-time offender can remain in the scientific community as a contributing member. We will come back to the notion of reclaiming careers in scholarship and science. Another example of career recovery is that of William Summerlin, the infamous cancer researcher who enshrined himself in the scientific hall of shame by inking a skin patch on a mouse as a way of "proving" evidence that didn't exist. Summerlin, who had trained to be a skin specialist, indicates in his online CV that at one time he specialized in studying mice. His CV, which notes his background in medical research but makes no mention of his checkered past, goes on to read:

After several years conducting research in Minnesota and New York, Dr. William Summerlin relocated to Louisiana and began practice in the New Orleans area, where he also taught at Tulane University Medical School. A decade later, he relocated to Arkansas and founded his private practice. Dr. William Summerlin dedicates much of his free time to the support of charitable organizations and community groups. He serves as the head of his local Rotary Club group, serves on the board of his local YMCA and Chamber of Commerce, and works regularly with the United Way.41

Summerlin's case appears to have had a better-than-anticipated ending. Not only was he able to have a career as a physician, but it also appears he was active in community and civic affairs. What we take away from Summerlin's example is the notion that some of those who engage in scholarly misconduct, like many of their counterparts who commit more traditional crime, can transform themselves into productive members of society. This is an important conclusion inasmuch as we will argue later that, as harmful as some forms of scholarly misconduct are, we agree with Michael Farthing that there should be room in the community of scholars for redemption and forgiveness.42 In response to well-publicized cases of research abuses and misconduct, the community of scholarship and science embarked on a formal program of providing instruction on the responsible conduct of research. Most researchers at universities and other research institutions must undergo training such as that provided by the Collaborative Institutional Training Initiative (CITI).

CITI training offers numerous modules on such topics as the appropriate treatment of human subjects, animal subjects, research misconduct, and other RCR topics. There is mixed evidence on whether RCR training such as that offered through CITI is effective in preventing research misconduct and other detrimental research practices. It is our position that RCR training is both necessary and beneficial. We do not, however, consider such training effective in preventing the more egregious forms of misconduct such as idea theft, plagiarism and fabrication. Donald Kornfeld, professor emeritus of psychiatry at Columbia University, put it this way:

We should not be surprised that Responsible Conduct of Research courses do not influence the behavior of trainees. Research misconduct – fabrication, falsification, and plagiarism – are the academic equivalents of lying, cheating, and stealing. Ethical standards prohibiting such behavior are established long before students begin graduate training in science.43

Finally, scholarly and professional associations also serve as agents of social control with respect to research misconduct. Most scientific societies now have statements on ethical research behavior including but not limited to authorship practices, the appropriate handling of data, conflicts of interest and the treatment of human and animal subjects. While many of these ethical codes do not specify precise consequences for specific violations, one can infer that the association has the wherewithal to censure members found guilty of such behavior, including expulsion from the group. Some scholarly associations have decided not to assume an investigatory or prosecutorial role in these cases. The limited array of consequences available to scholarly associations makes them a much less effective agent of social control in the ethical practice of scholarship. And as we will discuss later, some scholarly associations try to avoid any formal involvement in promulgating or enforcing professional ethics. Last but not least, the non-profit sector has made noteworthy inroads in the social control of scientific work. Retraction Watch, mentioned earlier, is a blog which serves as a vehicle for investigating and reporting suspicious studies. The brainchild of journalists Adam Marcus and Ivan Oransky, the latter of whom also holds an MD degree, Retraction Watch operates with the support of subscribers as well as such prominent sponsors as the John D. and Catherine T. MacArthur Foundation.44 As the name suggests, Retraction Watch follows and highlights scholarly papers that have been retracted. Retraction Watch is intriguing as a modern source of social control. That is, in publicly examining alleged cases of scholarly misconduct, Retraction Watch draws back the curtain on what previously has largely been kept hidden. As a nonprofit, it is not beholden to government funding agencies or subject to their constraints. And inasmuch as its staff members are journalists, they enjoy the protection of the First Amendment of the U.S. Constitution, which guarantees freedom of expression. In Chapter 9 we will revisit the role of the so-called Third Sector in preventing and controlling scholarly misconduct.


2.3  Administrative sanctions for scholarly misconduct The following is ORI’s published disposition of a fabrication/falsification case against Dr. Maria C.P. Geraedts, University of Maryland, Baltimore.45 Dr. Geraedts has entered into a Voluntary Settlement Agreement with ORI and UMB, in which she voluntarily agreed to the administrative actions set forth below. The administrative actions are required for three (3) years beginning on the date of Dr. Geraedts employment in a position in which she receives or applies for PHS support on or after the effective date of the Agreement (September 22, 2015). If the Respondent has not obtained employment in a research position in which she receives or applies for PHS support within one (1) year of the effective date of the Agreement, the administrative actions set forth below will no longer apply. Dr. Geraedts has voluntarily agreed: (1) to have her research supervised as described below and notify her employer(s)/ institution(s) of the terms of this supervision; Respondent agreed that prior to the submission of an application for PHS support for a research project on which her participation is proposed and prior to her participation in any capacity on PHS-supported research, Respondent shall ensure that a plan for supervision of her duties is submitted to ORI for approval; the supervision plan must be designed to ensure the scientific integrity of her research contribution; Respondent agreed that she will not participate in any PHS-supported research until such a supervision plan is submitted to and approved by ORI; Respondent agreed to maintain responsibility for compliance with the agreed upon supervision plan; (2) that any institution employing her shall submit in conjunction with each application for PHS funds, or report, manuscript, or abstract involving PHS-supported research in which Respondent is involved, a certification to ORI that the data provided by Respondent are based on actual experiments or are otherwise legitimately derived, and that the data, procedures, and methodology are accurately reported in the application, report, manuscript, or abstract; and (3) to exclude herself voluntarily from serving in any advisory capacity to PHS including, but not limited to, service on any PHS advisory committee, board, and/or peer review committee, or as a consultant for period of three (3) years beginning on September 22, 2015.

James DuBois of Washington University in St. Louis and his team embarked on a novel program designed to help wayward scholars recover their careers through a sort of rehabilitation. Supported by funding from the Office of Research Integrity and other sponsors, Professor DuBois, previously at Saint Louis University,

developed the Professionalism & Integrity in Research (P. I.) Program with advice from an expert developmental team and consultants.46 The P. I. Program accepts applications from those who have been accused or found guilty of research misconduct. It is designed to help participants better navigate the complex world in which difficult decisions are made during the course of research. Whether rehabilitative efforts such as this are effective remains to be seen. It is impossible to know what trajectory the participants would have taken had they not participated. Consistent with our view of serious scholarly misconduct as crime, we support such efforts to help remorseful offenders rejoin the community. In these and other ways, the scholarly community attempts to address serious violations of scientific norms. A major caveat here, one we have mentioned and will discuss more fully elsewhere, is the suspected underreporting of misconduct. This underreporting speaks to fear of reprisal on the part of victims, the relative power of many scholarly offenders, and the apparent unwillingness of the scientific professions to recognize the existence and seriousness of scholarly misconduct. Moreover, is the posture of intolerance toward serious violations consistent with how society addresses serious violations of other norms? An examination of how we process crime may help to illustrate alternative ways of processing some cases of research misconduct. We will discuss these in greater detail in Chapter 6.

Conclusions

Scholarship within academic institutions has become a complex and competitive enterprise. Not only does it require a substantial investment of time, money and energy to acquire the necessary credentials, but the race continues well after graduate school as scholars compete for positions, journal space, research funding, job security and awards. These aspects of academic life may be resistant to change. The world of organized scholarship and science defines what is appropriate and what falls outside standards of acceptability. Behaviors such as the fabrication and falsification of data as well as plagiarism stand at the top of proscribed offenses. Less clear are the detrimental research practices, some of which fall in a gray area between acceptable and unacceptable. This ambiguity about what is and what is not wrong provides opportunities for would-be offenders to commit scholarly offenses and deny that they have. And as we will see, this ambiguity makes it more difficult for agents of social control such as universities and scholarly organizations to bring about positive change. Behaviors including FF&P and other detrimental practices are said to violate the various norms of science as outlined by sociologist of science Robert K. Merton. But these offenses also occur outside the world of scholarship and are perpetrated by journalists, fiction writers, corporate executives, high school and college students, and others who have no obligation to be familiar with, or to abide by, Mertonian norms. This suggests we must come up with another rationale for why these behaviors are so offensive in so many different realms.

Let us restate two main points about such definitions. The first is that there is much resistance to thinking about such breaches as violations. This resistance is central to the issue of determining the occurrence of scholarly misconduct and determining what to do about it. Second, we propose that, although rarely is scholarly misconduct considered a criminal act, it could be so conceived. While defining scholarly misconduct as crime would engender even more resistance than already exists, we have reasons for claiming that at least a portion of scholarly misconduct does resemble crime. There is a wide range of administrative consequences facing the individual who engages in scholarly misconduct. They involve not only the organization in which the offending scholar works, but also federal agencies if the scholarship in question was federally supported. If we assume that rationales for administrative sanctions parallel those in the criminal justice system, we must conclude that current practices come up short. Inasmuch as there continue to be cases of research misconduct that cause substantial harm, we have to conclude that administrative consequences carry little deterrent value. Despite well-intentioned efforts to rehabilitate a handful of those who have been found guilty of research misconduct, we don't know if they would have ceased offending without such efforts. Some offenders occupy prominent positions at prestigious universities, and have been able to offend and get away with it. It is likely that the offender in such cases could successfully fight any attempts at sanction. Why are scholars and scientists immune from the processes of justice that others face for wrongful behavior? Why is the theft of an idea or paper deemed less serious than the theft of a lawnmower or a laptop computer?

Notes 1 Independent scholars are those not employed full-time as scholars, but who engage in scholarship part-time or in retirement. In the U.S., the National Coalition of Independent Scholars is a non-profit association that supports independent scholarship. In Canada, the Canadian Academy of Independent Scholars serves a similar purpose. 2 Available online at: www.usnews.com/best-graduate-schools (accessed 7-26-2017). 3 Columbia University Record (1994). “Merton Awarded Nation’s Highest Science Honor.” September  16. Available online at: www.columbia.edu/cu/record/archives/ vol20/vol20_iss2/record2002.13.html (accessed 7-11-2017). 4 Merton, Robert K. (1942). “The normative structure of science.” In The Sociology of Science: Theoretical and Empirical Investigations. Chicago, IL: University of Chicago Press. 5 Zuckerman, Harriet (1977). “Deviant behavior and social control in science,” in Edward Sagarin (ed.), Deviance and Social Change: Sage Annual Reviews of Studies in Deviance, Volume 1. Beverly Hills, CA: Sage Publications, pp. 87–138. 6 Available online at: www.asc41.com/dir3/ads/umaine1115.pdf (accessed 12-29-2015). 7 See Little, Roderick J. A. and Rubin, Donald B. “The Analysis of Social Science Data with Missing Values.” Sociological Methods & Research, 18: 292–326 for an overview of the decisions facing the researcher whose data has missing values. 8 Carey, B. (2015). New York Times. 6–15–15. 9 Kleinig, John (1978). “Crime and the Concept of Harm.” American Philosophical Quarterly, 15: 27–36.

42  The world of scholarship 10 Institute of Medicine/National Research Council (2002). Integrity in Scientific Research. Washington, DC: National Academies Press. 11 Merton, Robert K. (1938). “Social structure and anomie.” American Sociological Review, 3: 672–682. 12 Federal Trade Commission (2016). “FTC Charges Academic Journal Publisher OMICS Group Deceived Researchers.” Available online at: www.ftc.gov/news-events/pressreleases/2016/08/ftc-charges-academic-journal-publisher-omics-group-deceived (accessed 7-12-2017). 13 See, for example, Musambira, G., Collins, S., Brown, T. & Voss, K. (2012). “From ‘Publish or Perish’ to ‘Grant or Perish’: Examining grantsmanship in communication and the pressures on communication faculty to procure external funding for research.” Journalism and Mass Communication Educator, 67: 234–251. 14 For a definition of F&A costs, see http://osp.osu.edu/administration/managing-expenditures/sponsored-programs-costing-policy/ (accessed 7-29-2017). 15 Available online at: http://osp.osu.edu/development/budgets/fa-costs/ (accessed 7-29-2017). 16 Scott, Mark (2017). “A  Facebook-shift in how science is shared.” New York Times, February 28. Available online at: www.nytimes.com/2017/02/28/technology/scienceresearch-researchgate-gates-goldman.html (accessed 1-2-2018). 17 Available online at: https://msw.usc.edu/academic/doctor-social-work/ (accessed 730-2017). 18 Dresser, Rebecca (2001). “Defining Research Misconduct: Will We Know It When We See It?” Hastings Center Report: 31–33. 19 King, C. R. (2001). “Ethical issues in writing and publishing.” Clinical Journal of Oncology Nursing, 5: 19–23. See also Berg, A. O. (1990). “Misconduct in science: Does family medicine have a problem?” Family Medicine, 22(2): 137–142; Berk, R. N. (1991). “Is Plagiarism ever Insignificant?” American Journal of Roentgenology, 157: 614; Malone, R. E. (1998). “Ethical Issues in Publication of Research.” Journal of Emergency Nursing, 24: 281–283; Rogers, B. (1993). “Using the words and works of others: A commentary.” AAOHN Journal, 41(1): 46–49. 20 Bouville, Mathieu (2008). “Plagiarism: Words and ideas.” Science and Engineering Ethics, 14: 311–322. 21 Newby, Kris (2017). “A  peek at the scandalous history of anatomical illustration.” SCOPE, May  11 [blog]. Available online at: http://scopeblog.stanford.edu/?s=anato mical+illustration (accessed 8-8-2017). 22 Davis, Mark S. (2006). Crimes mala in se: An equity-based definition. Criminal Justice Policy Review, 17: 270–289. 23 Ibid., p. 277. 24 Ibid., p. 278. 25 Merriam-Webster Online Dictionary entry for “plagiarize.” Available online at: www. merriam-webster.com/dictionary/plagiarize (accessed 7-23-2017). 26 Martin, Daniel F., Rao, Asha and Sloan, Lloyd R. (2009). “Plagiarism, Integrity, and Workplace Deviance: A Criterion Study.” Ethics and Behavior, 19: 36–50. 27 King, C. R., McQuire, D. B., Longman, A. J. and Carroll-Johnson, R. M. (1997). “Peer Review, Authorship, Ethics, and Conflict of Interest.” Image: Journal of Nursing Scholarship, 29: 163–167. 28 Bhattacharjee, Y. (2013). “The Mind of a Con Man.” New York Times Magazine, April 28, pp. 44–52. 29 Ibid., p. 48. 30 Ibid., p. 48. 31 One early discussion of QRPs can be found in Steneck, N. H. (2003). “The role of professional societies in promoting integrity in research.” American Journal of Health Behavior, 27 [Suppl 3], S239–S247.

The world of scholarship  43 32 Martinson, Brian. (2015). Personal communication at American Psychiatric Association meetings in May, Toronto, CA. 33 NAS report – new term for QRPs.Committee on Responsible Science. (2017). Fostering Integrity in Research. Washington, DC: The National Academies Press. 34 Leary, Warren E. (1991). “On the trail of research misconduct, ‘science police’ take the limelight.” New York Times, March  24. Available online at: www.nytimes. com/1991/03/25/us/on-the-trail-of-research-misconduct-science-police-take-the-limelight.html (accessed 8-7-2017). 35 National Science Foundation (2016). Office of Inspector General. Available online at: www.nsf.gov/oig/ (accessed 10-30-2016). 36 Associated Press (2015). Arizona Daily Star, September 18, p. A8. 37 Chang, Kenneth (2008). “Purdue, citing research misconduct, punishes scientist.” New York Times, August  27. Available online at: www.nytimes.com/2008/08/28/ science/28purdue.html (accessed 8-7-2017). 38 Marcus, Adam (2017). “What a report into scientific misconduct reveals: The case of Frank Sauer.” RetractionWatch, July  19. Available online at: http://retractionwatch. com/2017/07/19/report-scientific-misconduct-reveals-case-frank-sauer/ (accessed 8-8-2017). 39 McCook, Alison (2017). “Copenhagen revokes degree of controversial neuroscientist Milena Penkowa.” RetractionWatch, September 12. Available online at: http://retractionwatch.com/2017/09/12/copenhagen-revokes-degree-controversial-neuroscientistmilena-penkowa/ (accessed 1-2-2018). 40 John Smith is the pseudonym we have given an individual who was actually found guilty of scientific misconduct. Even though information about him and his case is available from public sources, we see no worthwhile purpose in divulging his identity. 41 VisualCV.com. Available online at: www.visualcv.com/williamsummerlin (accessed 7-11-2017). 42 Farthing, Michael J. G. (2014). “Research misconduct: A grand global challenge for the 21st Century.” Journal of Gastroenterology and Hepatology, 29: 422–427. 43 Kornfeld, Donald S. (2012). “Perspective: Research misconduct: The search for a remedy.” Academic Medicine, 87: 877–882 44 Available online at: http://retractionwatch.com/ (accessed 7-30-2017). 45 Available online at: https://ori.hhs.gov/content/case-summary-geraedts-maria-cp (accessed 12-29-2015). 46 See http://integrityprogram.org/

3 Structural and organizational causes of scholarly misconduct

When an understudied social problem such as scholarly misconduct comes to the fore, most often through media accounts, it is first discussed by members of the profession. They discuss the forms it takes. They ask questions about the likely incidence and prevalence of the problem, and they wonder how serious it is compared with other social problems. And they offer armchair explanations for why the problem exists. Regardless of whether the explanation for scholarly misconduct is social, psychological, or organizational, the curiosity and accretion of knowledge to understand it on some level, whether in casual conversations or formalized research, seem to follow a similar course.1 When in 1974 Sloan-Kettering cancer researcher William Summerlin was exposed for having inked a patch of skin on a mouse in a vain attempt to demonstrate the success of a transplantation graft, a number of commentators offered reasons for why he had committed such a shocking violation of scientific trust. In one of the earlier surveys of researchers about scientific fraud, June Price Tangney, then a psychology graduate student at UCLA, asked her respondents to speculate about factors that contributed to scientific fraud.2 Of the 245 researchers in the physical, biological, behavioral, and social sciences who responded to Tangney’s questionnaire (representing a response rate of 22%), a little over half noted job security and promotion as major motivators for research misconduct, and 56% of the respondents reported that the desire for fame and recognition also factored into the etiology equation. Other reasons cited by her respondents were a belief or desire to promote a theory and laziness. The desire for fame and recognition could be interpreted as structural inasmuch as academe in general makes these possible through a series of incentives and pressures. Tangney’s preliminary research reinforced some of the early armchair impressions. In another attempt to measure the incidence of fraud, Jonas Ranstam and his colleagues conducted an international survey of biostatisticians and asked respondents about the probable rationale explaining why medical fraud occurred.3 A  majority of the respondents believed that career aspirations and a desire for power, not financial benefit, were the main motives for misconduct. We interpret “career aspirations” as socially desirable ends that are shared by many. Findings about purported causes from such studies should be interpreted with caution since they only report on researchers’ perceptions of why others engage

in misconduct, not on actual cases of research misconduct. Unless it is verified that those reporting have themselves engaged in research misconduct and are responding based on firsthand knowledge, examining only the perceptions of researchers in general tells us little about why research misconduct occurs. William James noted that "scientists seem to lack curiosity about what drives fraudsters."4 Twenty years later, Fanelli indicated that we need more research to understand the causes of research misconduct.5 We agree.

The structural causes of scholarly crime

In Chapter 2 we discussed the contemporary climate of academic research. Its features, such as the heavy emphasis on publishing papers in high-quality journals and getting grants, are part of the structure of the world of scholarship. Their influence pervades the academic world, particularly research-intensive institutions. Here we discuss these in greater depth, and connect them with the tradition of sociology that emphasizes the nature and role of institutions and patterns of behavior within society. Among the etiological factors cited by commentators are the pressures to secure grant money and the now notorious "publish-or-perish" pressure associated with the structure of academic science.6 The publish-or-perish pressure is not a manufactured product of the media or academic folklore. As noted in the last chapter, those who don't publish sufficiently are notified, most often in their fifth or sixth year of employment, that they will not be offered a permanent position. This means that scholars who do not have a sufficient publication record, along with their partners and children if they have them, must pull up stakes, sell their home if they own one, and try to find another position. Finding another position may prove difficult because they have not published enough to satisfy the requirements of the initial employer and thus may not have an appealing record for any subsequent employer. Thus, failure to secure tenure has ramifications not just for the scholars, but also for those who depend upon the scholars for financial and other support. When the authors were graduate students at Ohio State University, this fate befell a popular assistant professor in the Department of Sociology. He had a good rapport with the graduate students in the department, but he had not produced enough to convince his colleagues and the University that he deserved a permanent appointment. He moved to another university – not of the stature of Ohio State – and in time earned a full professorship. As disheartening and inconvenient as failure to win tenure may be, many of those who meet this fate not only recover from the setback, but later flourish in their careers. One notable example is Carl Sagan, the astronomer and astrophysicist responsible for the book and associated PBS series, Cosmos. Early in his career he was denied tenure at Harvard, but went on to Cornell, where he spent most of his career. There have been numerous calls to change the criteria by which scholars are evaluated, and we are among the supporters of this proposition. But of the five types of etiological factors described in Figure 3.1 below, the structural may be one of the

Figure 3.1 Levels of explanation for scholarly misconduct: structural, cultural, situational, organizational, and individual. Adapted from Davis (2003)7

The more the world of scholarship embraces quantifiable metrics such as journal impact factors, the greater the pressure on those in the early stages of their academic careers to meet them. In Chapter 9 we will discuss the apparent immutability of this structure, and we will offer suggestions on how it can be modified for positive change.

There are other ways social structure expresses itself in scholarship and science. Some countries approach research and publication differently from others, whether due to lags in cultural development or to other factors. These differences transcend both individuals and organizations. We consider structure at this level important, and we discuss it in greater detail in the next chapter.

The structural pressures are very much a part of academic work, so much so that extensive research to document them seems almost superfluous. Advertisements for academic jobs make the expectations plain: candidates are to publish and to demonstrate the ability, or at least the potential, to bring in external funding for their research agendas. The crucial question is, are these immutable features of the world of scholarship and science? Or are there steps we can collectively take to change them or at least blunt their effects?

Limitations of a structural approach

It could be argued that most academic researchers who work under the structural “publish or perish” pressure adhere to the highest standards of integrity. If they did not, important scientific work could not take place. In fact, the whole scientific enterprise would collapse if the majority of researchers succumbed to these pressures and failed to conduct their work competently and ethically. If three percent of scholars surveyed admit to engaging in serious forms of scholarly misconduct, we unfortunately cannot infer that ninety-seven percent refrain from such behavior.

But we can assume that the majority do the right thing most of the time. Structure, therefore, does not dictate destiny.

A related problem with publish-or-perish pressure as a major cause of research misconduct is the incalculable number of false positives, that is, cases that fail to meet our structural predictions. Let’s take an example from the study of more conventional crime: young men who live in socially disorganized neighborhoods. It would be reasonable to predict that, cut off from the various social supports that their more affluent white counterparts enjoy, many of these young men would become involved in illegal behavior and, consequently, be subject to the mechanisms of the criminal justice system. On closer examination, however, we find young men who do not follow that path. They refrain from involvement in gangs and other criminal activity. Some get jobs and support themselves and their families. Some get an education which makes them more socially mobile. Successful young urban men who don’t fit our original prediction would be considered false positives.

Here’s a white-collar crime example. During World War II, sociologist Marshall Clinard had an opportunity to observe the inner workings of wartime black market activities.8 He concluded that differential association, a popular twentieth century theory of crime, fell short in explaining what he had observed. He noted, for example, that many of the people who were acquainted with the methods of violation nevertheless refrained from such violations. Further, the techniques for violating were accessible to anyone because of their simplicity. The same can be said of much scientific research. Assuming that most scholars in research-intensive organizations operate under great pressure to publish, why do some scholars engage in maladaptive behavior while others do not? This is a question not only for scholarly misconduct, but for most forms of wrongful behavior. The answer must involve something other than structure, or something in addition to it. We discuss some possible answers to this question in Chapter 5.

Another disadvantage of a structural approach to scholarly misconduct relates to prevention and control. We will address these in greater detail in Chapter 9, but we must note that changes to structure, for example, a change in expectations and in the avenues available to achieve those expectations, are among the most difficult to bring about. If we suspect that the pressure to publish and get grants is at least a distal etiological factor, what are we to do with that information? If we no longer evaluate the productivity of scholars based on how much they publish, where they publish, and how much money they generate through grant activity, then what criteria do we use for successful academic performance? The three best publications? The best ten? There have been numerous calls to reevaluate the criteria by which scholars are evaluated, and we are among the supporters. Of the five types of etiological factors depicted in Figure 3.1, the structural may be among the most challenging to change.

A theoretical explanation or a set of explanations is also necessary in order to know how to respond to this or any troublesome phenomenon. As in medicine, we need to know the sources of a condition before we can act to control it.
What do these factors and their interactions tell us about potential prevention and control?

We have discussed the structural level and, below, we describe the organizational level. Chapter 4 will detail the cultural level of explanation, and Chapter 5 will address the situational and individual levels of explanation.

Organizational causes of scholarly misconduct

We now turn to the institutions in which scholars work. Whereas social structure is somewhat distal and amorphous, the organizational environment is more proximal and discernible. Despite the pressures posed by the structure of academic scholarship, there are ways in which the work environment, the organization itself, whether a university or a research institute, can either encourage or discourage scholarly crime.

The notion of organizational culture has been applied to scholarly misconduct. It has been argued that such a culture is the Petri dish in which these kinds of behavior grow,9 that these factors must be addressed in order to influence the self-control of individuals,10 and that they must be understood in light of the changing nature of organized science itself.11 We appreciate all these meanings of culture as they relate to the organization, including the institution and its subdivisions.

The organization as mediator of structural pressures

Universities, hospitals and other research organizations have it within their power to interpret, mediate and moderate the influence of structure. That is, it is at the institutional and organizational level that structural pressures to publish are codified, promulgated and interpreted. As we saw in Chapter 2, in many research institutions the standards are so high that many cannot achieve them. For any given research university, the process for tenure and promotion (T&P) may be in written form, available on the university’s website. This policy is likely to be a bit vague – perhaps intentionally so – about the approximate number of published papers one needs in order to reach the T&P bar. The candidate’s colleagues will likely share their opinions of what is required in terms of quantity and quality, but rarely does the organization put the T&P standard in strict, quantifiable terms. A review of the credentials of successful candidates would be another useful proxy for what is required to reach the bar.

An academic department can implement T&P policies that ignore popular metrics such as IFs. If candidates for tenure know such metrics are not part of the decision-making criteria, they needn’t be concerned with them. Some institutions have in fact adopted hiring and retention practices that help counteract structural factors such as publish or perish. The criminology and criminal justice programs at Arizona State University and the University of Missouri-St. Louis have faculty in separate teaching and research tracks. This permits scholars who prefer teaching over research to pursue a satisfying career without having to worry about publishing papers and getting grants. The support of separate research and teaching tracks exemplifies how universities and other research organizations can mitigate structural effects.

There is another way organizations can contribute to poor research practices, and that is through their inaction. It should be clear that organizations have considerable latitude when it comes to interpreting and enforcing RCR standards. Derek Pyne, a faculty member at Thompson Rivers University in Canada, noticed that most of his fellow faculty in the business school had published in predatory journals.12 The university, in its evaluation of faculty members, should have known that these journals were not legitimate. Ignorance of the status of journals in which faculty publish is an unacceptable excuse for an institution whose responsibilities include fairly and objectively evaluating faculty for tenure, promotion and awards.

Organizational justice and counterproductive work behavior

For several decades now a number of industrial and organizational (I/O) psychologists have been studying the ways in which organizations create fair and just environments. Some of this attention has focused on identifying factors related to maladaptive workplace behaviors such as malingering, sabotage,13 excessive absenteeism, and theft.14

I/O psychologists have identified three principal forms of justice: distributive, procedural, and interactional. Distributive justice has to do with outcomes in a social exchange. For example, if one individual gets a larger holiday bonus than a colleague doing the same work, the colleague will perceive distributive injustice. Earlier we argued that the appropriation of another’s idea is unfair because it yields an undeserved outcome. This is an example of distributive injustice. As the term suggests, procedural justice has to do with the fairness of the procedures employed by organizations and management in those exchanges. If the chair of an academic department institutes a major change without consulting the faculty for their input, this could be perceived as a violation of procedural justice. Even if the faculty prefer the outcome, they may feel they were treated unfairly for not having been given a voice in how the changes were decided or implemented. Interactional justice, the third form, relates to how representatives of the organization deal with staff. Whether they are courteous or rude can make a difference in how employees perceive the fairness with which they have been treated. People prefer to be treated well, and they experience a sense of injustice when higher-ups in the organization fail to give them respect.

Those who have studied organizational justice in the workplace have found that, in general, there is a positive relationship between perceived justice and the extent to which workers behave with integrity. The opposite is also true. If employees feel they have been treated unjustly, they are more likely to engage in behaviors such as calling in sick, not doing their work, stealing organization property, and even behaving aggressively and violently. Employees who perceive their organizational environment as unjust are more likely to commit theft at work.15

Patricia Keith-Spiegel and Gerald Koocher applied the concept of organizational justice to the way Institutional Review Boards (IRBs) treat researchers who must seek approval for proposed studies.16 They argued that IRBs may perpetrate injustices in the way they consider and approve human subjects protocols.

This observation bears on our thesis that what is at stake in scholarly misconduct is not a set of norms unique to science but more general norms, namely those related to fairness and justice. In Chapter 8 we will follow this line of inquiry and argue that it deserves far greater attention in future theorizing.

The study most relevant to scholarly misconduct and procedural justice is that conducted by Brian Martinson and his colleagues.17 They were interested in the extent to which organizational justice played a role in adherence to standards of research integrity. Although there had been earlier attempts to explore the role of perceived justice and equity in RCR departures,18 the Martinson study was the first to give prominence to distributive and procedural justice. In their study of NIH-funded scientists, they examined perceived effort-reward imbalance using the Effort-Reward Imbalance questionnaire. The results of the study did not support their prediction that distributive justice would affect misbehavior. They did, however, find that procedural injustice had a significant effect on self-reported misbehavior.

We consider this an important set of findings for several reasons. First, it provides empirical support for the notion that research organizations may well have within their means the ability to promote RCR through procedural justice. Second, and of particular interest to us, their work points not to Mertonian moral norms but to the more general norms of fairness that prevail in society as a whole. There is a rich and growing literature associating organizational injustice with various forms of counterproductive work behavior. As we will discuss in Chapter 8, the role of fairness and justice should be given far greater weight in explaining crime. It is this focus on fairness that we believe to be the key to understanding why scholarly misconduct bothers us so much.

Criminogenic tiers and scholarly misconduct

Throughout our analysis we argue that multiple levels of explanation need to be considered in order to explain scholarly misconduct. Prominent scholars of white-collar crime have likewise argued that the study of organizations requires consideration of multiple levels.19 One conceptual tool we think works well with this notion is that of criminogenic tiers. Criminologist Sally Simpson asserts that the complex organizations in which white-collar crime occurs often consist of a series of tiers, and that what happens in one tier influences behavior in the others.20 She goes on to say that, much like squeezing a balloon, pressure at one level will manifest at another level.

To demonstrate how this might relate to scholarly misconduct, let’s take a research organization, a university, in which misconduct can and does occur. In order to maintain or even augment its reputation within the academic community, a university must bring in large amounts of research funding. The most consequential and visible research requires such funding. The university institutes recruitment, retention and promotion criteria based on this need for research funding. For example, in recruiting new faculty, many departments are expected to identify candidates with either a demonstrated track record of securing research grants or the potential for doing so in the future.

One line of inquiry we find interesting is the notion that organizations have personalities, and that those personalities can influence the behavior not only of the organization, but also of the individuals it employs. In Chapter 5, we touch on the role that personality styles such as narcissism might play in enabling certain individuals to engage in exploitative behavior that manifests itself as the fabrication or falsification of data, plagiarism, the theft of ideas, the duplication of others’ written works, and other forms of scholarly crime. Could it also be that narcissistic organizations set a tone in which those predisposed to exploitation feel entitled to flout the rules by which their ethical colleagues abide?

The way organizations mediate the publish-or-perish pressure can make the environment either preventive or criminogenic. For example, some university departments assign an established faculty member to mentor junior faculty members. Such mentors help the neophytes navigate the tenure and promotion processes by ensuring that they stay on track, and do not procrastinate or get distracted by activities that do not contribute to academic survival. We like this model and will discuss it a bit more in the last chapter.

Conclusions

Most scholars work in organizations such as universities, hospitals, or private research corporations. Structural requirements such as publish-or-perish originate outside these institutions but are very much a part of them. The analysis of social structure leads to the conclusion that the nature of scholarly work has evolved in such a way as to put pressure on individuals and the institutions in which they work. These pressures have grown over time as greater emphasis is placed on winning competitive grants and on achieving high scores on scholarly metrics.

Tenure and promotion requirements are documented in an institution’s policies and procedures, and exactly how this structural requirement expresses itself depends upon the specific organization. In some organizations, it will translate into a specific number of published articles required for tenure and promotion to associate professor. Other organizations will insist that publication in prestigious journals matters more than some arbitrary number. Either way, the institution in which scholars work mediates the way in which the structural requirement affects the individuals responsible for meeting it.

The structural aspects of organized scholarship are distal factors, important in creating an environment in which research misconduct can and will occur. Such explanations may be necessary, but they are not sufficient for understanding scholarly misconduct. As high as the incidence of misconduct may be, offenders likely comprise a small minority of active scholars. A purely structural approach fails to explain the behavior of those who are under pressure but who nevertheless produce competent, ethical, replicable scholarship. This suggests that other relevant factors are at work.

It has been argued that policy makers should focus on those aspects of the research environment that are most susceptible to change.21 We must acknowledge that some are more amenable to change than others. For example, it is not likely that the reward structure of scientific research can be modified so as to remove the stressors and incentives to deviate from accepted practice. Likewise, it would be nigh impossible to control the myriad situational problems confronting researchers, particularly those associated with their personal lives such as family health, finances, and relationships. It is because of these challenges that organizational factors have a special appeal, particularly to those responsible for promoting RCR. An organization’s structure, policies, and procedures are something “we can get our hands around.” They have sharper, better-defined parameters than the more amorphous categories of social structure and culture. There already is a long history of attempts to identify and remedy maladaptive behavior within organizations. If researchers can show that there are features of research organizations which increase the likelihood of FFP and other detrimental research practices, then perhaps this level and these features deserve close consideration.

Just as I/O theory and research inform the work of those concerned with RCR, the world of scientific research may have something to offer I/O psychology. For example, studies of organizational deviance refer to counterproductive work behavior (CWB), which has been defined as “behavior intended to hurt the organization or other members of the organization.”22 Prior research suggests that CWB manifests itself in two forms: behavior harmful to individuals and behavior harmful to the organization.23 In the case of FFP, one could argue that violations harm both: the individuals whose work is plagiarized and the organizations whose reputations suffer over publicized instances of research misconduct. Considering the harm caused by cases of FFP, and perhaps even some instances of DRPs, we suggest adding a third category: behavior harmful to structure. When the literature is cluttered with bogus papers based on fictitious data, it can be argued that the harm goes far beyond the organization in which the perpetrator worked. Not only are human and other resources wasted on such cases, there is also a betrayal of trust whose costs are incalculable.

We also need to acknowledge that perceived injustice is a multidimensional construct, and thus cannot be adequately assessed by a single measure. Research has shown that distributive, procedural, interactional, interpersonal, and informational justice each play an important and distinct role in forming employees’ perceptions of organizational justice. Future research in this vein should include measures of all forms of organizational justice in order to discern the distinct contribution each makes to departures from RCR.

Notes

1 When worker burnout came to the fore as an issue of psychological interest in the 1970s, teachers, social workers, nurses, and other members of the so-called helping professions wrote discursive pieces announcing the issue and speculating on causes and possible preventive and treatment strategies. In time, investigators undertook empirical research, including the development of psychometric instruments to measure burnout and identify various correlates. Some of these studies received funding support. These findings, though often stemming from basic research, gave empirical support for ameliorative efforts. Scholarly misconduct has followed a similar trajectory in its evolution except that media accounts played a greater role than genuine professional concern.
2 Tangney, June Price (1987). “Fraud Will Out – or Will It?” New Scientist, 115: 62–63.
3 Ranstam, J., Buyse, M., George, S.L., Evans, S., Geller, N.L., Scherrer, B., Lesaffre, E., Murray, G., Edler, L., Hutton, J.L., Colton, T. & Lachenbruch, P. (2000). “Fraud in medical research: an international survey of biostatisticians. ISCB Subcommittee on Fraud.” Control Clin Trials, 21: 415–427.
4 James, William (1995). “Fraud and hoaxes in science.” Nature, 377: 474.
5 Fanelli, D. (2015). “We need more research on causes and consequences, as well as on solutions.” Addiction, 110: 11–12.
6 Verbeke, R. & Tijdink, J. (2013). “Science Fraud: The Hard Figures.” EOS (Voor Bijzondere Journalistiek), April 4: 24–28; Zimmer, C. (2012a). “A Sharp Rise in Retractions Prompts Calls for Reform.” New York Times (Science Times section), April 17, pp. D1, D4; Zimmer, C. (2012b). “After Mistakes, Scientists Try to Explain Themselves.” New York Times (Science Times section), April 17, p. D4.
7 Davis, Mark S. (2003). “The role of culture in research misconduct.” Accountability in Research, 10: 189–201.
8 Clinard, Marshall B. (1946). “Criminological theories of violations of wartime regulations.” American Sociological Review, 11: 258–270.
9 Moore, Andrew (2009). “More than mentoring: The importance of group culture for scientific integrity.” BioEssays, 31: 1271–1272.
10 Muller, J. M., Landsberg, B. & Ried, J. (2014). “Fraud in science: A plea for a new culture in research.” International Journal of Obesity, 38: 572–576.
11 Cohen, J. J. & Siegal, E. K. (2017). “Academic Medical Centers and Medical Research: The Challenges Ahead.” Journal of the American Medical Association, 294: 1367–1372.
12 McCook, A. (2017). “When most faculty publish in predatory journals, does the school become ‘complicit?’” Retraction Watch, May 9. Available online at: http://retractionwatch.com/2017/05/09/faculty-publish-predatory-journals-school-become-complicit/ (accessed 12-31-2017).
13 Ambrose, M. L., Seabright, M. A. & Schminke, M. (2002). “Sabotage in the workplace: The role of organizational injustice.” Organizational Behavior and Human Decision Processes, 89: 947–965.
14 Greenberg, J. (1990). “Employee theft as a reaction to underpayment inequity: The hidden cost of pay cuts.” Journal of Applied Psychology, 75: 561–568.
15 Ibid.
16 Keith-Spiegel, P., Koocher, G. P. & Tabachnick, B. (2006). “What scientists want from their research ethics committee.” Journal of Empirical Research on Human Research Ethics, 1: 67–82.
17 Martinson, Brian C., Anderson, Melissa S., Crain, A. L. & de Vries, Raymond (2006). “Scientists’ perceptions of organizational justice and self-reported misbehaviors.” Journal of Empirical Research on Human Research Ethics, 1: 51–66.
18 Davis, M. S. & Riske, M. L. (2002). “Preventing scientific misconduct: Insights from ‘convicted offenders.’” In Nicholas H. Steneck & Mary D. Scheetz (eds), Investigating Research Integrity: Proceedings of the First ORI Research Conference on Research Integrity. Rockville, MD: Office of Research Integrity.
19 Simpson, S. S. & Piquero, N. L. (2002). “Low self-control, organizational theory, and corporate crime.” Law & Society Review, 36: 509–548; Vaughan, Diane (2002). “Criminology and the sociology of organizations.” Crime, Law & Social Change, 37: 117–136.
20 Simpson, Sally S. (2011). “Making sense of white-collar crime: Theory and research.” Ohio State Journal of Criminal Law, 8: 481–502.
21 Martinson, B. C., Anderson, M. S. & de Vries, R. (2005). “Scientists behaving badly.” Nature, 435: 737–738.
22 Spector, P. E. & Fox, S. (2002). “An emotion-centered model of voluntary work behavior: Some parallels between counterproductive work behavior and organizational citizenship behavior.” Human Resource Management Review, 12: 269–292.
23 Robinson, S. L. & Bennett, R. J. (1995). “A typology of deviant workplace behaviors: A multidimensional scaling study.” Academy of Management Journal, 38: 555–572.

4 Cultural causes of scholarly misconduct

Pointing the finger at culture as a causal factor in scholarly misconduct creates discomfort since it implicates outsiders – non-Westerners – as inclined to deviance.1 The 2016 Presidential campaign in the United States, particularly Donald Trump’s xenophobic rants about Mexicans and Muslims, put this troubling discussion of foreigners front and center. The truth is that science as we know it today, with all of its exciting and important advances, benefits from foreign-born scientists.2 Still, we are compelled to explore how culture can contribute to a better understanding of these behaviors. Instances of research misconduct in non-Western countries hint that perhaps something other than structural pressure or individual pathology may be at work in these cases. We feel that as social scientists we have an obligation to pursue even those explanations that invite criticism or make us uncomfortable.

Culture defined

The word “culture” has multiple meanings. It includes, of course, the beliefs and customs of a particular people or society.3 We argue that culture also includes the unique twist a particular part of the world puts on more general norms, such as those governing the taking of human life or social and economic exchanges.

Definitions of culture are not, of course, limited to nations. Psychologists Richard Nisbett and Dov Cohen wrote about a “southern culture of violence” in which long-held traditions of responding to insults against honor help explain regional variations in violent behavior.4 What Nisbett and Cohen described was a combination of attitudes and habits formed by the history of a particular region of the U.S. Subsequent research by Richard Felson and Paul-Philippe Paré argued that what the data reveal is not a culture of honor but a gun culture.5 Regardless of who is right, both suggest important ways of looking at how culture can play a role in behavior considered socially deviant or harmful.

We can narrow culture to even more limited confines, such as prisons, corporations (e.g., the corporate culture endemic to a particular workplace), and microcultures centered around social and political issues, such as LGBTQ cultures, animal rights cultures, the American “tea party” culture, and the survivalist-right/militia culture. The literature of corrections and penology, for instance, speaks about inmate culture and how its unconventional norms differ from those of mainstream society.6

Organizations are said to be characterized by various types of culture. Sociologist Diane Vaughan, in her award-winning analysis of the Challenger tragedy, highlighted the role of the organizational culture of the National Aeronautics and Space Administration (NASA).7 She found that unique features of NASA’s organizational culture were crucial in explaining how the now-infamous defective O-rings could end up in a manned spacecraft. We would argue that similar factors are at work with other forms of misbehavior, such as the sex abuse scandal at Pennsylvania State University, perpetrated by a former football coach against adolescent boys, wherein what might be termed a “culture of indifference” allowed a number of officials to be complicit in keeping the abuse from coming to light.

There are numerous other meanings of culture that are important for criminological inquiry but are not part of our current conception. They include material culture in industrial society,8 vigilante culture such as that in apartheid South Africa,9 the Albanian culture of sex-slavery and violence,10 the American rap culture of resistance,11 and prosecutors’ culture of professional misconduct.12 All these deserve attention for certain purposes, but they do not fit our current definition of culture. For the purposes of this book, we restrict the definition of culture to the Merriam-Webster dictionary definition: “the customary beliefs, social forms, and material traits of a racial, religious, or social group.”13

4.1 Scholarly credentials for sale14

One indisputable fact of successful academic careers is the importance of academic papers, particularly those published in prestigious journals. The culture of organized science in China also experiences this intense pressure to produce. In addition, Chinese scholars are handsomely rewarded for publishing papers in prestigious journals. In some cases, the financial bonuses for such publications can exceed one’s annual salary. What China has lacked, unfortunately, is a well-developed system of social control. This combination not only offers incentives for individuals to engage in unethical practices, it also provides an opening for opportunistic companies that exist solely to sell papers to academic doctors and other scholars who allegedly are too busy to write their own research papers. One consequence of this situation is the cluttering of the scientific literature with garbage scholarship. This in turn leads to inevitable retractions, scandals in which Chinese scientists and their co-authors are publicly dishonored, and a tremendous waste of human and financial resources. The Chinese government realizes it has a problem and has recently made substantial efforts to formalize the social control of scientific research. It remains to be seen whether these efforts are successful in curbing scholarly crime.

Culture and crime

In the early twentieth century, American sociologists suggested that a conflict between the conduct norms of different cultures might help explain why those from abroad become involved in crime.15 The United States had experienced decades of massive immigration from abroad, Eastern and Northern Europe in particular, and these early sociologists associated this immigration with many of the problems in large cities, including crime. Sociology was a nascent discipline trying to develop a distinct identity by offering unique theoretical explanations. One of these sociologists, Thorsten Sellin of the University of Pennsylvania, observed that the immigrants coming to America for a better life brought with them the various habits and norms of their homelands.16 These immigrants didn’t set foot on Ellis Island and undergo a magical transformation into mainstream Americans. Assimilation, even partial internalization of the new norms, can take years, if it happens at all. The norms of the old country, according to Sellin, often at odds with American laws and customs, could result in a conflict of cultures.

A number of studies have applied culture conflict theory to various groups including, but not limited to, Native American Indians,17 ethnic Albanians,18 Balkan youths,19 and immigrants to Israel.20 While criminological support for culture conflict theory is mixed, and even though criminologists have been taken to task for their approach to culture and crime,21 the question of culture conflict has sufficient merit to retain it for our discussion of how culture influences scholarly crime.

Not only has culture been cited as a possible cause of a variety of crimes,22 it can also be responsible for the failure to disclose criminal offenses or pursue their prosecution. Analyses have demonstrated that culture plays a role in the reluctance of individuals to disclose child sexual abuse,23 one of the most serious crimes imaginable. Hypothesized reasons for the reluctance to reveal abuse include such factors as patriarchy, women’s unequal status, collective shame, modesty, honor and taboos. We think that similar mechanisms may be in play with regard to the disclosure of scholarly crime. These studies all suggest that criminologists should take seriously the role of culture in the genesis and contours of crime.

Scholarship, culture, and scholarly crime

Walter Meyer and George Bernier, Jr. were among the first to link culture to research misconduct.24 They examined 15 cases involving 29 individuals who had been accused of plagiarism, fabrication and falsification, as well as other questionable research practices. Although their sample was small, they found that individuals born outside the U.S. were disproportionately involved in misconduct cases. They noted that it is uncomfortable to raise culture as a possible causal factor for fear of inviting charges of racism.

Building on the work of Meyer and Bernier, Mark Davis undertook a review of cases over a period of ten years wherein a finding of scientific misconduct had been made by the Office of Research Integrity.25 The findings were similar to those of the Meyer and Bernier study in that a disproportionate number of the cases involved persons with non-Western names.

Now the skeptic might counter that many researchers working in the U.S. come from abroad, and therefore the proportion of foreign names among those found guilty of scientific misconduct might simply reflect their numbers in the ranks of scholars. Davis cautioned that even if the available evidence does point to a significant cultural element in research misconduct, we must not jump to the conclusion that foreigners are responsible for the problem.

It is our contention that, much like the early twentieth century immigrants to the U.S., scholars moving from one culture to another do not abandon the traditions with which they were raised. Imagine a young scholar or scholar-in-training moving to a new country. Notions of subordinate-superordinate relationships, professional deference, and the assignment of credit for collaborative work all affect the individual’s approach to science and scholarship. An individual who has been socialized over a period of twenty or more years does not discard the internalized norms of his or her home culture.

Asian countries have had a large number of publicized cases in recent years. China, Japan, and South Korea in particular seem to be experiencing an embarrassing amount of misconduct. Whether this is due to a real increase in the number of cases or simply a function of increased scrutiny, these countries have been in the spotlight, and their governments have had to recognize the existence of a problem. In large part, this is due to the efforts of journalists who defy the prevailing tendency to sweep research scandals under the rug. The tremendous growth of competitive research environments in Asian countries has not been matched by a corresponding focus on scholarly ethics. The development in the U.S. of the federal Office of Research Integrity (ORI) was the result in part of several highly publicized cases of research misconduct at some of the most prominent research institutions in the country. Likewise, there was little impetus to develop these forms of social control in China and South Korea until the governments could no longer deny the problem.

There have been several prominent cases in which high-level public officials had their academic backgrounds scrutinized. In some cases, the scrutiny came from the press, but in others it appeared to come from political adversaries. Regardless, in several cases it was uncovered that portions of their doctoral dissertations had been plagiarized from the works of others. Former German defense minister Karl-Theodor zu Guttenberg was one of the more prominent cases. What makes these cases noteworthy for our analysis is the emphasis that German culture has placed on possessing a doctorate. Not only does such a credential have meaning within academic circles, it is accorded special status outside the academy. It has been argued that those who hold doctorates can command higher salaries in non-academic positions, even when the doctorate is unrelated to the position’s responsibilities. This suggests that, through history, those who have attained the doctorate in Germany have enjoyed uncommon respect and deference. Whereas in the U.S. many holders of PhDs and other earned doctorates do not use their title in everyday circumstances, it appears that in Germany those who have the title make use of it publicly, thereby elevating themselves above others.

To acknowledge this cultural pattern is not to indict German culture, but merely to point out how cultures operate differently.

Aside from culture conflict theory, there are other perspectives that help us understand how culture influences scholarly crime. One is the concept of tight and loose cultures, elaborated by anthropologist Pertti Pelto.26 Some cultures are deemed loose, meaning that they are more individualistic. Those living and participating in so-called loose cultures are less likely to respect authority, institutions and structural arrangements. As a result, loose cultures are more likely to tolerate deviant behavior.27 Tight cultures, by contrast, are less likely to tolerate such behavior. We would argue that Western countries, including the U.S., are tight inasmuch as they have developed systems to control harmful forms of deviant behavior including crime and, yes, scholarly misconduct. The concept of tight and loose cultures dovetails well with culture conflict theory. According to Gelfand and her colleagues, people moving from a tight culture to a loose culture will experience anxiety.28 The same is true of those going from a loose culture to a tighter one. Culture may thus influence scholarly crime in this way: moving from a tight to a loose culture, or vice versa, may create a greater risk of engaging in scholarly crime.

Another perspective that holds promise for explaining scholarly misconduct is one that connects culture with social structure. Sociologists Cheol-Sung Lee and Andrew Schrank approach the publicized problems of research misconduct in Asian countries by considering how these countries have adopted a Western model of higher education.29 Under this model, the state offers funding to those who publish in prestigious journals. Because criticism is discouraged, this provides an opportunity for motivated scientists to take advantage of the system. Although their approach is structural, Lee and Schrank refer to “unscrupulous scientists,” suggesting that individual motivations can and do play a role in scholarly misconduct. This is consistent with our model, in which the structural and cultural rings overlap with the individual ring, increasing the likelihood that scholarly misconduct will occur.

So exactly how does culture contribute to scholarly crime? Returning to Sellin’s culture conflict theory, we concur that scholars moving from one culture to another bring to their newly adopted country the norms and values of their homeland. These include not only norms specific to scholarship and science, but also more general norms that shape how scholarship and science operate in those cultures. In Japan, for example, unearned credit for an article, known as gift authorship, is relatively common, as is the practice of ghost authorship.30 These practices are discouraged in the West, as exemplified in the behavioral sciences by the Publication Manual of the American Psychological Association. Complicating the situation is the cultural reluctance to confront eminent scientists.31 The Japanese are taking purposive steps to curb such practices,32 but it may require a lengthy evolutionary process.

Other non-Western countries struggle with scholarly crime. A study of a convenience sample of Nigerian researchers revealed that two-thirds of them admitted engaging in some form of research misconduct.33

While African countries may be behind the West in the promulgation of research standards, they are beginning to address these issues as their scientific programs evolve.34 Other countries that have been the focus of recent discussions include Austria35 and Brazil.36 This is not to say that the U.S. and Europe are above reproach in terms of scholarly misconduct. Some Western nations such as the United Kingdom have been forced to bring the issue out in the open and take serious measures to curb scholarly crime.37

Separating xenophobia from genuine cultural concerns

One reason we suspect culture has not been given the attention it deserves as an etiological factor is that invoking it can be perceived as a form of prejudice against outsiders. Criminologists in particular recognize that incorporating culture as an explanation for any form of deviance can lead to charges of discrimination.38 Despite the fact that America has a rich history as a melting pot of people from different parts of the globe, there remains a great deal of resentment, notably among more conservative elements, toward foreigners who enter the country illegally or who are identified with terrorism. Again, this was evident during the 2016 U.S. presidential campaign, during which a spotlight was trained on illegal immigration and the possibility of admitting terrorists among refugees fleeing the turmoil in the Middle East. While some of this concern may be warranted, it can be used by bigots to justify xenophobic policies and practices.

The meager research available on the interface between scholarly misconduct and culture shows that the bulk of the offenses by non-U.S. and non-European scientists occur in the biomedical and medical fields and are committed by Asians, primarily Chinese, although Asian Indians and Greeks are also featured.39 The most common complaint is plagiarism. South Korea is a current example of a cultural challenge to the scholarly enterprise, with many recently publicized cases involving South Korean investigators. This could be the result of increased awareness about scholarly misconduct in the South Korean scientific and academic communities. Or it could reflect a growing culture in which research misconduct is seen as an innovative, if unacceptable, way of attaining job security and recognition.

In sum, research on the overlap between plagiarism and culture finds that some scientists trained outside of the United States plagiarize even though they know better. There is some confusion as to whether these scientists view the behavior as wrong per se, since (a) English is not their first language, and perhaps they should not be expected to use a second language with the level of skill with which they use their own first language; and (b) they may not have been trained to recognize that copying other scientists’ words is a form of misconduct. Nonetheless, since English is the main language used in the sciences, and they are publishing in English-language journals, they are expected to know that copying is wrong.

Culture as a defense against charges of wrongdoing

If culture is a legitimate etiological factor in scholarly misconduct, the question arises whether it can be used by the accused as a mitigating or extenuating factor. This is a question not only for scholarly misconduct, but also for more traditional kinds of crime. For non-U.S. and non-European scholars whose views of scholarly roles and relationships differ from those in the West, it is an open question whether they are responsible for departures from Western norms when their own norms conflict with the dominant Western norms imposed upon them. There may be a certain amount of confusion as to what the rules are and why Western rules should take precedence over non-Western ones. After all, scholarly offenders have pointed to structural pressures, mental status and situational factors in the aftermath of their misdeeds. This raises the question of whether cultural explanations are fair game as a defense strategy.

We again appeal to criminology and criminal justice to explain the influence of culture on criminal culpability. Anglo-American law allows the accused, in some circumstances, to raise culture as a defense against criminal charges. Through what has become known as the inability thesis, those charged with crimes can assert that their behavior, though defined as criminal, was reasonable in light of cultural definitions.40 In some cases, these crimes can qualify as cultural offenses, in that the behavior in question is not only acceptable, but is even condoned and promoted in the individual’s culture.41 If culture is introduced into the criminal process, its effect can range anywhere from excusing the defendant’s behavior to serving as a mitigating factor at sentencing.42 But in addition to basic legal considerations, its use as a defense can also be in part a function of the power of the cultural group in question.43

As is true for traditional crime, the use of culture as a defense for scholarly crime depends on a number of factors including, but not limited to, the extent to which the culture in question is assimilated into the larger society. For now, we will restrict our interest to culture as an etiological factor. It will be interesting to see whether any of those accused of serious research misconduct offer culture as a partial defense. If they do, will it make any difference in the official consequences they face?

Conclusions

The interface between culture and scholarly misconduct is perhaps the least investigated of all the etiological factors, and there are some understandable reasons for this. Among them is the fear of appearing biased against foreigners. Because of this neglect, however, culture remains one of the least understood of the possible contributors. That also makes it one of the more interesting to explore from a social scientific standpoint.

Somewhere on the spectrum between unqualified cultural acceptance and rabid xenophobia lies a position which holds that culture can be at least partially responsible for social problems, including scholarly misconduct. This is where we take our position, and we argue that for some instances of scholarly crime, culture, defined as we have in this chapter, is at times a necessary but not sufficient element of the criminogenic environment.

Despite the identification of norms specific to science and scholarship, the violations that constitute scholarly crime involve more general norms. We will discuss this in greater detail in the final chapter, but suffice it to say that the opprobrium with which lay persons regard scholarly crime is far less than what they assign to more traditional crime. U.S. Senator Charles Grassley, for instance, is not offended that obscure norms of scholarship have been violated, but is incensed by the waste of federal funds. Scholarly offenses, like insider trading or any other socially harmful behavior, can be viewed as a form of cheating in which we, the public, are the dupes. This is an uncomfortable feeling that demands a justice response to restore the social order.

So how does this relate to culture? The norms that scholars violate by stealing the work of others or by fabricating data are the same norms violated in traditional theft and fraud. And despite nuanced differences, these fundamental norms operate in other cultures. We would argue that theft is as inappropriate in South Korea and Japan as it is in the U.S. and the U.K. Indeed, these norms may manifest themselves somewhat differently from one culture to another. But the element that links a cultural explanation to the other levels is a set of basic social norms. This contradicts Merton’s notion that specific norms govern scientific work; instead, we suggest that general rules of fairness and reciprocity are required for the advancement of science even if they do not always accurately reflect the reality of modern scholarship.

We tend to approach this subject matter as though scholarly crime exists within social structures, cultures, and organizations. This assumes that these factors unidirectionally exert a force – positive, negative or neutral – on those who engage in such behavior. But we need to consider the possibility that scholarly crime can bring about change in these higher-level structures. Just as it has been argued that large amounts of crime can change society,44 it may be that increasing amounts of scholarly crime can affect the culture in either a positive or negative way. If increasing amounts of scholarly misconduct are discovered and disapproved, the change will be positive. If increasing amounts are discovered but ignored, there is no change. If increasing amounts are neither discovered nor discussed, the change will be negative. This notion of discovery, and of the reaction to discovery, as it pertains to cross-cultural or intra-cultural environments, deserves further exploration.

Among the purposes of this book is the identification of a theoretical perspective which not only can explain scholarly crime, but can also inform our understanding of more traditional forms of offending. We have accepted Steven Messner’s challenge to make Western criminological theories relevant to Eastern cultures,45 even though we increasingly are a global community through immigration and communication.46 Just as it is difficult to make sense of the relationship between culture and more traditional crime,47 so it is with culture and scholarly crime. It is easy to view such behavior in non-Western cultures through a Western set of lenses. Even though it can be argued that Mertonian norms should operate the same throughout the civilized world, cultures in which science and scholarship are less evolved, or merely different from the West, should not be expected to operate as they do in the West, where systems of social control have long been discussed, developed and refined.

This book is concerned not only with the causes of scholarly crime, but also with its control. We need to recognize that culture not only helps create an environment in which scholarly crime can flourish, it also affects its control. Research has shown that culture can and does influence social control.48 We have reason to posit that culture likewise has an effect on both the informal and formal social control of scholarly crime.

It has been argued that drawing a distinction between “us” and “them” in terms of culture is less meaningful than it used to be, as a result of mobility.49 One could further argue that culture is even less relevant in an era in which the world is electronically linked by the Internet and various social media. We appreciate this argument, and indeed in many realms culture may be inconsequential. But the ability to travel quickly to and communicate with other parts of the world does not strip culture of its significance or influence.

Notes 1 Davis, Mark S. (2003). “The Role of Culture in Research Misconduct.” Accountability in Research, 10: 189–201. 2 No, Y. & Walsh, J. P. (2010). The importance of foreign-born talent for US innovation. Nature Biotechnology, 28: 289–291. 3 Merriam-Webster (2015). Available online at: www.merriam-webster.com/dictionary/ culture (accessed 09-01-2015). 4 Nisbett, R. E. & Cohen, D. (1996). Culture of Honor: The Psychology of Violence in the South. Boulder, CO: Westview Press. 5 Felson, Richard B. & Paré, P-P. (2010). “Gun cultures or honor cultures: Explaining regional and race differences in weapon carrying.” Social Forces, 88: 1357–1378. 6 Copes, H., Brookman, F. & Brown, A. (2013). “Accounting for violations of the convict code.” Deviant Behavior, 34: 841–858. 7 Vaughn, Diane (1996). The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: University of Chicago Press. 8 Albrecht, H-J. (1995). “Ethnic minorities, culture conflicts and crime.” Crime, Law & Social Change, 24: 19–36. 9 Martin, J. (2012). “Vigilantism and state crime in South Africa.” State Crime Journal, 217–234. 10 Arsovska, Jana (2006). “Understanding a ‘culture of violence and crime’: the Kanun of Lek Dukagjini and the rise of the Albanian sexual-slavery rackets.” European Journal of Crime, Criminal Law and Criminal Justice, 14: 161–184. 11 Tanner, J., Asbridge, M. & Wortley, S. (2009). “Listening to rap: Cultures of crime, cultures of resistance.” Social Forces, 88: 693–722. 12 White, Ken (2011). “Culture of misconduct: The misplaced priorities of prosecutors.” Reason, 30–31. 13 Davis (2003). 14 Wang, Jeanette (2013). “Research papers for sale taint mainland academia, journal says.” South China Morning Post, December 16. Available online at: www.scmp.com/ lifestyle/article/1379849/research-papers-sale-taint-mainland-academia-journal-says (accessed 12-31-2017).

15 Allport, Floyd H. (1931). "Culture conflict versus the individual as factors in delinquency." Social Forces, 9: 493–497; Sellin, Thorsten (1938). Culture Conflict and Crime. New York: Social Science Research Council.
16 Sellin (1938).
17 Abril, Julie C. (2007). "Cultural conflict and crime: Violations of Native American cultural values." International Journal of Criminal Justice Sciences, 2: 44–62.
18 Arsovska, Jana & Verduyn, Philippe (2008). "Globalization, conduct norms and 'culture conflict.'" British Journal of Criminology, 48: 266–246.
19 Killias, M., Maljevic, A. & Lucia, S. (2010). "Imported violence? Juvenile delinquency among Balkan youths in Switzerland and in Bosnia-Herzegovina." European Journal on Criminal Policy & Research, 16: 183–189.
20 Shoham, Shlomo (1962). "The application of the 'culture-conflict' hypothesis to the criminality of immigrants in Israel." Journal of Criminal Law, Criminology & Police Science, 53: 207–214.
21 O'Brien, M. (2005). "What is cultural about cultural criminology." British Journal of Criminology, 45: 599–612.
22 These include, but are not limited to, organized crime (Cottino, A. (1999). "Sicilian cultures of violence: The interconnections between organized crime and local society." Crime, Law & Social Change, 32: 103–113); domestic violence (Pan, Amy, Daley, Sandra, Rivera, Lourdes M., Williams, Kara, Lingle, Danielle & Reznik, Vivian (2006). "Understanding the role of culture in domestic violence: The Ahimsa Project for Safe Families." Journal of Immigrant and Minority Health, 8: 35–43, and Warrier, S. (2008). "'It's in their culture': Fairness and cultural considerations in domestic violence." Family Court Review, 46: 537–542); insider trading (Statman, Meir (2009). "The cultures of insider trading." Journal of Business Ethics, 89: 51–58); adolescent street violence (Bennett, T. & Brookman, F. (2009). "The role of violence in street crime: A qualitative study of violent offenders." International Journal of Offender Therapy and Comparative Criminology, 53: 617–633; Stewart, E. A. & Simons, R. L. (2006). "Structure and culture in African American adolescent violence: A partial test of the 'code of the street' thesis." Justice Quarterly, 23: 1–33; Stewart, E. A. & Simons, R. L. (2010). "Race, code of the street, and violent delinquency: A multilevel investigation of neighborhood street culture and individual norms of violence." Criminology, 48: 569–605); sexual violence (Sharma, R. & Bazilli, S. (2014). "A reflection on gang rape in India: What's law got to do with it." International Journal for Crime, Justice and Social Democracy, 3: 4–21); child maltreatment (Nadan, Yochay, Spilsbury, James C. & Korbin, Jill E. (2015). "Culture and context in understanding child maltreatment: Contributions of intersectionality and neighborhood-based research." Child Abuse & Neglect, 41: 40–48); indigenous crime (Day, A., Jones, R., Nakata, M. & McDermott, D. (2012). "Indigenous family violence: An attempt to understand the problems and inform appropriate and effective response to criminal justice system intervention." Psychiatry, Psychology and Law, 19: 104–117; Marenin, Otwin (1992). "Explaining patterns of crime in the native villages of Alaska." Canadian Journal of Criminology, 34: 339–368); gun violence (Lemieux, Frederic (2014). "Effect of gun culture and firearms laws on gun violence and mass shootings in the United States: A multi-level quantitative analysis." International Journal of Criminal Justice Sciences, 9: 74–93); institutional corruption (Klein, J. L. & Tolson, D. (2015). "Wrangling rumors of corruption: Institutional neutralization of the Jerry Sandusky scandal at Penn State University." Journal of Human Behavior in the Social Environment, 25: 477–486); and terrorism (Arena, M. P. & Arrigo, B. A. (2005). "Social psychology, terrorism, and identity: A preliminary re-examination of theory, culture, self, and society." Behavioral Sciences and the Law, 23: 485–506).
23 Boakye, K. E. (2009). "Culture and nondisclosure of child sexual abuse in Ghana: A theoretical and empirical exploration." Law & Social Inquiry, 34: 951–979; Fontes, L. A. & Plummer, C. (2010). "Cultural issues in disclosures of child sexual abuse." Journal of Child Sexual Abuse, 19: 91–518.
24 Meyer, W. M. and Bernier, Jr., G. M. (2002). "Potential Cultural Factors in Scientific Misconduct Allegations." In Nicholas H. Steneck and Mary D. Scheetz (eds.) Investigating Research Integrity: Proceedings of the First ORI Research Conference on Research Integrity.
25 Davis (2003).
26 Pelto, Pertti J. (1968). "The differences between 'tight' and 'loose' societies." Trans-Action, 5: 37–40.
27 Ryan, B. F. & Straus, M. A. (1954). "The integration of Sinhalese society." Research Studies of the State College of Washington, 22: 179–227, cited in Pelto (1968).
28 Gelfand, M. J. (2012). "Culture's constraints: International differences in the strength of social norms." Current Directions in Psychological Science, 21: 420–424.
29 Lee, C-S. & Schrank, A. (2010). "Incubating innovation or cultivating corruption? The developmental state and the life sciences in Asia." Social Forces, 88: 1231–1255.
30 Yukawa, Y., Kitanaka, C. & Yokoyama, M. (2014). "Authorship practices in multi-authored papers in the natural sciences at Japanese universities." International Journal of Japanese Sociology, 23: 8.
31 Normile, D. (2003). "Japan seeks answers to rise in misconduct." Science, 301: 53–153.
32 Wada, Masanori (2014). "Scientific misconduct: Research integrity guidelines in Japan." Nature, 514: 35.
33 Okonta, Patrick & Rossouw, Theresa (2013). "Prevalence of scientific misconduct among a group of researchers in Nigeria." Developing World Bioethics, 13: 149–157.
34 Kombe, F., Anunobi, Eucharia N., Tshifugula, N. P., Wassenaar, D., Njadingwe, D., Mwalukore, S., Chinyama, J., Randrianasolo, B., Akindeh, P., Dlamini, P. S., Ramiandrisoa, F. N. & Ranaivo, N. (2014). "Promoting research integrity in Africa: An African voice of concern on research misconduct and the way forward." Developing World Bioethics, 14: 158–166.
35 Muller, M. J., Landsberg, B. & Reid, J. (2014). "Fraud in science: A plea for a new culture in research." European Journal of Clinical Nutrition, 68: 411–415.
36 Lins, Liliane & Carvalho, Fernando M. (2014). "Scientific integrity in Brazil." Journal of Bioethical Inquiry, 11: 283–287.
37 Baty, Phil (2006). "Plagiarists face clampdown." The Times Higher Education Supplement, December 8, p. 2.
38 Akhtar, Zia (2014). "Child sex grooming: 'culture' crime, racial stereotyping and the environment." European Journal of Crime, Criminal Law and Criminal Justice, 22: 167–196.
39 Heitman, Elizabeth and Litewka, Sergio (2011). "International Perspectives on Plagiarism and Considerations for Teaching International Trainees." Urologic Oncology, 29: 104–108; Segal, S., Gelfand, B. J., Hurwitz, S., Berkowitz, L., Ashley, S. W., Nadel, E. S. & Katz, J. T. (2010). "Plagiarism in residency application essays." Annals of Internal Medicine, 153: 112–120; Hayes, N. & Introna, L. D. (2005). "Cultural values, plagiarism, fairness: When plagiarism gets in the way of learning." Ethics & Behavior, 15: 213–231.
40 Tunick, M. (2004). "Can culture excuse crime? Evaluating the inability thesis." Punishment and Society, 6: 395–40.
41 Van Broeck, J. (2001). "Cultural defence and culturally motivated crimes (cultural offences)." European Journal of Crime, Criminal Law and Criminal Justice, 9: 1–32.
42 Torry, W. I. (2001). "Social change, crime, and culture: The defense of provocation." Crime, Law & Social Change, 36: 309–325.
43 Tomer-Fishman, T. (2010). "'Cultural defense,' 'cultural offense,' or no culture at all?: An empirical examination of Israeli judicial decisions in cultural conflict criminal cases and of the factors affecting them." Journal of Criminal Law & Criminology, 100: 475–521.
44 Garland, David (2000). "The culture of high crime societies." British Journal of Criminology, 40: 347–375.
45 Messner, Steven F. (2015). "When west meets east: Generalizing theory and expanding the conceptual toolkit of criminology." Asian Journal of Criminology, 10: 117–129.
46 Valier, C. (2003). "Foreigners, crime and changing mobilities." British Journal of Criminology, 43: 1–21.
47 Karstedt, S. (2001). "Comparing cultures, comparing crime: Challenges, prospects and problems for a global criminology." Crime, Law & Social Change, 36: 285–308.
48 Reddy, R. (2008). "Gender, culture and the law: Approaches to 'honour crimes' in the UK." Feminist Legal Studies, 16: 305–321.
49 Valier (2003).

5 Individual and situational causes of scholarly misconduct

In this brief chapter, we consider the possible role that characteristics of individuals, as well as the personal challenges individuals may face, play in the etiology of scholarly misconduct.

Individual factors

From the early days when research misconduct was discussed, it was clear that many observers saw these behaviors not only as rare, but also as the work of an impaired individual. There had to be something wrong with someone who would violate the sacred trust that members of the scientific community place in one another. Sir Cyril Burt, the eminent British educational psychologist whose correlations were too good to be believed, was said to have believed so fervently in his theory of the heritability of intelligence that he concocted fake data to support it. At the time, Burt was already an accomplished scholar who no longer had to concern himself with job security. This is an example of an individual-level explanation: the etiology lies in the psychological makeup of the offender. When cancer researcher William Summerlin was suspected of having falsified the results of a skin graft, commentators were quick to explore why someone with a promising career at such a prestigious research center would engage in such behavior. Certainly, someone who would defy science's most revered norms and risk opprobrium had to suffer from some form of mental pathology. Thus, possible mental or emotional illnesses were mentioned in a number of the early infamous cases of scholarly crime. It is hard to conceive that individuals in full possession of their faculties would knowingly fabricate data or plagiarize the work of others.

In their paper on the etiology of research misconduct, Mark Davis, Michelle Riske-Morris, and Sebastian Diaz note that a number of commentators have pointed to individual factors.1 Even some of the protagonists, such as infamous data faker John Long, admitted that it was they, not the environment in which they worked, who were responsible for their misdeeds. Another reason for pointing to the individual actor is how offenders behave outside the realm of scholarly and scientific work. Erin Potts-Kant, a former Duke University scientist whose misconduct may have cost hundreds of millions of dollars, embezzled in excess of $25,000 from the university.2 Embezzlement is a garden-variety, albeit white-collar, offense. It has nothing to do with scholarship or science, nor does it violate any specific scientific norms. We would argue that, regardless of occupational setting or pressures to produce, Potts-Kant likely would have engaged in wrongful behavior of one kind or another.

We think the best reason for including the individual in discussions of etiology is that it is the individual who commits scholarly misconduct. That is, despite the influence of structural, organizational, or cultural factors, the behavior that violates specific norms is most often that of specific individuals. It is the individual who is administratively or legally charged with misconduct or fraud. And it is the individual who is subject to government sanctions such as debarment or criminal prosecution. Just as structural disadvantage doesn't absolve the disadvantaged person of responsibility for his or her actions, scholars must be accountable for their actions despite the influence of social structure and other factors.

5.1 Structural pressure or individual pathology?

Pat Palmer was a researcher at the University of Iowa. It was discovered that a number of interviews she was supposed to have conducted with families of autism patients had been fabricated. Further investigation found that Palmer, who had not graduated from college, claimed a bachelor's degree, two master's degrees, and two doctoral degrees. In an NIH grant application, she listed several published papers, having inserted her name in place of one of the actual authors. It was also discovered that Palmer had submitted falsified mileage reimbursement forms for travel she had claimed under an NIH grant. So not only had she committed scientific misconduct, she had also committed garden-variety theft. Not surprisingly, she was found guilty of scientific misconduct by the Office of Research Integrity. What was unusual in her case is that she was prosecuted in Johnson County District Court for felonies and misdemeanors related to her misconduct. She was sentenced to probation, assessed a fine, and given suspended prison and jail sentences. She was also ordered to make restitution to the University of Iowa in the amount of $18,976.80. So were Palmer's offenses attributable to publish-or-perish pressures? Or were they more likely character flaws that would have surfaced regardless of work setting?

In Chapter 2 we portrayed the academic world as not only competitive, but also one which offers rewards of job security, money, and status for those who excel. Such rewards and the visibility they promise are particularly attractive to individuals who by personality are predisposed to seeking and enjoying such symbols of status. The Japanese scientific prodigy Haruko Obokata, who was found guilty of FF&P, was described by the panel convened to investigate her case as someone who "sorely lacks, not only a sense of research ethics, but also integrity and humility as a scientific researcher".3

It is for these reasons that narcissism is a candidate personality style among those capable of engaging in scholarly misconduct. It is characterized by such traits as arrogance, superiority, entitlement and exhibitionism. There are justifiable reasons we cite narcissism as a personality variable responsible for at least some instances of research misconduct. As we discuss elsewhere, serious forms of research misconduct represent exploitative behavior, that is, behavior that nets the offender an outcome to which he or she is not entitled. Interpersonal exploitativeness is a trait of narcissism, including its extreme form, narcissistic personality disorder. Social-personality psychologists measure interpersonal exploitativeness even in subclinical narcissism; that is, narcissism that doesn't rise to the level of pathology.

Narcissism is a thread that runs throughout this text, but we will interject here that it overlaps with status and power. The shock that academic audiences express upon hearing about powerful scholars, well-known and well-employed by prestigious institutions, is evidence of the disbelief that scholars of this caliber would and could do such things. Their status and power further serve as a defense: Scholars who are accused of scientific misconduct exclaim that the accusations are, of course, baseless given the stratum that they occupy. Their status serves as a shield and a deflector. In a circular way, their power and status protect them from colleagues' criticisms and victims' accusations: The colleagues and the victims are afraid to speak out regarding what they know of the misconduct because they are commonly less powerful than the perpetrators and are concerned about what might happen to their careers should they speak out.

A study by Mark Davis, Kelly Wester, and Bridgett King found that those who scored higher on narcissism were more likely to report engaging in research misconduct.4 The evidence was even stronger when they looked at the connection between sense of entitlement, a specific narcissistic trait, and misconduct. One interesting study is that conducted by a professor emeritus of psychiatry at Columbia University, Donald Kornfeld, who examined a number of Office of Research Integrity (ORI) reports.5 Other observers have alluded to psychological and psychiatric problems, but Kornfeld was the first practicing psychiatrist to analyze the ORI protagonists through this set of professional lenses. His conclusions were that the misconduct could be attributed both to the psychological traits of the respondents and to their circumstances. He categorized the offenders as the desperate, the perfectionist, the ethically challenged, the grandiose, the sociopath, and the nonprofessional staff, and he suggested that the motivations of each are different. This is important for strategies aimed at prevention and control, something we will discuss in Chapter 9.

Little has been done to explore individual traits and scholarly crime. Because so much emphasis to date has been placed on structural and organizational factors, individual factors such as personality have been given short shrift. In Chapter 8 we will suggest some research strategies that might yield additional insights into the role personality and other individual characteristics and situations play in scholarly misconduct.


Situational factors

In explanations of specific instances of misconduct, the perpetrator and the audience typically point to situations that precipitated the misdeed. Situational factors, in contrast to individual factors, do not inhere in the individual, but rather are short- or long-term circumstances that can affect the health and well-being of the individual. These include, but are not limited to, personal and family health problems, death of a family member or friend, financial pressures, and relationship issues. Situational factors are not unique to the world of scholarship and science; they can be found in every milieu in which individuals live and work. Most situational factors may have a closer connection to individual scholars than to the other three levels of explanation. In a study discussed earlier, Mark Davis, Michelle Riske-Morris, and Sebastian Diaz also identified a number of factors which did not inhere in the individual, organization, culture or social structure.6 These were transitory factors which could come and go in the lives of individuals, and they included financial problems, relationship issues, challenges to personal or family health, a death in the family, and other situations that can strain the ability of the individual to comply with ethical standards.

There is an early criminological study that might shed some light on how situations affect individuals. Donald Cressey, one of Edwin Sutherland's protégés, conducted a fascinating study of businessmen who had embezzled other people's money.7 Cressey identified what he termed the non-shareable problem in trust violation. One such problem he uncovered was that resulting from strained employer-employee relations. Here the employee might feel he or she has been treated unfairly in some way by the organization. The problem is non-shareable in that, should the individuals suffering the problem disclose it or suggest ways to alleviate the unfairness, they may lose status within the organization.

So imagine a scholar, perhaps a post-doctoral fellow working in a laboratory of a major university medical school. This individual has come to the U.S. from abroad. English is not his first or best language. Perhaps he has a wife who also does not speak English well. Not only is he faced with the anxieties that accompany a new job and responsibilities, he is supporting a partner whose language skills limit her ability to work outside the home, making him the sole breadwinner. Add to this mix the fact that he is by nature shy and reluctant to reach out to others to establish friendships. Then add to these challenges the structural pressures to produce publishable work, perhaps within an organization that doesn't recognize the unique challenges a foreign trainee faces. All these factors converge to increase the likelihood that this scholar, unable to share his situation with mentors or colleagues, engages in a departure from acceptable research practice. Here we can also see the overlap of cultural and situational factors. That is, an American trainee might well have a support system of family and friends on which to rely during times of stress caused by illness, financial struggles or loss. The foreign scholar, in contrast, may feel alone and isolated in an unfamiliar environment. We can see in this example – which may not be as hypothetical as it seems – that our postdoc, according to Donald Cressey's theory, has multiple non-shareable problems.


Conclusions

The inclusion of personality and mental health factors flies in the face of a purely sociological approach. Douglas Adams and Kenneth Pimple have cautioned against an overreliance on individualistic explanations of research misconduct,8 and we appreciate their concerns. However, we agree with Matt Vogel and Steven Messner that including individual variables with more traditional criminological theories strengthens our ability to explain, and ultimately to prevent and control, offending behavior,9 including scholarly crime.

Despite the reluctance of many to attribute scholarly misconduct to individuals, there is compelling evidence that the world of scholarship and science lays the blame there. Individuals whose work is characterized by misconduct are held responsible by institutions and by federal authorities. We don't prosecute or sue research universities for creating an atmosphere in which misconduct can and does occur; we level charges against the individuals associated with the research. And in the field's laudable attempt to turn around scholarly behavior, programs such as Washington University's P. I. Program direct their rehabilitative efforts toward the individual, not the host institution, although the latter may be involved in and supportive of the rehabilitative process. All this suggests that although the complex etiology of scholarly misconduct may not lie squarely on the shoulders of the individual, those shoulders must bear much of the weight of the efforts to prevent and control it.

The recent work pointing the finger at structural and organizational explanations for scholarly crime has made many shy away from focusing on the individual. It is reductionistic, even deterministic, to ascribe serious misconduct to the psyches of individual scholars. But much like the traditional criminologist who studies street crime in impoverished urban neighborhoods, we find that a more sociological approach takes us only so far. We cannot and should not ignore the false positives – the errors in prediction – those who, despite the challenges of poverty, joblessness, broken families and crime, manage to find and maintain employment, get an education, or avoid the revolving door of the criminal justice system.

Everyone can identify with situational factors such as financial pressures, relationship problems, and family illness and death. We believe situational factors are important for several reasons. One is that they are found outside academe and are therefore an understandable, perhaps unavoidable, part of life. It is not hard to imagine that people, including scholars, inadvertently take these stressors to the workplace. In some cases, it is the workplace that generates the stressors. So important are situational factors in the workplace that most large organizations, academic or not, now have Employee Assistance Programs (EAPs) whose purpose is to help employees facing mental, emotional, financial and other problems. We will revisit EAPs and their potential in Chapter 9.

Notes
1 Davis, Mark S., Riske-Morris, Michelle L. & Diaz, Sebastian (2007). "Causal factors implicated in research misconduct: Evidence from ORI case files." Science & Engineering Ethics, 13: 395–414.

2 Han, Andrew P. (2017). "Duke admits faked data 'potentially affected' grant applications." Retraction Watch, June 29. Available at: http://retractionwatch.com/2017/06/29/duke-admits-faked-data-potentially-affected-grant-applications/ (accessed 12-31-2017).
3 McNeill, David (2014). "Academic scandal shakes Japan." New York Times, July 7. Available at: www.nytimes.com/2014/07/07/world/asia/academic-scandal-shakes-japan.html (accessed 12-31-2017).
4 Davis, Mark S., Wester, Kelly L. & King, Bridgett (2008). "Narcissism, entitlement, and questionable research practices in counseling: A pilot study." Journal of Counseling & Development, 86: 200–210.
5 Kornfeld, Donald S. (2012). "Perspective: Research misconduct: The search for a remedy." Academic Medicine, 87: 877–882.
6 Davis, Riske-Morris & Diaz (2007).
7 Cressey, Donald R. (1953). Other People's Money: A Study in the Social Psychology of Embezzlement. New York: Free Press.
8 Adams, Douglas & Pimple, Kenneth D. (2005). "Research misconduct and crime: Lessons from criminal science on preventing misconduct and promoting integrity." Accountability in Research, 12: 225–240.
9 Vogel, Matt & Messner, Steven F. (2012). "Social Correlates of Delinquency for Youth in Need of Mental Health Services: Examining the Scope Conditions of Criminological Theories." Justice Quarterly, 29: 546–572.

6 Scholarly misconduct as crime

Many of those who have written about research misconduct have used the vocabulary of crime and justice to discuss the subject matter. Several analysts1 have employed various criminological theories to make sense of FF&P, which has been referred to as "fraud."2 Less egregious forms of scholarly misbehavior have been termed "misdemeanors" or "lesser crimes."3 The Federal Research Misconduct Policy produced by the Office of Science and Technology Policy mentions "aggravating" and "mitigating" circumstances surrounding instances of research misconduct. Richard Smith asserts that "due process" should be extended to respondents as it is for other types of accused,4 a right which Spece and Bernstein argue is unavailable under current Public Health Service regulations.5 In her analysis of discrepancies in the way various forms of research misconduct are punished, Lisa Keranen alludes to "sentencing disparities."6 And one ORI study examined "exonerated" respondents, which conjures up images of offenders wrongfully accused or convicted. Rebecca Dresser uses the term "rehabilitation" in her discussion on publishing the names of respondents,7 and the idea of correcting offenders is reinforced by Professor DuBois's P. I. Program for RCR violators. These references support the argument that the scientific community has already drawn parallels between serious scholarly misconduct and crime. Those convicted of conventional crimes, whose misdeeds are often more harmful to society than the faking of data or the stealing of ideas, are routinely given a second chance to demonstrate that they can lead socially acceptable, law-abiding lives. Offenders typically have the opportunity to make amends to those they have offended and to seek appropriate treatment. The question of whether scholarly offenders deserve a comparable second chance has been raised in discussions of research misconduct, but not adequately answered.8

Resistance to defining scholarly misconduct as crime

A major reason supplied by our fellow scholars as to why we are resistant to defining scholarly misconduct as crime is that we are united in protecting ourselves. In other words, because we are all members of the same discipline, we hesitate to think that some of our membership commit wrongdoing, especially wrongdoing having to do with the science itself. If we learn that a fellow scholar commits an ordinary street crime, we are far less disturbed because that type of act is irrelevant to the rest of us as scholars. This is not to say that we don't come down hard on the colleague who commits ordinary crime. We may and often do, but criminal colleagues are excluded and they are dealt with by the criminal justice system outside of academe. But acts of wrongdoing relevant to our profession reflect poorly on the profession as a whole, of which we are members.

To broaden the question of resistance beyond scientists' view of themselves as possible deviants, we might consider Joanna Masel's discussion of the widespread resistance to modern evolutionary theory,9 particularly among creationists.10 As a biologist, Masel has observed:

Listening to those who deny or are at least uncomfortable with evolution, it quickly becomes clear that most are relatively unconcerned about the evolution of microbes. Instead their objections dwell on the evolution of humans and our relationship to other animals . . . an uncomfortable message from science . . . is that we humans are much less significant . . . than we like to think.

This discomfort is not restricted to creationists. Masel notes that the bias against being scientific about humans must also extend to some degree to biologists themselves, who apply scientific principles "less objectively to humans than to other study organisms." She finds that she must consciously stay on guard to ensure that "my methods of reasoning and standards of evidence remain the same for the study of humans as they are for the study of yeast." Despite her vigilance, she strives "not to think about humans for too long," lest her own biases about our species erode her scientific standards.11

This self-protection is true not only for scientists. Many organizations strive to protect their membership from accusations of wrongdoing or incompetence. The military is afforded the privilege of investigating crimes by members of the military, and those crimes are adjudicated by a military court of law, not a civilian one. Airline pilots who are lax in their duties as pilots, are incapacitated by illness, or commit dubious acts such as sexual harassment are subject to investigation and sanctioning by their union, such as the Airline Pilots Association. Of course, they can also be tried and sanctioned by criminal courts, depending on the type of accusations, but a main purpose of the union is to keep matters internal to the profession. The same is true of medical doctors, who are subject to investigation and sanctioning by state medical boards as well as by criminal and civil proceedings. Often these internal, protective professional organizations state, with some validity, that only they can know if an act of professional misconduct has occurred because of the esoteric nature of the work they do.12

So, besides vague definitions, another explanation for resistance to even thinking about scholarly misconduct is that scholars are so intensely linked to their discipline that they do not separate themselves sufficiently to consider that one of their own could commit abhorrent scholarly behavior. If someone within the discipline is being unscholarly, we see ourselves in that offender – an uncomfortable if not unacceptable experience.

We have mentioned resistance from the academic world and from the public at large to acknowledging scholarly misconduct. Victims of scholarly misconduct are also hesitant to speak up about their victimization at the hands of miscreant scientists. Some are "sued into silence"13 or fired for speaking up,14 but more commonly they are badgered and threatened. In a recent case with which the authors are familiar, the offender, supported by departmental members, told a false version of the story to the media, which made the victim, the department chair, appear to be the offender. In another case, the victim was threatened by the offender and his department, along with the victim's own department head. We believe this is not uncommon. Moreover, the attorneys who are consulted about cases of scholarly misconduct, no matter how blatant the evidence, caution victims not to pursue the case, largely by predicting that the outcome will work against them. Startlingly, the victim can be sued by the offender. And as we discussed in the Preface, an imbalance of resources can prevent the victim from prevailing. The best defense is a strong offense, as the old adage goes. Hence, victims are cowed into doing nothing, and the definitions of scholarly misconduct go unclarified.

Some criminal aspects of scholarly misconduct

In his reexamination of the criminal law concepts mala in se and mala prohibita discussed earlier, Mark Davis suggested a role for equity in defining those offenses that are inherently wrong and those that simply are proscribed by law.15 He argued that criminal offenses consist of a combination of the extent to which they violate standards of equity and reciprocity, the intent of the offender, and the degree of harm. Extending this equity-based conception of offensive behavior to scholarly misconduct, we maintain that acts such as the theft of ideas, the plagiarism and duplication of the intellectual work of others, and the fabrication and falsification of data all constitute forms of exploitative behavior. They derive for the offender an outcome to which she or he is not entitled. The data fabricator or falsifier gets credit for a publication that is undeserved. The same is true of plagiarism and the theft of ideas. This form of exploitation is exacerbated by the violation of trust, something that indeed does set apart the scholar's world from that of many other social realms.

The differences between the two types of offenses – crime as we have come to think of it and serious scholarly misconduct – are few. One difference is that our interest in and understanding of criminology, the study of ordinary crime, dates back hundreds of years, while the "joint federal-institutional effort to discourage scientific misconduct is still in its infancy."16 This early developmental state of our understanding of scholarly misconduct and, importantly, of the strategies to sanction it faces many impediments, notably the fact that not everyone in the scientific community subscribes to the notion that scholarly misconduct is a form of crime. Since that is the case, many or most of the scientific community disagree as to the range of activities that should be labeled misconduct. Nor do we agree on the more minute details, such as how to assign seriousness to various types of misconduct.

There is no doubt that scholarly offenses cause harm, individual and social. Focusing on fraud, Warren Schmaus points out that scientific fraud is a violation of the norms of science, a form of harm that visits itself upon the individual scientist as well as the sciences and society at large.17 Indeed, we can think of "scientists as participants in a larger social contract among all members of society."18 Like others who address the topic of scientific misconduct, Schmaus points to the supposed disinterestedness of scientists as a false explanation for why scientific misconduct cannot happen. Yet our disinterestedness does not prevent detours from our search for scientific truth.

According to the prevailing public image, scientists are supposed to be disinterested seekers after truth. Any scientist who commits fraud or willfully misrepresents experimental results, to gain some unjust advantage or to cause some injury to another, is therefore perceived as acting out of pure self-interest, pursuing his or her own career at the expense of the growth of knowledge.19

Not only is Schmaus saying that scientists hope to portray the "bad apples" who commit violations as suffering from a personal pursuit of selfishness rather than operating in an environment that encourages violations, he is pointing to the fact that unethical scholars, like criminals, commit harmful acts knowingly and with purpose. In sum, the same rules apply to all of us, scientists or not. Scientists are not free to practice poor science because of their "disinterested" status.20 As to street or white-collar crime, criminologists commonly consider crime an act of selfishness whose purpose is to accrue financial, physical, or other (for example, moral superiority) gain at the expense of others. The best of us are disinterested and therefore searchers for truth, but none of us, scientists or not, are immune from baser desires and behaviors.

The harm caused by research misconduct goes well beyond the betrayal of trust. A case involving Duke University illustrates the tremendous costs a major instance of research misconduct can entail. A witness to the case filed a qui tam action, resulting in a $200 million suit against Duke University.21 Qui tam actions can result in treble damages, which in this case could total nearly two-thirds of a billion dollars. This price tag doesn't include the costs and lost revenue to the other research institutions with which Duke was collaborating.

We have seen the various consequences of research misconduct for the accused. What is often neglected in discussions on the topic is how others often suffer as a result of such an incident. Bonnie's case discussed in the Preface makes clear some of the emotional harm done by alleged research misconduct. This is impossible to measure, but it affects all the direct and indirect victims. The victimology of scholarly crime is one of the areas least understood. Indeed, it parallels attempts to understand the victimization of more traditional crime, a specialty that long lagged behind others.

Uncovering instances of research misconduct depends in part on whistleblowers, that is, individuals with firsthand knowledge of the alleged misconduct who are willing to alert the proper authorities. Quite often whistleblowers in research misconduct cases are in positions subordinate to the alleged offender, making them vulnerable. Despite advances in whistleblower protection, they remain quite vulnerable in cases of alleged research misconduct. Those who level allegations against other scholars can be sued, and sometimes these actions are successful.

In addition to these direct, predictable sanctions, however, there may be other, indirect effects from a finding of research misconduct that could stigmatize the researcher. For example, individuals involved in a misconduct case may find themselves unable to obtain funding from other institutions even after the period of debarment has ended, or they may suffer suspicion and distrust from colleagues.22 The investigation itself, similar in many respects to criminal cases, may also lead to some adverse effects for the accused. Friedman suggests that the accused may feel hurt by the skepticism that administrators or committee members may display toward them. The accused may be subject to adverse publicity, and thus garner a tainted reputation.23 Moreover, the submission of work to journals may be adversely affected in that the work may not be accepted for review given that reputation.24 Just as our criminal justice system must guard against wrongfully accusing and convicting innocent parties, so should those in charge of pursuing cases of scholarly crime.

Fuchs and Westervelt suggest that instances of research misconduct have debilitating consequences affecting many different social arenas. Such misconduct not only compromises relationships among colleagues, but it also affects the integrity of the work in question.

Fraud trials interfere with one's own research and they incur high costs from bad publicity, shared by all – guilty or not – whose prestige derives, at least in part, from the prestige of their employing organizations and laboratories where the investigations take place.25

So, the stigma associated with a finding of research misconduct can also affect colleagues and supervisors of the offender. An example of this is the case of Francis Collins, a well-regarded director of the National Center for Human Genome Research at the National Institutes of Health. After it was found that a junior scientist in his lab fabricated data in a paper that went out under Collins' name, some began to question whether he and other researchers in highly responsible positions were paying enough attention to the details of the research that they coauthored.26 Collins also commented that "[t]his is the worst nightmare a scientist has, that the truth could be undermined right under your nose.… I knew that some people might question if I could continue to play an effective role as head of the centre, but I was encouraged by people whose judgement I value not to draw that conclusion."27

As noted above, the investigation itself, similar in many respects to a criminal case, is likely to lead to adverse effects for the accused, including the sting of skepticism that administrators or committee members may display toward the suspect.28 Unfortunately, there is little empirical research on the stigma associated with research misconduct. Much of our knowledge of the negative consequences associated with such a finding is gathered from examining celebrated cases of research misconduct that have been well publicized and scrutinized by the media; what little systematic research exists has been directed toward negative consequences for whistleblowers.29 In the case of Maurizio Zanetti, an immunologist at the University of California at San Diego, the physician who made allegations against Zanetti ended up losing his position at the university after making them.30 Another example comes from the case against Kenneth Bauer, chief of hematology-oncology, and Arnaldo Arbini, a physician at the Veterans Affairs Medical Center at West Roxbury, Massachusetts. A former researcher in Bauer's lab alleged that she was fired from her job because she questioned a paper that Bauer and Arbini published.31 Because of such cases, there is now greater awareness of the need to maintain the confidentiality of whistleblowers, and heightened regulations have been adopted to protect them.

But what do we know about negative consequences for offenders? Are the consequences so severe that the researcher is unable to recover from the effects of being labeled as one who committed fraud in research? Sociologists of deviant behavior have suggested that those involved in deviant behavior may be subject to labeling by formal and informal processes of social control.32 The stigma that results from this process frequently has consequences for the person so labeled.

Research misconduct such as fabrication and plagiarism does substantial harm to the literature in which it is published. Those caught will often either voluntarily or mandatorily retract the papers tainted with the suspect results. This consists of contacting the editors of the respective journals, who in turn can issue a published statement that the papers have been retracted. But many subsequent investigators do not know about the retractions. It therefore is quite possible to conduct research that builds on the bogus findings.

The harm associated with scholarly misconduct can and does extend well beyond financial losses and damage to the literature. In 1988, Stephen Breuning, a researcher then affiliated with the University of Pittsburgh, was charged by the U.S. Attorney's Office in Baltimore, Maryland with making false statements related to fabricated research results.33 His research, which was funded by the National Institute of Mental Health (NIMH), revolved around the psychopharmacological effects of such drugs as Ritalin and Dexedrine on hyperactive, developmentally disabled children. What makes this case so egregious is that children conceivably were being treated based on Breuning's fabricated results, a terrifying prospect. Breuning pleaded guilty in federal district court and on November 10, 1988 was sentenced to 60 days in jail, given five years' probation and ordered to make restitution of more than $11,000 to the University of Pittsburgh for his salary. He was permitted to serve his jail time in a Michigan halfway house, and was also ordered to serve 250 hours of community service. The University of Pittsburgh reimbursed NIMH more than $160,000 for the cost of lab equipment and research assistants' salaries.
It is plausible that the case was prosecuted to make an example of Breuning because his fictitious research results influenced the treatment of developmentally disabled children. By conventional criminal justice standards his sentence may seem relatively light, but it was the first known criminal conviction for scientific misconduct. It also served to illustrate that at least some forms of research misconduct meet the legal definition of, and are capable of being treated as, crimes.

Aggravating and mitigating circumstances

As in the study of criminology and in the practice of criminal justice, Keranen distinguishes research misconduct by levels of seriousness.34 The same elements make up seriousness in criminal offenses and scholarly offenses: intent, damage (the consequences of misconduct), aggravating and mitigating factors, and singularity versus regularity (whether the behavior is a one-time act or a repeated pattern). When considering the seriousness of an offense, criminal justice officials typically take into account various extenuating circumstances. Aggravating circumstances are those that make an offense more serious than it would otherwise be. For example, if a suspect shoots a police officer, it generally is considered more serious than the shooting of an ordinary citizen. Consequently, the penalty the offender potentially faces is more severe. Mitigating circumstances, on the other hand, are those that tend to soften the response of the criminal justice system. If a homeowner shoots a neighbor, mistaking the person for an intruder, these circumstances lessen the seriousness of the act, even though the homeowner may still be culpable.

There is evidence that mitigating circumstances apply in some instances of research misconduct. Sources of personal strain in the scientist's life, including financial problems, marital or other relationship problems, and difficulties adjusting to a new culture, can lead to scholarly misconduct.35 These circumstances do not take offending scientists "off the hook" for their misdeeds; they remain accountable and responsible, and they must face the consequences of their actions. As in the case of the person who steals bread to feed his family in the wake of a devastating hurricane such as Harvey or Irma, we may be able to understand why the individual took that particular course of action, but at the same time we also support the enforcement of laws prohibiting such an offense. Mitigating circumstances for a case of scholarly crime could be compelling situational factors such as a poor relationship with colleagues or a serious problem at home. Regardless, we expect those administering justice in such cases to take these circumstances into account.

Repetitive scientific misconduct, part of a long-term pattern of wrongful behavior, is far less excusable than a one-time violation. Some scholarly offenders fake data or plagiarize over a period of months or years, victimizing their colleagues and cluttering the scientific literature with meaningless papers. One such case is that of Eric T. Poehlman of the University of Vermont College of Medicine.36 Poehlman, a full professor, was found to have engaged in a long-term pattern of misconduct. In addition to the fabrication and falsification of which he was accused, Poehlman destroyed evidence of his misdeeds, gave false testimony, and even solicited others to provide false documents. The seriousness of Poehlman's aggravating circumstances was matched by the Office of Research Integrity's formal response. Poehlman's was the first case of research misconduct in which the respondent was banned from receiving federal funds for life.

Ignorance of RCR as an excuse

In Chapter 2 we discussed the socialization of scholars, including efforts to educate them about the responsible conduct of research. Implicit in RCR training is the assumption that trainees are, at least to an extent, ignorant of the courses' subject matter. If we accept this assumption, then are research personnel who are ill-acquainted with the content of RCR training less culpable when they engage in misconduct? In the substantive criminal law, ignorance of a law is seldom a sufficient justification for violating that law. For example, in the United States, those who are required to file federal tax returns and fail to do so can be prosecuted for tax evasion. The citizen may or may not be aware that she is required to file a return; the assumption is that the ordinary socialization of citizens exposes them to the knowledge that every year people must file tax returns. Just as the criminal law does not excuse most citizens for their ignorance of the law, we assert that, with few exceptions, most scholars and scientists know well what constitutes fabrication, falsification, and plagiarism.

There are conceivable circumstances under which the ignorance argument may have some merit. One is misconduct by support staff members who have not been formally trained in the methods of scholarship and science; those who have not been trained in the norms of scholarship and science should not be as culpable as professors, post-docs, and other research personnel who should be thoroughly familiar with those norms. Another is the case of foreign scholars who trained in environments in which the norms of science had been distorted or perverted and who may be ignorant of accepted practices by virtue of where they grew up and trained.

Scholarly misconduct as a case of white-collar crime37

Sociologist Edwin H. Sutherland earned a permanent berth in criminological history with his pathbreaking study of white-collar crime.38 Acts by corporations that had been handled administratively were, according to Sutherland, serious and socially harmful. In his now famous 1939 address to the American Sociological Society, Sutherland asserted that all white-collar crimes in business and the professions had one factor in common: the violation of delegated or implied trust. Many of these violations, he asserted, could be categorized as (1) misrepresentation of asset values and (2) duplicity in the manipulation of power. At about the same time, Marshall Clinard used his wartime position to observe the inner workings of black-market activities by businessmen.39 There was at that time a small but growing interest in deviant behavior and crime by elites.

Sutherland's work on white-collar crime was cut short by his death in 1950. After the war, Clinard continued to write on the subject of white-collar crime. Throughout the 1950s, however, little attention was paid to these topics. Mainstream criminology of this era was, in the words of Don Gibbons, "dominated by an interest in the behavior of criminals, rather than in the criminality of behavior." Most criminologists focused on behavior which was not only normatively but also legally proscribed. Consequently, much of the scholarly work of the post-war era bore more than a faint stamp of the early sociological tradition, that is, an emphasis on the deviance of the disadvantaged.40

In 1961, a single event brought about a sudden resurgence of interest in white-collar crime. General Electric, Westinghouse, and a number of other electrical equipment manufacturers were charged with conspiring to fix the prices of their products.41 Several executives were convicted of crimes, and a few even had to serve prison sentences. This was particularly noteworthy because the white-collar defendants were regarded as among the most upstanding citizens in their respective communities. It was clear that even the defendants did not regard themselves as criminals.42 The heavy electrical equipment cases proved that upper-world crime was not a figment of overactive imaginations. No longer could criminology justify an exclusive focus on lower-world crime.

In the years that followed the electrical conspiracy cases, the times were, in the words of folksinger Bob Dylan, "a-changin'," and these changes cast suspicion on a number of formerly revered actors and institutions. The United States government and the individuals who served in its armed forces were viewed by many as committing criminal acts in Southeast Asia. The corporations that manufactured the necessary implements of war were viewed as no better. It is little surprise, then, that those in formerly respected realms became subject to so much professional and public scrutiny. Then in 1972, the foundations of American government were shaken by a scandal that was soon to become known as "Watergate." Implicated were trusted public officials, from White House aides to President Richard M. Nixon himself. The governmental and political deviance of the 1970s seemed to spawn interest in its corporate counterpart. Journalists and scholars alike were drawn to publicized cases of gross malfeasance by such industrial giants as Dow Chemical, DuPont, Ford, and Boeing. But whereas the principal cost of Watergate had been the public trust, the cost of corporate crime was found to be billions of dollars and countless human lives.

As more scholars began to turn their attention to white-collar and organizational crime and deviance, it was perhaps only a matter of time before those in the ivory tower put themselves under the criminological microscope. High-profile cases of research misconduct in the biomedical sciences, such as those of William Summerlin, Elias Alsabti, and John Darsee, coupled with the investigatory and research work of such agencies as the federal Office of Scientific Integrity and its successor, the Office of Research Integrity, suggested that scholarly attention to research misconduct was long overdue.

It should be evident that behaviors now classified as white-collar crimes were not always regarded as such. As we have noted, the anti-trust cases which Sutherland analyzed had been handled administratively or civilly and, as such, were not punishable by terms of confinement. The dispositions of the heavy electrical equipment conspiracy cases of 1961 were novel, then, in that some of the defendants were sentenced to jail terms.

Despite the different foci of the major analysts of white-collar crime, they have not shed the notion of trust or its importance. From Sutherland's early inquiries into white-collar crime up to the present, the notion of trust has surfaced again and again as a definitional and conceptual tool. Regardless of motive or structural pressure, white-collar offenders have betrayed one form of implied trust or another in the discharge of their duties. But unlike the large corporation that stands to gain enormous profits when it cuts ethical corners, the institution of science would collapse if its members engaged in large-scale misconduct.

There are other reasons for linking scholarly misconduct to white-collar crime. Scientists, like many of their counterparts in business and government, are intelligent and well-educated. And although contemporary academic scientists may not reap the financial rewards associated with Wall Street, they enjoy a considerable amount of prestige. Even the white lab coat of the scientist – a symbol of cleanliness and purity – parallels the white collar of the businessperson. Sutherland's forms of trust violation have their analogs in the world of academic science. As we have noted, trust is critical among scientific researchers, who depend on the competent and ethical behavior of their peers. The faking of data, for example, is a misrepresentation of both the scientist's actual investment and the worth of the product, which in science is the data, the grant proposal, or the manuscript submitted for publication.

Let us offer an example involving an environmentalist who was disturbed to find that his Volkswagen was not as environmentally green as the manufacturer had promised. The reader may recall that in September of 2015, it was discovered that Volkswagen had altered the software on its diesel cars to show, when tested for emissions, that the cars emitted very little pollution when in fact they were polluting the environment horribly as well as damaging human lives.43 In contemplating this crime, Conniff conjures up other recent news stories of corporate crime that were completely avoidable, ruinous in their impact, and committed out of simple selfishness and a lack of regard for honesty . . . as with scholarly misconduct. Johnson and Johnson, the drug and cosmetic company, promoted a drug for off-label uses that resulted in elderly patients suffering strokes and teenage boys growing breasts. The CEO of the drug company Turing raised the cost of a 62-year-old drug by 4,000 percent, thus putting this previously inexpensive drug out of reach for people who needed it. Takata, the maker of automobile air bags, knowingly sold airbags that could explode and kill passengers. General Motors knowingly installed ignition switches that, without warning, would turn off the engine and therefore the steering, thus killing and maiming many. The list is depressingly long.

Let us consider why the list is depressingly long. In Conniff's terms, the reason lies in the absence of governmental regulation and of the enforcement of the regulations that do exist; in other words, "the federal government's repeated failures to adequately penalize past corporate criminals."44 Rena Steinzor, a law professor and author of Why Not Jail? Industrial Catastrophes, Corporate Malfeasance, and Government Inaction, concurs with Conniff. She says that federal prosecution of corporate crime is, obviously, possible.45 Moreover, state attorneys general can also pursue criminal charges for corporate infractions. Rarely do they. So rarely that it makes news when a white-collar criminal is prosecuted and punished, as was the case with the CEO of a peanut butter manufacturer who knowingly sold tainted peanut butter that killed nine people and sickened hundreds; he received 28 years in prison.

The blame is also partly ours, according to Conniff.46 Yes, "corporate responsibility" is a contradiction in terms. But we, the public, must not "complacently accept some company's [false] promises."47 To continue to allow this form of behavior makes us all partners in crime against one another; it makes us "willing fools."48 Witnesses to scholarly misconduct encourage this form of social harm if they do nothing about it, if they do not report it and do not control it when they are aware of it. These witnesses include university departments and the higher levels of the university (provosts, deans, et al.), as we note in cases of faculty plagiarizing others' work and then getting promoted.49

Sanctioning

The goals of sanctioning for scholarly misconduct, as described by law professor Rebecca Dresser, are precisely the same ones we apply in the study of ordinary crime: retribution, restitution, deterrence, and incapacitation;50 these goals rest on theories put forward by criminological theorists such as Andrew Von Hirsch.51 Moreover, as Dresser points out, the dimensions for determining wrongdoing in academe and in street crime are the same: the seriousness of the offenders' damaging behavior, the level of personal awareness accompanying the behavior, and the offenders' personal circumstances. The fact that Dresser, in offering legal remedies for scientific misconduct, applies the same theories and concepts that criminologists use in discussing the control of street crime tells us that research misconduct resembles crime more than most academics and the public are willing to acknowledge. In fact, Dresser states that her primary reference in understanding sanctioning for scientific misconduct "is to the principles that shape decisions about punishment, for I believe that the criminal justice system and research justice systems have many common concerns and goals."52 True, the punishments for bad scientists can be different and may include forced retirement, loss of funding, mandated future supervision, revocation of tenure, and disclosure of wrongdoing to future employers. But the behavior is not dissimilar in terms of individual and community harm. And the goals of punishment are the same: prevention, exposure, shame, and "just deserts."

Keranen is more precise than other experts on the topic of sanctioning. She acknowledges that “sanctions to those found guilty of research misconduct have been under-scrutinized” but also notes that sanctions are “inconsistently and unfairly applied.”53 We need, she says, an “expanded lexicon for talking about the seriousness of research misconduct [in order to] promote fairness and consistency in sanction assignment”54 in part because we do not equate ethical/moral definitions and sanctions to legal definitions and sanctions. That is, we apply an ethical/policy-oriented analysis instead of a legal analysis. Moreover, as Keranen argues, “Although not all moral obligations are translatable into legal ones, the stakes with regard to research misconduct are so high that we have an interest in making our moral obligations as universal and enforceable as possible, and examining their relation to legal obligation affords us the opportunity to do so.”55 All writers on the topic of scholarly misconduct call for sanctions, a range of punishments for academic wrongdoing, although some do suggest, as we do, that sanctions can involve education or other more restorative measures. Schmaus goes so far as to recommend sanctions not only for the offender but also for those in the sciences who fail to detect and respond to offenses, referring to the failure to notice offenses as a form of “failed science.”56 The fact that several experts in the study of scholarly misconduct have written in some detail about the sanctions that we need to set forth as a response to this form of deviance tells us that scholarly misconduct does resemble crime.57 Some forms of scholarly misconduct, such as fraud, are clearly crime, while others, such as plagiarism, have yet to be defined as such. What is clear from reading the limited literature on scholarly misconduct is that there is a strong need for guidelines. We need education to prevent misconduct, open-minded and courageous ethics committees to determine misconduct when it is brought to their attention, an avenue for reporting suspected offenses, a repository for reported cases for historical understanding and analysis, and a means of exposing and stigmatizing repeat offenders.
The prosecution of scholarly misconduct
In considering scholarly misconduct as fraud, there is little question that behavior such as falsification, fabrication, and plagiarism under a federal research grant would qualify as fraud under prevailing statutes. Title 18 of the United States Code defines fraud in the following way:
Sec. 1001. – Statements or entries generally
(a) Except as otherwise provided in this section, whoever, in any matter within the jurisdiction of the executive, legislative, or judicial branch of the Government of the United States, knowingly and willfully –

(1) falsifies, conceals, or covers up by any trick, scheme, or device a material fact;
(2) makes any materially false, fictitious, or fraudulent statement or representation; or
(3) makes or uses any false writing or document knowing the same to contain any materially false, fictitious, or fraudulent statement or entry;
shall be fined under this title or imprisoned not more than 5 years, or both.58
The acts of those who engage in FF&P violate one or a combination of provisions (1), (2), and (3) of the above statute. It could be argued that passing off plagiarized material as one’s own also violates these provisions. So, the researcher whose work is supported by a federal grant, and who knowingly fabricates data and reports the fictitious results in a report, grant proposal or scholarly manuscript, could be charged with fraud under the above statute. It is also clear that the associated punishments bespeak a level of seriousness on a par with theft and similar property offenses. There are several known cases in which those accused of research misconduct have been prosecuted in federal or state courts. The cases below serve to illustrate and perhaps provide hints as to what constitutes a prosecutable case.
Pat J. Palmer
As noted earlier, Pat Palmer, a researcher at the University of Iowa, pled guilty to felony and misdemeanor charges related to fabrication of data and the falsification of her credentials.59 She was also charged with felony theft related to making false mileage claims, actions which were not research misconduct but which were part of a wider pattern of wrongful behavior. Palmer was supported on several federal grants at the time of her misconduct. In grant applications she claimed to hold several academic degrees and authorship of a number of published papers in scholarly journals, all of which were fictitious. Palmer was placed on three years’ supervised probation after the judge suspended an indeterminate jail sentence not to exceed ten years. Her case is somewhat unusual in that it was the State of Iowa, not the federal government, that preferred charges against her. It could be argued that Palmer’s case, while arguably not as serious as Stephen Breuning’s described above, was prosecuted because the research misconduct was accompanied by a garden-variety theft. One wonders if the case would have been prosecuted in the absence of the theft. Despite cases like these that have been successfully prosecuted, the criminal justice system has been slow to take on FF&P as criminal behavior. It has been suggested that the scientific concepts and processes that would have to be explained in a trial are too complex for lay juries to understand. It may also be that federal and state prosecutors choose to reserve their limited resources for what they regard as more serious forms of crime.

Dong-Pyou Han
Since the AIDS epidemic began in the early 1980s, teams of biomedical researchers have been struggling to identify one or more keys that will unlock the mystery of HIV. Millions of research dollars have been devoted to the pursuit of a vaccine that would prevent the spread of this devastating disease. One AIDS researcher, Dong-Pyou Han of Iowa State University, betrayed the trust underlying this enterprise in a big way. While working in the lab of Michael Cho at Case Western Reserve University, Han mixed human blood into rabbit blood samples, and the results hinted at a medical breakthrough. When confronted about his misdeeds, Han told investigators that the incident began as an accidental mixing of human and rabbit blood. He then compounded his sin by failing to admit the initial mistake, though he eventually came clean. Han’s case caught the attention of Iowa Senator Charles Grassley, who became incensed at the waste of federal dollars. Grassley’s attention to the case likely resulted in greater scrutiny and perhaps an increased chance of a punitive response. As with other researchers who have been found guilty of scientific misconduct, Han was debarred from receiving federal funds. Where this case departs from normal processing is that Han was prosecuted for making false statements and was sentenced to prison. This case raises several interesting questions. For one, Dong-Pyou Han was a Korean national; did that have any bearing on his actions? Another question relates to his punishment. How does one make restitution in the amount of $7.2 million? Was the order of restitution largely an expression of outrage meted out with little expectation of compliance?
Two unprosecuted cases of scholarly misconduct60
As examples from the world of the social sciences, we recall the case of Professor X, who, as head of the Sociology department, had to report questionable behavior on the part of one of the faculty, and that of Professor Y, who was the victim of plagiarism, reported the plagiarism to the representative professional organization, and got precisely nowhere. In Professor X’s case, the malfeasance was not so much centered on the erring professor’s research as it was on her teaching, which caused harm to her students. From the moment that Professor X made inquiries and reported the misconduct, as reported by the students, the offender has had, and still has, the benefit of a strong defense by her colleagues in the department and outside the university, partly due to her stating her case to the media and the academic community as a case of “academic freedom,” while Professor X was prohibited from speaking to the press about the case. The university lawyers saw the case in a very different light from the offending professor, who declared it an infringement on academic freedom; the lawyers determined it to be sexual harassment. During all this, all but two of the departmental faculty supported the suspect professor, defiantly saying that she had done nothing wrong, regardless of the evidence to the contrary. These

faculty had been fully aware, for years, of what had been going on in the suspect’s classroom but none came forward to report this nefarious conduct. The interesting point here is that, when Professor X did report it, the other faculty were implicated as being complicit all along, raising the question of why they did not report this sexual harassment to the department head or anyone else. Thus, they did everything possible to negate Professor X’s reporting, and their behavior toward her was (and is) impossibly ugly. Although this case went on for over a year and finally ended with the suspect professor retiring early, the departmental members, with the exception of two, continue to berate Professor X. To wit, in September of 2014, Professor X was still being faulted in the local newspaper while the suspect professor, now retired with a golden parachute, was being exonerated and defended by the departmental members. It is too lengthy and messy a case to go into in this forum, but suffice it to say that there was no question as to the wrongfulness of the behavior. The present authors have heard many horrible cases over the years but this was one of the worst. Our question here is: How can we, as moral beings, not view these behaviors as reprehensible? Why are we, more to the point, staunchly defending these behaviors? The case of Professor Y provides another obvious example of resistance to a definition of scholarly misconduct being applied (followed by reporting, investigating, and reparation). At every step, a clear case of verbatim plagiarism, mixed with managed copying, was ignored. Professor Y and two colleagues submitted a paper for review, and one of the reviewers revised the draft and submitted it for review under his own name as though it were his original work. The suspect professor’s paper was published; the real authors’ paper was not. Professor Y took his claim to the American Sociological Association (the original paper was reviewed by an ASA journal), complete with crystal clear evidence of plagiarism, and the ASA stalled the process. They often didn’t answer emails, didn’t respond to the evidence, and basically wore Professor Y down until he gave up. The ASA does have a code of ethics, and that code states that plagiarism is against standard practice. Yet, they would not pursue what seems to be an obvious case. The authors of this book have read the evidence and what happened cannot be denied. We have also read the responses from the ASA and, while it is a respected organization and one admired by the authors, we found those responses lax in terms of investigating the case and dealing with it. If we cannot turn to our professional organizations for help, the question remains (and is answered in the conclusion to this text): where can we turn? Not uncommonly, fellow scholars say that it is the characteristics of the cases that permit non-action; notably, that there is a lack of evidence to establish guilt. People reviewing the complaints can say that the evidence is too ambiguous to determine guilt even if the evidence is not ambiguous at all. Or perhaps the traits of the accused, principally their relative power within the discipline, disallow investigation and guilt determination. In Story 1 in Appendix A, it was reported that a student who claimed a professor stole her work was ignored. Moreover, this

same source also cited an example of a highly respected and powerful professor, known for his foul temper, who stole work and was never even accused.
A note on net widening
In criminal justice, net widening is the process whereby behaviors previously not prohibited by law become defined as criminal, which means that more people will be brought in under a broadened net than when the net was smaller and less inclusive. Examples of net widening include the various drug wars, the early 2000s war on terrorism, and the sexual predator laws enacted from the mid-1990s to date. Whether these expanded definitions of crime and responses to crime are correct, effective, or moral is not our point. Our point is that net widening does occur and we must be cautious about the reasons for it (for instance, political gain, religious influence, or revenue needs), since these expansions can be onerous to a society without granting it additional protection. Along with broadened definitions of offenses, there are corresponding broader and harsher sanctions. With the advent of electronic monitoring, Bonnie Berry noted that more suspects and convicted offenders were placed under additional community controls simply because this new device and new program existed, not because the device and program were effective in controlling crime.61 The same principle could operate in the definition and control of scholarly misconduct. And some in the scientific community regard the scientific and scholarly communities as already self-policing and thus in no need of strengthened definitions or heightened sanctions. Clearly, it is not the case that scientists are self-policing. However, it is well to recognize that broadened definitions and sanctions must be carefully considered, mainly because the aim is effective and fair prevention and control. Again, we rely on the historical example of Edwin Sutherland and his study of antitrust violations. It could have been argued in the 1940s that Sutherland’s definition of a new set of crimes cast the criminal justice net far more broadly than it ever had been cast before. And that viewpoint would have been correct. But net widening is not always a negative. In recent decades, new technology and new legislation have made possible new classes of offenses, offenses that would not otherwise have been recognized as such even though the harm was present and often severe. In sum, if net widening is necessary to reduce social and personal harm, it should not be dismissed out of hand simply because it broadens the definition of what is criminal and therefore results in more official processing.

Conclusions
Research misconduct constitutes a serious breach of professional trust. In most cases it results in wasted time and money. Once a finding of misconduct is made, it elicits a strong response from academic institutions and federal oversight agencies, and rightly so. Despite the agreed-upon seriousness of research misconduct, particularly fabrication, falsification, and plagiarism (FF&P), perhaps not all offending researchers

should be forever banned from science. Some scientists can be expected to show genuine remorse, to learn from their mistakes, and to move past their misdeeds to once again contribute to their respective fields of study. In some cases of scientific misconduct, respondents can, and perhaps even should, be given a second chance to prove they can undertake scientific research with integrity. Just as there are first-time, remorseful offenders who “learn their lesson” after committing crimes, so are there scientific researchers whose personal circumstances and poor judgment led to a misstep on the slippery slope to scientific misconduct. It is not only possible but probable that some are genuinely remorseful and could once again be trusted to conduct scientific research competently and with integrity. As is well known in the literature on scholarly misconduct, the definitions of misconduct are quite vague, probably intentionally so. Acts of scholarly misconduct are, in fact, no more ambiguous than crime as we usually think of it. Yes, there are ambiguities in the definition of crimes but there are also certainties. The same is true for the definition of scholarly misconduct. One of the larger questions raised in this chapter is the willingness to define scholarly misconduct, because with that willingness would come a call for reporting, investigation, and sanctioning. Research on the perceived seriousness of scientific misconduct shows that falsification, fabrication, and plagiarism are indeed perceived to be the most serious of ethical breaches. These forms might even qualify as what Blackstone referred to as offenses mala in se, which can incorporate violations of trust.62 That is, it may well be that these offenses would be deemed inherently wrong regardless of the existence of codified rules prohibiting them. Prosecutorial discretion is an important element of contemporary criminal justice decision making. The prospect of prosecuting every eligible offense is untenable. Still, we have to wonder what standards – arbitrary or otherwise – dictate whether one case is pursued when a near-identical case is ignored. This question extends to FF&P violations. Future inquiries on this topic should explore how prosecutorial discretion works in such cases. In sum, scholarly crime is not criminal in the legal sense because it is not defined as such in criminal law. It is criminal in a broader sense because it violates norms of fairness by producing outcomes for the perpetrators to which they are not entitled. Further, scholarly crime creates victims on a variety of levels, public and individual, who experience harm, sometimes egregious harm. In discussing the consequences faced by Frank Sauer, whose research misconduct spanned 16 years, commentator Paul Bracher of St. Louis University observed: “A shoplifter gets arrested; scientific fraudsters just get debarred from federal funding for a few years. It’s ridiculous. Sauer should be in prison.”63 Likewise, Senator Charles E. Grassley wrote in a letter to ORI about the Han case highlighted above that the three-year debarment by the feds “seems like a very light penalty for a doctor who purposely tampered with a research trial and directly caused millions of taxpayer dollars to be wasted on fraudulent studies.”64 It is well established that scholarly misconduct is vaguely defined; moreover, it is continually being redefined. For example, sexual relations between professors and students have only recently been banned by Harvard University,65 reminding

us that, before the ban, it was not a violation of university policy for faculty to have sexual relations with students while, after the ban, this same behavior was declared a violation. As with ordinary street crime, our definitions of offenses evolve. For example, in 2012, the U.S. federal government changed the definition of “forcible rape” to include males as possible victims and to expand the types of sexual assaults to be used in compiling national crime statistics.66 Likewise, with this text and with other recent works on scholarly violations, we may begin to view scholarly violations as not so separable from crime. This very point, that the definitions of scholarly offenses evolve and that these offenses resemble both street crime and white-collar crime, leads to a discussion of legal moralism. While most people, probably including most scholars, do not view scholarly misconduct as crime, a strong argument can be made that public wrongs, such as research offenses committed by scientists, can and should be criminalized. Duff argues that “we have good reason to criminalize some type of conduct if . . . it constitutes a public wrong. . . . [C]riminal law is a political, not a (merely) moral practice, and therefore . . . in asking what kinds of conduct we have good reason to criminalize, we must begin not with the entire realm of wrongdoing, but with the conduct falling within the public realm of our civic life.”67 As noted, resistance to viewing scholarly misconduct as crime comes from a number of groups ranging from the legal profession to individual scholars to organizations of scholars. While there are unmistakable overlaps between crime, as we usually think of it, and scholarly misconduct, the main reason that almost no one views scholarly misconduct as crime is that it is not legally defined as such. As mentioned above, this is a matter of definition; if scholarly misconduct were legally defined as crime, it would be crime. And, we argue, it should be so defined. That said, we can point out that the main elements of crime are present in much of scholarly misconduct. Take the fundamental feature of harm. Emile Durkheim, in defining crime, identifies social harm as one of the main criteria of crime: “We can . . . say that an act is criminal when it offends strong and defined states of the collective conscience. . . . In other words, we must not say that an action shocks the common conscience because it is criminal, but rather that it is criminal because it shocks the common conscience. We do not reprove it because it is a crime, but it is a crime because we reprove it.”68

Notes
1 Adams, Douglas & Pimple, Kenneth D. (2004). “Research misconduct and crime: Lessons from criminal science on preventing misconduct and promoting integrity.” Accountability in Research, 12: 225–240; Bechtel, H. Kenneth, Jr. & Pearson, Willie, Jr. (1985). “Deviant scientists and scientific deviance.” Deviant Behavior, 6: 237–252; Davis, Mark S. (2003). “The Role of Culture in Research Misconduct.” Accountability in Research, 10: 189–210.

2 Farthing, Michael J. G. (2014). “Research misconduct: A grand global challenge for the 21st Century.” Journal of Gastroenterology and Hepatology, 29: 422–427; Friedman, Paul J. (1996). “Advice to individuals involved in misconduct accusation.” Academic Medicine, 71(7): 716–723.
3 Farthing, Michael J. G. (2000). “Research misconduct: diagnosis, treatment and prevention.” British Journal of Surgery, 87: 1605–1609.
4 Smith, R. (2006). “Research misconduct: the poisoning of the well.” Journal of the Royal Society of Medicine, 99: 232–237.
5 Spece, R. G. & Bernstein, C. (2007). “What is scientific misconduct, who has to (dis)prove it, and to what level of certainty.” Medicine and Law, 26: 493–510.
6 Keranen, Lisa (2006). “Assessing the Seriousness of Research Misconduct: Considerations for Sanction Assignment.” Accountability in Research, 13: 179–205.
7 Dresser, R. (1993). “Sanctions for research misconduct: A legal perspective.” Academic Medicine, 68, S39–S43.
8 Bird, Stephanie J. (2004). “Publicizing scientific misconduct and its consequences.” Science and Engineering Ethics, 10: 435–436.
9 Masel, Joanna (2014). “What Can Evolutionary Biology Learn from Creationists?” Scientia Salon. Available online at: http://scientiasalon.wordpress.com/2014/09/16/what-can-evolutionary-biology-learn-from-creationists/ (accessed 1-2-2018).
10 For example, Manning, Jason (2014). Personal communication. September 24 (via Pure Sociology Network).
11 Masel (2014).
12 Of course, the Airlines Pilots Association, as with all labor unions, serves a range of other important functions for its members, among them to ensure work rules and fair benefits. All of these functions are, rightly, protective of the work force. Nonunion labor can be self-protective in less socially beneficial ways. In Michael Lewis’s book Flash Boys, he remarks that upon discovery of very crooked practices by stock market players, the market rallied around those who committed such behavior. As the story goes, a faction of market traders learned the trick of high-frequency trading simply by making the distance between the trader’s computer and the market as small as possible, known as co-locating. The lucky traders who had the smallest distance possible between them and the receiving end of the market (by knowing who could set up the wire and by paying for this service) could beat any other traders to the punch, placing their bets more quickly and thus trading at a speed that cut out all competing traders’ trades. In short, those who knew the trick (knew who to call to set up the short-distance wire and paid for this advantage) made enormous amounts of money to the deficit of those who didn’t know the trick. When this became known to a trader who had suffered at the hands of high-frequency trading and who possessed a scintilla of morality, he wanted to go on a public information campaign to alert US investors about this unfair practice. His argument was: the investors were prey, and they needed to protect themselves from the predators’ new weapon. The market pressured him to say nothing.
13 Loftus, Elizabeth F. (2003). “On Science Under Legal Assault.” Daedalus, 132(4): 84–86.
14 Blythe, Will (2014). “Fired? Speak No Evil.” New York Times, January 3, p. A17.
15 Davis, Mark S. (2006). “Crimes mala in se: An equity-based definition.” Criminal Justice Policy Review, 17: 270–289.
16 Dresser, Rebecca (1993).
“Sanctions for Research Misconduct: A Legal Perspective.” Academic Medicine: Journal of the Association of American Medical Colleges, 68(9): 39–43.
17 Schmaus, Warren (1983). Fraud and the norms of science. Science, Technology & Human Values, 8: 12–22.
18 Schmaus (1983): 15.
19 Schmaus (1983): 12.
20 Schmaus (1983): 21.

21 McCook, Alison (2017). “$200 M research misconduct case against Duke moving forward, as judge denies motion to dismiss.” RetractionWatch, April 28. Available online at: http://retractionwatch.com/2017/04/28/200m-research-misconduct-case-duke-moving-forward-judge-denies-motion-dismiss/ (accessed 8-8-2017).
22 Fuchs, S. & Westervelt, S. D. (1996). Fraud and trust in science. Perspectives in Biology and Medicine, 39: 248–269.
23 Friedman, Paul J. (1996). “Advice to individuals involved in misconduct accusation.” Academic Medicine, 71(7): 716–723.
24 Friedman (1996).
25 Fuchs & Westervelt (1996).
26 Marshall, Eliot (1996). “Fraud strikes top genome lab.” Science, 274: 908.
27 MacIlwain, C. (1996). “‘Ambition and impatience’ blamed for fraud.” Nature, 384: 6–7.
28 Friedman (1996).
29 Poon, P. (1995). Legal protections for the scientific misconduct whistleblower. Journal of Law, Medicine & Ethics, 23: 88–95.
30 Dalton (1997b).
31 Dalton (1997a).
32 Lemert, Edwin (1951). Social Pathology: A Systematic Approach to the Study of Sociopathic Behavior. New York: McGraw-Hill.
33 Associated Press (1988). Scientist given a 60-day term for false data. New York Times, November 11. Available online at: www.nytimes.com/1988/11/12/us/scientist-given-a-60-day-term-for-false-data.html (accessed 1-2-2018).
34 Keranen (2006).
35 Davis, M. S. & Riske, M. L. (2000). “Scientific misconduct: A theoretical and empirical inquiry.” Paper presented at the annual meetings of the American Society of Criminology, November 15, San Francisco, CA.
36 United States Attorney, District of Vermont (2005). Press release – Dr. Eric T. Poehlman. March 17. Available online at: https://ori.hhs.gov/press-release-poehlman (accessed 1-2-2018).
37 Some of this section has been taken from Davis, M. S. (1989). The perceived seriousness and incidence of ethical misconduct in academic science. Unpublished dissertation, The Ohio State University.
38 Sutherland, Edwin (1940). “White-collar criminality.” American Sociological Review, 5: 1–12; Sutherland, Edwin (1949). White Collar Crime. New York: Dryden Press.
39 Clinard (1946).
40 Gibbons, Don C. (1979). The Criminological Enterprise: Theories and Perspectives. Englewood Cliffs, NJ: Prentice-Hall.
41 Geis, Gilbert (1967). “The Heavy Electrical Equipment Antitrust Cases of 1961.” In Marshall Clinard and Richard Quinney (eds), Criminal Behavior Systems. New York: Holt, Rinehart and Winston.
42 Geis (1967).
43 Conniff, Richard (2015). “Revenge of the Jetta.” New York Times, September 27, p. 5 (Sunday Review).
44 Conniff (2015), p. 5.
45 Steinzor, Rena (2014). Why Not Jail? Industrial Catastrophes, Corporate Malfeasance, and Government Inaction. New York, NY: Cambridge University Press.
46 Conniff (2015), p. 5.
47 Conniff (2015), p. 5.
48 Conniff (2015), p. 5.
49 See, for example, Alaimo, Carol Ann (2015). “UA Professor Who Plagiarized Student Gets Tenure.” Available online at: http://tuscon.com/news/local/education/us-professorwho-plagiarized
50 Dresser (1993).

51 Von Hirsch, Andrew (1976). Doing Justice: The Choice of Punishments. New York, NY: Hill and Wang.
52 Dresser (1993): 39.
53 Keranen (2006): 180.
54 Keranen (2006): 179.
55 Keranen (2006): 181.
56 Schmaus (1983): 15.
57 See Wells, Frank (2001). “Fraud and Misconduct in Clinical Research.” Accountability in Research, 5: 25–38; Bird, Stephanie J. (2004). “Publicizing scientific misconduct and its consequences.” Science and Engineering Ethics, 10: 435–436; Keranen (2006).
58 Available online at: www.law.cornell.edu/uscode/text/18/1001 (accessed 9-15-2017).
59 No author (2004). “Researcher given felony probation; Research misconduct found.” Office of Research Integrity Newsletter, 12, 2, 3. Available online at: https://ori.hhs.gov/images/ddblock/vol12_no2.pdf (accessed 1-2-2018).
60 It is interesting that we, the present authors, feel obliged to not identify the suspects and offenders by name. Of course, we were asked to not name names because the victims are fearful of retaliation. Yet, we cannot help but wonder whether our compliance in maintaining secrecy encourages the behavior of the offenders.
61 Berry, Bonnie (1985). “Electronic Jails: A New Criminal Justice Concern.” Justice Quarterly, 2(1): 1–22; Berry, Bonnie (1986). “More Questions and More Ideas on Electronic Monitoring.” Justice Quarterly, 3(3): 363–370.
62 Davis, Mark S. (2006). “Crimes mala in se: An equity-based definition.” Criminal Justice Policy Review, 17: 270–289.
63 Urquhart, James (2017). “Investigation sheds light on biochemist’s misconduct.” Chemistry World, August 9. Available online at: www.chemistryworld.com/news/investigation-sheds-light-on-biochemists-misconduct/3007841.article (accessed 9-4-2017).
64 Oransky, Ivan (2014). “Should scientific fraud be treated as a crime?” RetractionWatch, February 11. Available online at: http://retractionwatch.com/2014/02/11/should-scientific-fraud-be-treated-as-a-crime/ (accessed 1-2-2018).
65 Southall, Ashley and Lewin, Tamar (2015). “New Harvard Policy Bans Teacher-Student Relations.” New York Times, February 6, p. A15.
66 Savage, Charlie (2012). “U.S. to Expand Its Definition of Rape in Statistics.” New York Times, January 7, pp. A10, A15.
67 Duff, R. A. (2014). “Towards a modest legal moralism.” Criminal Law and Philosophy, 8: 217–235.
68 Durkheim, Emile (1933). The Division of Labor in Society, translated by George Simpson. New York, NY: The Free Press.

7 Criminological theory and scholarly crime

One of the first efforts at a criminological explanation of scholarly misconduct was offered by sociologist of science Harriet Zuckerman.1 In an essay on deviant behavior in science, she covered several possible explanations for such behavior, including perspectives borrowed from criminology. After discussing theories of differential association, labeling, and anomie, she concluded that the latter did the best job of explaining departures from science’s moral norms. Less than a decade later, H. Kenneth Bechtel, Jr. and Willie Pearson, Jr. took their own inventory of theories and suggested that a sociological approach was preferable to focusing on individual pathology. They, too, posited that Robert K. Merton’s anomie theory, despite criticisms, offered promise as a framework for explaining deviance in science.2 In the mid-1980s Nachman Ben-Yehuda called for a “criminology of science.”3 But despite a very few inquiries into scientific fraud and related matters by criminologists, little has been done to heed his call. It is interesting that in the acknowledgements of her essay, Professor Zuckerman noted that she had consulted Marvin E. Wolfgang, one of the most prominent American criminologists at the time. That contact apparently did not spawn any criminological investigations of these phenomena. It was not until much later that criminologists showed some interest in scholarly misconduct. We would argue that with few exceptions the field of criminology remains uninterested in these forms of professional misconduct. A number of theoretical explanations can be applied to the phenomenon of scholarly misconduct, some more suitably than others. Though by no means an exhaustive list, the perspectives discussed below, often overlapping, all shed light on the phenomenon; the best-fit theory, however, seems to be a combination of rational choice and routine activities, infused with the latest general theory of conflict (moral time). Neglected below are theories that might also lend insight, but here we offer the sociological and criminological ones that best fit our analysis.

Techniques of neutralization
Following on Edwin Sutherland’s notion that criminal behavior is learned through the process of differential association, a learning theory that states that offending

is dependent upon with whom one associates, Gresham Sykes and David Matza determined that learning involves motives, drives, rationalizations, and attitudes favorable to violation of law. It’s not that offenders see their offenses as appropriate behavior but that they believe their offenses are justified.4 Most people are socialized such that they hold conventional or prosocial attitudes about offending, attitudes that would typically prevent them from lawbreaking. Using techniques of neutralization, however, they generate excuses that free them from social constraints. Through the use of these techniques of neutralization, individuals motivated to commit offenses can do so without guilt. Of the neutralization techniques, those befitting the topic of scholarly crime include denial of responsibility, denial of victim, and condemnation of the condemners. If offenders can deny intent, they can avoid moral culpability for their offenses; thus, they can deny responsibility. They may explain their actions as an “accident,” as serendipity, or in some other way sidestep personal accountability. A plagiarist may justify duplicating another’s work by asserting that two scholars studying the same topic will, of course, find overlaps across their citations, conclusions, and content coverage. As to denial of the victim, moral indignation can be avoided if the harm can be described as an injury that is not wrong in light of the circumstances: The injury is not really an injury and the victim wasn’t really victimized. Instead, the victim becomes the wrongdoer. The fact that an appropriated work is not cited in the text or the index of the plagiarist’s book, which could be an ethical violation in itself, weakens the accuser’s standing, as though the victim did not exist, which then aids in neutralizing the wrongdoing. With condemnation of the condemners, offenders remove the focus of attention from their own misconduct and place the focus on the motives and behaviors of those who disapprove, for instance, the victim of the wrongdoing. Plagiarists and publishers of plagiarized works may attack the victims of plagiarism for bringing legal cases to light. The plagiarists and the publishers of plagiarized work will say that the accusations are without merit, whether they are or not, and the wrongfulness of the plagiarizing authors is more easily lost to view. Violators who exercise techniques of neutralization maintain the righteousness of their attitudes and behaviors, which is a way of declaring their violations “acceptable” if not “right.” Moreover, overlapping with the routine activities approach discussed below, a common perspective is that intellectual property (IP) theft is a “harmless” offense. As Luckenbill and Miller state, the desire to misuse intellectual property may stem from the belief that the behavior is victimless and harmless, since these offenses “lack the clear, tangible loss associated with conventional property crimes, and this seems to make the behavior justifiable, even acceptable.”5 White-collar crime scholar Michael Benson describes the cognitive dissonance offenders experience when they are caught in a violation. They identify themselves as respectable people, as in the case of a well-known professor at a prestigious university, and yet they are accused of serious misconduct.6 Given who they are, they might say that they have not committed such an offense. When

this happens, they offer “accounts” or “justifications” explaining why their behavior is not that of an offender. Sykes and Matza’s techniques of neutralization are one way of dealing with this cognitive dissonance, along with the psychological construct of rationalization. Rationalizations, as Benson draws the distinction, aid offenders in coping with the stigma of being publicly defined as offenders. They offer explanations or excuses for their conduct that allow them to reconstruct themselves as lacking a guilty mind or intent, and thus deny that they are, in actuality, offenders. Offenders can also neutralize their own guilt by saying that “everyone does it.” For example, everyone “borrows” other scholars’ work, which may be true even if it is not ethical. White-collar offenders, a category we argue includes research violators, may suffer subjectively because of the public humiliation of being adjudged offenders. But they also, paradoxically, are able to maintain a non-offender self-concept and deny their wrongdoing. They interpret their offenses as harmless, as a “technical violation.” This reinterpretation is aided by the diffusion of guilt and by pointing out that the offending behavior is intermingled with otherwise legitimate professional practice. Such offenders commonly attempt to negate their blameworthiness by setting their offenses within the context of otherwise impeccable lives. They point to lifetimes of socially acceptable and professional successes as excuses for the occasional indiscretion. Moreover, there is the complexity of laws, copyright and otherwise, which serves to confuse matters, thereby aiding in making an offense seem explainable and excusable. Techniques of neutralization as rationalizations for committing serious scientific misconduct occur in a wide range of disciplines. A recent example of a cancer researcher illustrates this when he “acknowledged that his colleagues ‘were lax in certain regards in the preparation of papers,’ but he denied having committed a grave offense.”7 In other words, he exercised denial. He said that his published-but-falsified “studies were retracted because they used pictures from older papers, rather than from the experiments described in the studies.” He went on to say, “I think this re-use is not a scientific misconduct,”8 assuming that, by saying so, we would all support his contention that there was no wrongdoing. This researcher’s willingness to even talk about his misconduct is rare. An editor of medical journals concluded that those researchers who are found guilty of sloppy or fraudulent research typically respond by denying they did anything wrong, by admitting guilt but refusing to talk about it, or by vanishing “from the face of the earth.”9 Thus far, much of the work done on research misconduct focuses on describing the cases and theorizing the possible causes for committing acts of fabrication, falsification, and plagiarism. Many studies speculate that researchers act unethically due to ignorance of ethical standards, or as a result of acting in haste and making honest mistakes.10 Other observers find that those guilty of research misconduct report reasons such as pressure to publish, mental and physical exhaustion, and unbearable experimental and clinical loads which numb judgment.11 In effect, these accused researchers are presenting rationalizations or neutralizations for their behavior.

Criminologist Michael Hindelang argued that neutralization is necessary only if there are moral bonds to begin with.12 His notion of a more generalized release from moral constraints may work better with delinquents than it does with scientists. No one thus far has argued that there is a widespread moral complacency among scientific researchers. We maintain that in the absence of data to the contrary, scientists in general do abide by ethical standards and, according to neutralization theory, extenuating circumstances do indeed play a role in episodic deviation from scientific norms. With respect to research ethics, most researchers maintain conventional values and attitudes on the conduct and reporting of research. However, in light of the multiple pressures and stresses faced by researchers in competition for position and financial benefit, they may develop techniques to neutralize any internalized feelings such as guilt. It is then possible that individuals who commit research misconduct may employ one or more of these techniques in order to rationalize their conduct. Mark Davis, Michelle Riske-Morris and Sebastian Diaz analyzed data from ORI’s case files.13 Among other findings, the data showed that a number of respondents did in fact employ “neutralization techniques,” that is, rationalizations which permit offenders to overcome misgivings about engaging in misconduct. These rationalizations included jumping the gun on research findings and avoiding degradation from others. Techniques of neutralization are employed by individuals and organizations of individuals. Rationalizations may permeate an organization, but it is human agency that permits some to use such techniques to free themselves of moral constraints while others choose to abide by social norms. To what extent are such rationalizations part of personality? Do organizations create an environment in which these are more likely to be employed?

General Strain Theory
One of the more popular and enduring theoretical perspectives in criminology is Robert Agnew’s General Strain Theory.14 Building on the earlier work of Robert K. Merton, Agnew sought to identify the specific variables that constrain and motivate delinquents to commit violations. Agnew’s strain theory may also be applied to instances of research misconduct. His proposition that offenses are more likely when the constraints against them are low and the motivations for them are high suggests, specifically, that we are constrained from committing violations by external control; that is, the fear of being caught and punished. We are also constrained if we have a lot to lose if we are punished; thus, we have a stake in conformity. Or we may be internally constrained, internally controlled, if we believe that crime or research misconduct is wrong. Agnew reminds us that the motivations to commit violations refer to the enticements and pressures to do so. As we have discussed already, the rewards for committing scholarly misconduct can be enormous. In the world of scholarship and

science, careers are enhanced through publishing, developing breakthrough findings, and attaining external funding, whether these rewards are legitimately obtained or not. A second proposition advanced by Agnew is that we are constrained from committing additional offenses depending on the reaction to our prior offenses. If sanctioning bodies, such as university disciplinary boards, fail to respond to offenses in a punitive manner or respond in a manner that rejects the offense but forgives the offender, we will have an absence of constraints and, correspondingly, encouragement for future misconduct. As we mentioned in Chapter 3, Brian Martinson and his colleagues incorporated Agnew’s General Strain Theory in their study of the organizational sources of research misconduct. Using the literature on organizational justice, they posit that perceived unfairness in the workplace constitutes a strain on employees, in this case scientific researchers. This strain makes it more likely that researchers will engage in misconduct. The results supported their hypothesis: Those who perceive organizational injustice are more likely to engage in scholarly misconduct. In sum, General Strain Theory has become one of the more popular perspectives in criminology. It now has several decades of research that provides support for its main principles. The one application of this theory to research misconduct and related behaviors suggests that one form of strain, procedural injustice, does increase the likelihood of misconduct. One of the problems we see with this theory’s fit for research misconduct is how strain is conceptualized. When it is given a specific definition such as procedural injustice, it lends itself to measurement. However, it is conceivable that anyone who becomes involved in research misconduct can point to some specific aspect of their situation or environment, assert that that element constituted strain, and then use it as a convenient excuse. Virtually any strain, defined as broadly as possible, could serve as a version of “the devil made me do it.” It could be argued that we all must cope with a wide array of strains, including those we discussed as situational factors in Chapter 5. The presence of strain still does not explain why some of us use socially acceptable ways to cope with those strains, whereas others cave to the pressures of life and engage in maladaptive behavior. Moreover, there are a number of protagonists in the annals of scholarly misconduct who were not under any discernible form of strain. Tenured full professors, those who have security of position and income, cannot assert that the strains associated with organized scholarship drove them to offend. Sir Cyril Burt, for example, was well established as an eminent educational psychologist. He had proven long before his offense his ability to survive in the academic world.

Routine activities, rational choice, and Opportunity Theory
Douglas Adams and Kenneth Pimple examined the literature for explanations of criminal and deviant behavior and discarded the usual explanations such as individual responsibility, individual psychological states (for example, variations in

beliefs and values), irrationality (faulty rational calculations), and disadvantaged social environments (poverty, membership in an alternative subculture, and the like).15 Instead, they point out that most illegal and unethical offenses are resistant to attempts to modify rational calculative decision processes, which makes sense given that offenses are often the result of rational calculation. The focus on routine activities, rational choice, and opportunity models places greater emphasis on situational factors that enhance or limit the opportunity for unethical behavior.16 A situational factor that may increase the likelihood of scholarly misconduct would be a fantastic journal article, already written and under review, being plagiarized by the journal reviewer. That has happened and does happen. The opportunity model of Routine Activity Theory was proposed by Lawrence Cohen and Marcus Felson17 and later enhanced, extended, and further developed by Felson.18 As to opportunity, routine activities theory points to simple assumptions about offender motivation and decision processes, namely that there are three fundamental “almost-always” elements that define criminal acts: (a) a likely offender, (b) a suitable target, and (c) the absence of a capable guardian.19 A likely offender could be someone who could benefit from the misconduct monetarily or with an easy publication, a suitable target could be an already completed book, and the absence of a capable guardian could refer to the absence of controls in place such as whistleblowing colleagues, copyright laws, or university disciplinary boards that could challenge the misconduct. These offenses are facilitated by ready access to the Internet (by online sources of information and free access to research), mainly through exploitative technologies. These are the avenues or devices which enable motivated offenders to misuse intellectual property in ways that violate the rights of owners. One of the key elements of Opportunity Theory is the assumption that even if a motivated offender exists, misconduct will not occur without the opportunity to engage in it. To reiterate, there are two important dimensions of opportunity, according to Cohen and Felson’s routine activity theory: An attractive target, such as a book worth copying, and the absence of capable guardians, such as those willing and courageous enough to witness the misconduct, to call attention to it, and to enforce sanctions against the offender. Capable guardianship is critical; without it, misconduct is likely to occur. Specific to intellectual property (IP) offenses, Luckenbill and Miller find that the desire to misuse intellectual objects appears to be pervasive.20 This may be due to the ideology that informs intellectual property law. When “law endorses the economic principle that creators have a right to remuneration for their labor, it also endorses the principle that people have the right to as much information as they deem necessary. The idea is that if people are to lead meaningful lives, they must be able to draw upon and use the work of others. This principle encourages people to regard intellectual objects as public goods and to use them in almost any way they see fit.”21 As scholars, we can all agree that our work builds upon the work of other scholars but to replicate an entire book and attach the copier’s name to it is not

building upon the foundation created by another; it is squatting and then staking an illegitimate claim to the other’s property. It must be pointed out, however, that sometimes IP theft is obvious, sometimes less so, and sometimes it is not obvious at all; for example, plagiarism, as word-for-word copying, is obvious while managed copying and the theft of conference papers and ideas are not so obvious or detectable. Marcus Felson remarks that we can control criminal offenses, and the present authors would guess scholarly misconduct as well, if we control the situations that motivated offenders encounter.22 Those who are predisposed to offending are more likely to engage in offenses in situations where the costs of offending are low and the benefits are high, where the chances of getting caught and punished are slim and attractive targets are present. Since that is the case, we need to reduce the benefits of scientific misconduct. It is foolhardy, Felson thinks, to hope to improve the human character.23 A situational approach to prevention means that we increase the range of costs, including informal sanctions and “moral costs,” while decreasing the benefits of offenses such as those accrued from scientific theft. If we could, for instance, increase the sense of shame that follows from committing research misconduct, we would raise the moral costs of misconduct. It is a simple matter of understanding that committing offenses is a rational decision, drawing on classical theory and economic theories of crime: the decision to commit offenses, be they ordinary crimes or scientific misconduct, rests on a cost-benefit analysis.24 We must recognize the importance of incentives and disincentives, notably rewards and punishments. Routine activities theory has a lot to offer the analysis of scholarly crime’s etiology.

Moral boundaries
One interesting sociological approach to research misconduct is that offered by Nachman Ben-Yehuda.25 He concluded that behavior such as scientific deviance is relative as the moral boundaries of society shift. That is, what may be deviant in one era may not be in another. But when these cases emerge, the scientific community has the opportunity to assert that which is acceptable within the discipline. One has only to look at the literature on research misconduct over the past thirty years to verify that such definitions and redefinitions have occurred. We particularly like that Ben-Yehuda added to his analysis an explanation of how these structural mechanisms influence the behavior of individuals. Incorporating marijuana use for his illustration and crediting sociologist C. Wright Mills, he asserts that “motivational accounting systems” permit marijuana users to benefit from the social acceptability and support that accompany that role. Extending this to scholarly crime, it may be that the socialization of scholars in competitive, visible, and reward-rich universities permits them to justify their behavior, somewhat akin to the techniques of neutralization mentioned earlier. If this is correct, then it suggests a level of acceptability of scholarly crime within these environments. Inasmuch as we know that many such incidents

occur not just with impunity, but with organizational indifference and even defensiveness, the conclusions do not bode well for the world of scholarship. Ben-Yehuda’s theory of research misconduct is compelling. However, as with other more structural theories of human behavior, we think it gives too little attention to the role of human agency, that is, the ability of individuals to choose courses of action of their own volition.

Moral time
The theory of moral time is an important global theory that explains all conflict, and is part of a more general theoretical strategy, pure sociology, which is applicable to all social behavior.26 Pure sociology is the paradigm explaining human behavior, particularly conflict, with conflict meaning “a clash of right and wrong: a matter of morality. . . . Conflict occurs whenever anyone provokes or expresses a grievance” such as scholarly misconduct.27 As we will discuss in the last chapter, the resolution of conflict is also well explained by pure sociology/moral time via avoidance, negotiation, restitution, gossip, apology, banishment and exile, and other means. Pure sociology explains behavior in terms of its social geometry, in other words, behavior’s location and direction in a multidimensional social space.28 The theory of moral time explains conflicted behavior and thus, notably, deviant behavior. It explains why conflict occurs and why some conflict is deemed more serious than other conflict, which is useful to the study of scholarly misconduct since most of the public and a large part of the scholarly community do not think of scholarly offenses as rampant or serious. This lack of recognition is a source of conflict by itself since the victims of scholarly misconduct, like the victims of rape, are ignored at best and punished at worst for bringing it to light. We know that ordinary crime, street crime, is considered to be more serious than elite or white-collar crime. Most people in the general public or in social control agencies view scholarly misconduct, if they think of it at all, as insignificant in terms of social harm. Perhaps for this reason, adequate reporting and control resources largely do not exist. To oversimplify greatly, moral time describes conflict as arising from overintimacy or underintimacy (when people are too intimate or not intimate enough, the relational time dimension), overstratification and understratification (when status differentials cause unease, the vertical time dimension), and overdiversity and underdiversity (when there is an abundance of diversity or not enough diversity, which makes some people uncomfortable, the cultural time dimension). The diversity dimension, as part of cultural time, is also applicable to scholarly misconduct in that overdiversity can confuse the rules of appropriate scientific conduct: Scientists trained in cultures other than those of the U.S. and Europe adhere to differing codes of conduct. Whether this is an excuse for deviating from the rules of good scientific conduct is a matter of debate. The juncture at which moral time best applies to scholarly misconduct is the vertical dimension, with its focus on stratification. As we have already seen, power and status have a great deal to do with the commission of and, importantly,

the obscurity of scholarly misconduct. Overstratification refers to an increase in inequality, with two subdimensions: oversuperiority (rising above others) and overinferiority (falling below others). Understratification, a decrease in inequality, also has two subdimensions: undersuperiority (the decline of a superior status) and underinferiority (the rise of an inferior, which threatens the superiority of others). Recalling the discussion about victims of scholarly misconduct fearing reprisals from superior-status scholars, an unequal status reinforces the status quo of scholarly misconduct as “normal” or “okay.” Recall also that the superior status itself shields the perpetrator from charges of misconduct. Now, if the whistle is blown and the misconduct is made evident, we may have an instance of undersuperiority in which the once-revered scholar is exposed and her or his status is reduced. All of these circumstances create conflict. Someone, the victim or the perpetrator, is made uncomfortable. Drawing upon moral time theory, a new study of how university professors react to the discovery of their work being compromised shows that such behavior is not uncommon and that the most common response, upon discovery, is to do nothing.29 In this study, the authors report findings from interviews with 70 university faculty members at two large institutions, the faculty representing all ranks and a range of disciplines including social sciences, humanities, physical sciences, and math/statistics/engineering/computer science. The authors found an array of reactions, formal and informal, among faculty who learned that their work had been stolen, but mainly the response was restraint: Academics rarely confront offenders or do so only mildly. They may punish offenders by undermining their ideas or their careers, which may be a hidden process in which gossip plays a role. But a common reaction is exit or avoidance, in which the victims do and say nothing to the offenders or to their colleagues. By “exit,” Cooney and Phillips mean exiting the situation (not usually by leaving one’s job, although that can happen); by “avoidance,” they mean not directly confronting the offender or engaging in any formal or even informal attempt at notifying the discipline or seeking legal redress. Let us here pose the question of whether academics are more reticent than others to engage in conflict, to report or do anything about their work being compromised. After all, we have a reputation as meek individuals. Or, and this is a very important question, are scientists reluctant to confront research misconduct because we feel that nothing can or will be done about it? This latter question returns us to one of Cooney and Phillips’s more significant findings: Misconduct victims are far less likely to contest being wronged when the offenders are powerful. This makes sense, since people on the whole are more reluctant to confront those with power and status. But should this general pattern hold? One might argue that, while those with power and status have the wherewithal to crush their opponents (in this case, the victims of scholarly misconduct), they also have the most to lose if their misconduct becomes known. Cooney and Phillips, in their interviews with university faculty about their experiences with scholarly misconduct, frame their interpretation as a test of Black’s principles.
Besides the major finding that most victims show restraint and do little or nothing in response to being victimized in this fashion, one of the

Criminological theory and scholarly crime  103 most significant findings from this study demonstrates the common theme running throughout the present text that powerful offenders are the least likely to be confronted. Here we find empirical support for Black’s principles, particularly the one referring to stratification. Of course, the process of being victimized and responding to victimization is fraught with conflict and, in a perfect world, victims would hope to reverse their victimization and return to a more equal status with the offender, if it could happen through reporting the deviance and punishing the deviant. Instead, unequal statuses being what they are, with the contingent dangers of going up against a powerful offender, victims will avoid conflict. This common pattern is evident in our study. From the stories that Bonnie gathered (see Appendix A) and from our many conversations with alleged victims over the years, it is obvious and unsurprising that power plays an enormous role, to the extent that victims are intimidated and will keep silent rather than confront their offenders. It doesn’t need to be this way. In Chapter  9 we will suggest a number of strategies to prevent and control scholarly misconduct, but underlying these strategies is a simple truth: Powerful offenders, because of their power and status, are seldom discovered or punished. However, we suggest that powerful offenders, because of their power and status, have the most to lose. Hence, if their offenses were made known, the victims and the sciences as a whole could regain an equal footing in Black’s terms. This process would require a number of changes which we will detail, such as clearer definitions of misconduct, an improved reporting process, and a justice process that involves education and restoration. The theory of moral time explains the rules about how people are expected to behave and veers from a common interpretation that deviant behavior is that which is so labeled30 or behavior that violates a standardized rule. It is at this juncture that moral time has particular relevance to scholarly misconduct. Everyone – the public, legal systems, and scientific researchers themselves – expects scholars to behave in a moral manner. We are expected to be above petty jealousies, above theft, and to adhere to the scientific codes of conduct. Our purpose should be to advance our respective disciplines. As with misbehaving priests, it comes as a shock when scholars are discovered to operate outside the realm of good conduct. Yet it happens, probably a great deal more than is known.

Conclusions

We set out to try to understand what causes scholarly misconduct. As criminologists, it was natural that we would apply some of the perspectives from our field. Our conclusion is that every perspective has merit, particularly if the analyst is open to the notion that various etiological factors can and do work in concert to bring about phenomena such as scholarly violations. We also found the selected theories wanting in important ways. There exists a fracture between sociological criminology and the criminology of the individual, with few people seemingly interested in reconciling the differences. We feel that prevailing perspectives have gotten too far away from norms, particularly those

that operate not only in criminal subcultures, but in mainstream society including academic circles. What criminology needs is a new perspective or a major overhaul of one or more existing ones that better integrates not just criminological theories, but notions about acceptable behavior and unacceptable behavior. Here we harken back to the criminal law concept mala in se. As problematic as definitions have been in the past, mala in se conveys the notion that there are behaviors regarded as wrong independent of any legal proscriptions. Criminology should take this concept and try to explain not just criminal behavior as legally defined, but wrongful behavior that may or may not be against the law, yet results in harm and is regarded as serious, whether measured by seriousness studies or simply in the minds of ordinary people.

Notes 1 Zuckerman, Harriet (1977). “Deviant Behavior and Social Control in Science.” In Edward Sagarin (ed.), Deviance and Social Change, pp. 87–138. Beverly Hills, CA: Sage Publications. 2 Bechtel, H. Kenneth, Jr. & Pearson, Willie, Jr. (1985). “Deviant scientists and scientific deviance.” Deviant Behavior 6: 237–252. 3 Ben-Yehuda, Nachman (1986). “Deviance in science: Towards a criminology of science.” British Journal of Criminology, 26: 1–27. 4 Sykes, G. & Matza, D. (1957). “Techniques of neutralization: A theory of delinquency.” American Sociological Review, 22: 664–670. 5 Luckenbill, D. F. & Miller, K. (2011). “Intellectual Property Crime.” In Clifton D. Bryant (ed,), The Routledge Handbook of Deviant Behavior, pp. 434–440. New York, NY: Routledge. 6 Benson, Michael L. & Moore, E. “Are white-collar and common offenders the same? An empirical and theoretical critique of a recently proposed general theory of crime.” Journal of Research in Crime & Delinquency, 29: 251–272. 7 Zimmer (2012b). 8 Zimmer (2012b). 9 Zimmer (2012b). 10 Woolf, Patricia (1981). “Fraud in science: How much, how serious?” The Hasting Center Report 11(5): 9–14; Hansen, B. C. & Hansen, K. D. (1995). “Academic and scientific misconduct: Issues for nursing educators.” Journal of Professional Nursing, 11: 31–39. 11 Lynch, A. (1994). “Ethics in dental research. Publication of research: The ethical dimension.” Journal of Dental Research, 73: 1778–1782.; Smith, M. M. (1992). “Chiropractic research: The ethics.” Journal of Manipulative and Physiological Therapeutics, 15: 536–541; LaFollette, M. C. (2000). “The evolution of the ‘scientific misconduct’ issue: An historical overview.” Proceedings of the Society for Experimental Biology and Medicine, 224: 211–215. 12 Hindelang, M. (1974). “Moral evaluations of illegal behavior.” Social Problems, 21: 370–385. 13 Davis, Mark S., Riske-Morris, Michelle L. & Diaz, Sebastian R. (2007). “Causal factors implicated in research misconduct: Evidence from ORI case files.” Science and Engineering Ethics, 13: 395–414. 14 Agnew, R. (2001). “Building on the foundation of General Strain Theory: Specifying the types of strain most likely to lead to delinquency.” Journal of Research in Crime and Delinquency, 38: 319–361.

Criminological theory and scholarly crime  105 15 Adams & Pimple (2005). 16 Adams, D. (2000). “The opportunity structure of deviance.” In Encyclopedia of Criminology and Deviant Behavior. Volume One: Historical, Conceptual and Theoretical Issues. New York: Taylor & Francis, cited in Adams & Pimple (2005). 17 Cohen, L. E. & Felson, M. (1979). “Social Change and Crime Rate Trends: A Routine Activity Approach.” American Sociological Review, 44: 588–608. 18 Felson, Marcus (2002). Crime and Everyday Life, 3rd edition. Thousand Oaks, CA: Sage Publications. 19 Felson (2002), p. 21 20 Luckenbill & Miller (2011). 21 Luckenbill & Miller (2011), p. 437. 22 Felson (2002). 23 We are not in agreement with Felson. Indeed, the recent criminological literature reveals that indicated interventions such as cognitive behavior therapy have proven effective in changing maladaptive thinking patterns of individual offenders. And with regard to scholarly misconduct, efforts such as those being undertaken at Washington University in St.  Louis suggest that specialists in research misconduct have not rejected rehabilitation and reintegration as promising strategies. 24 Cornish, Derek B. and Clarke, Ronald V. (eds) (1986). The Reasoning Criminal: Rational Choice Perspectives on Offending. New York, NY: Springer-Verlag. 25 Ben-Yehuda, Nachman (1985). Deviance and Moral Boundaries. Chicago: University of Chicago Press. 26 Black, Donald (2011). Moral Time. New York: Oxford University Press. 27 Black, Donald (1998). The Social Structure of Right and Wrong. San Diego: Academic Press. 28 Cooney, Mark (2009). Is Killing Wrong? A Study in Pure Sociology. Charlottesville, VA: University of Virginia Press; Berry, B. (2013). “Occasional essays inspired by provocative reading: A review of Moral Time by Donald Black.” The Criminologist, 38(1): 36–38. 29 Cooney, Mark and Phillips, Scott (2017). “When Will Academics Contest Intellectual Conflict?” SOCIUS, 10: 1–15. 30 Becker, Howard S. (1963). Outsiders: Studies in the Sociology of Deviance. New York, NY: Free Press.

8 Implications for theory and research

Theoretical implications

Below we summarize some of our thinking about a topic that has been and remains a difficult one to wrap one’s brain around. It encompasses many disciplinary fields, there is still so little known about it, and it is quite complicated. As to the last point, in a larger sense, what we have written about in this book is human behavior. Most of us do our best, probably, and do not harm others. But there are those of us who do commit grave harm to individuals and societies, in criminally-defined and -undefined forms, and we seek an understanding of how we may reduce these harms.

1 Scholarly crimes cause substantial harm to individuals, to organizations, and to society

Let’s start with the financial aspect of scholarly crime, though it’s not necessarily the most important form of harm. It could be argued that the misdeeds of those individuals deemed guilty of scientific misconduct have wasted hundreds of millions of dollars in federal funds. And this counts only those found guilty by ORI. Add to those the cases whose funding came from the National Science Foundation and other federal funding agencies. Victims of scholarly crime feel violated in ways similar to victims of conventional crimes. When someone steals your property – be it a chair from your porch or a credit card from your wallet – your sense of trust and innocence is lost, never to be fully restored. Similarly, when someone misappropriates your idea or pawns bogus data off on you, you feel that your trust in other scholars will never be what it was. When Bonnie learned that another scholar had largely duplicated her book, she felt violated. Something of hers had been misappropriated, and it was something that could not be returned. All these harms should be important not only to scientists and scholars, but to lay women and men who care about fairness in everyday transactions. These offenses not only violate fundamental norms that prevail in social exchanges; they are serious enough to warrant definition as crimes. For the same reasons that arguments were made decades ago to define white-collar and corporate offenses as criminal, so should we now include scholarly crimes in our array of unacceptable behaviors.

2 Trust, while important to scholarly and scientific work, will not prevent scholarly crime

The world of scholarship has relied on trust, the belief of each scholar that other scholars will behave competently, honestly, and in the interests of the discipline. We know this is only an ideal, but it is one to which all scholars should aspire. Otherwise, the lines of distinction between scholarly nonfiction and works of fiction start to blur, and we begin to question the worth of the work and the integrity of the person who produced it. Was this Pulitzer Prize-winning book on economics written by the author, or was it ghosted by graduate students who felt obligated to do his work? Are the arguments and conclusions presented in the book based on verifiable data, or were they contrived to support the author’s new theory? The record of both crimes and misdemeanors of scholarship and science suggests it is naïve to think that trust will sustain integrity. There have been too many instances wherein scholars serving as journal referees or grant reviewers have misappropriated work under review. Likewise, there have been too many cases where scientists have modified or duplicated images in order to make their work publishable. There simply have been far too many instances in which trust in other scholars was misplaced. A professor with whom the authors are acquainted had his paper idea stolen during peer review. The thief later published his own paper on the topic using a similar methodology. When the victim brought his case to the attention of his university and that of the offender, neither was willing to take any purposive action. Based on our knowledge of the case, there is no reason for this professor to trust other scholars, just as there is no reason for him to trust the system to pursue justice for victims. That said, this does not mean that scholars shouldn’t behave in as trustworthy a manner as possible. The recognition that there are numerous actual and potential offenders shouldn’t turn into a cynicism so severe that no one in the community of scholars can ever trust anyone else. Instead, mutual trust should be an ideal that mentors communicate to students and trainees.

3 Some of the most prominent and powerful actors are among the worst offenders

One finding from our study suggests that, as with research misconduct, the theft of ideas may be more prevalent in some of the most prestigious universities and committed by some of the most respected scholars. What is difficult to explain is offending by those who, by virtue of their educational attainment and scholarly work, have already proven that they are capable of achieving at extremely high levels, yet go on to misappropriate the intellectual work of others. The holder of an endowed professorship at an elite university has ostensibly demonstrated to colleagues and the world that she is capable of meeting the demands of academe. She has graduated from a top-notch program that admits a select few. She has published work in some of the most prestigious journals in her field. Yet she engages in misconduct. One explanation is that, no matter how much one achieves, it is

never enough; some scholars want more recognition, more accolades, more media attention, and more respect. Unfortunately, the power that permits motivated offenders to plagiarize or steal the ideas of others may also slow movement toward appropriate definitions and social control. Some offending scholars are prominent in scholarly societies, and thus can throw up roadblocks to those who would promote formalized ethical standards and their enforcement.

4 Acquiescence emboldens motivated offenders

Some of the offenders we have discussed surely would have had their scholarly criminal careers cut short if someone had blown the whistle on their behavior earlier. Reluctance to report such offenses may stem from the same reluctance that keeps witnesses from reporting ordinary crimes such as rape, robbery, and assault. Witnesses may be afraid of the perpetrators harming them in some way, or they may not want to be bothered with a time-consuming process of reporting and following up on the report. That said, it would be helpful if victims and witnesses of scholarly crime stepped up and confronted their offenders. Otherwise, scholarly offenses continue and worsen. In the Afterword, we note that whereas most social and behavioral science associations have long had codes of ethics, the American Society of Criminology was long a notable exception. It is not that the ASC has not wrestled with the issue, but that its members repeatedly voted not to adopt such a code. We are not naïve enough to think that the mere existence of a code of ethics magically transforms the association’s members into people of integrity. But the existence of an ethics code conveys a message to members that fair, honest and professional behavior is expected. This form of organizational acquiescence or indifference may be worse than that of individuals in that it helps perpetuate the problem of scholarly misconduct.

5 What constitutes a scholarly crime should be grounded in the norms of fairness and reciprocity

While Robert K. Merton and other sociologists of science1 have made convincing arguments about the existence and importance of specific norms that govern scholarly and scientific work, a close examination of the theft of ideas and other more serious forms of scholarly misconduct suggests that such behaviors violate not only scientific and academic norms, but also more general norms that govern day-to-day behavior. Warren Schmaus made such a suggestion when he asserted that most forms of research misconduct violate the more general moral obligation to do one’s duty.2 Sociologist of science Harriet Zuckerman took Schmaus to task for suggesting that a single norm was at issue in instances of research misconduct.3 She countered that moving away from Mertonian norms of science toward something as general as “doing one’s duty” represented a theoretical step backward.

While there is convincing evidence that moral norms exist and operate in the world of science and scholarship, we would argue that more general norms also operate. When scholars go to work, they don’t hold in abeyance the general norms they’ve internalized. And the scholarly world, as unique as it is – perhaps guided in part by specific norms and role requirements – is not insulated from the rules by which larger society operates. And so we think Schmaus may have been onto something in trying to explain research misconduct by a more general norm or set of norms. It is precisely the kind of conclusion Robert Agnew offered when he called on criminologists to examine the factors that weaken social concern.4 We are prepared to argue that the more serious forms of research misconduct violate norms with which anyone can identify. Whether a person steals another person’s watch or steals another person’s idea, in each case the offender is a thief who has taken something to which he is not entitled. From the victim’s perspective, the loss of a watch and the loss of an original idea are each still a loss. At least in the case of the watch, it can be returned and the victim made largely whole. Not so with the theft of an idea, which, once published under the thief’s name, can no longer be returned to the originator.

Other theoretical considerations

We maintain that the causes of scholarly misconduct are known, and that they can be reduced to the five factors we’ve discussed, which can and do overlap with one another, increasing the probability that an instance of scholarly offending will occur. Other criminological theories, such as we have described here, play a necessary role in framing our view of scholarly misconduct and, significantly, aid in our recommendations for alleviating the confusion about this form of misconduct. Despite the respective strengths of various criminological theories, we find them wanting when applied to scholarly crime. Because offenses of this kind also occur outside the world of scholarship, we reject the notion that they violate only science’s norms. The wrongness of taking something that belongs to another is more fundamental to human nature. One need not be a scholar or scientist to feel disapprobation at such behavior. What is it about faking data or stealing the idea of another that offends basic standards of propriety? And what are the implications of these types of behaviors for other forms of crime? Criminologists and other scholars who study legal and social science matters might want to more closely examine the connections between behaviors defined as illegal and those that characterize everyday life. What is it about criminal behavior that violates not only law, but also more general norms? How is it that those norms, then, become embodied in our criminal law? The offenders who steal ideas from others are not poor youths from socially disadvantaged neighborhoods. They are not convicted felons who have learned improved crime techniques from fellow prison inmates. For the most part they are intelligent, well-educated professionals from advantaged backgrounds. Life has not only put them in a strong starting position; many have a lead on their peers.

We refer the reader to Chapter 7, in which we present several criminological perspectives in our attempt to explain scholarly misconduct. None should be discounted, although some are more amenable than others to explaining this under-researched form of social harm. Among the more relevant theories are rational choice (since scholarly misconduct is often a choice and even a rational one, assuming that the scholar is lacking a conscience), opportunity (since opportunity to commit scholarly misconduct is useful if not essential to committing this offense, the same as it is for ordinary crime), and techniques of neutralization (since we have seen evidence that scholarly offenders rationalize their behaviors to make them seem not harmful). We do not want to overlook new theoretical approaches such as Donald Black’s conflict-centered moral time explanation.5 There are a lot of factors – cultural, economic, and others – that go into explaining any kind of human behavior, and it is unhelpful to exclude any theoretical explanation that works, as, hopefully, we have made clear in this book. The study of scholarly crime underscores the point English jurist Sir William Blackstone made long ago that what defines criminal behavior transcends what may or may not be proscribed by criminal law.6 The simple reworking of someone else’s hard-earned intellectual labors is morally if not legally unacceptable. However, scholars in criminal law and criminology need to rethink how we discuss what is criminal and what is not. And that has been one of the main purposes of this book. FF&P violate fundamental notions of fairness. It is not fair to make up data and pass it off as legitimate because, in doing so, the offender derives an outcome to which he or she is not entitled. Bogus data should not net a published journal article, a submitted grant application, or a published book. We expect others to do the requisite work in order to reap the appropriate rewards.

Research implications

To date, numerous surveys have been conducted to assess various aspects of scholarly misconduct. Some have examined the perceived incidence and prevalence of misconduct among various segments of the scientific community. Others have assessed the perceived seriousness of various forms of research misconduct and other detrimental research practices. Yet there is room for more: despite this accumulation of information, there remains a dearth of survey knowledge on the topic of scholarly misconduct. For example, we have noted the paucity of research on the role of culture in scholarly crime. It is a somewhat sensitive subject and not part of mainstream RCR research. Seriousness scaling, which has been used to assess cultural conflict, should be employed in future studies of scholarly crime.7 One group of people who could provide considerable insight into scholarly misconduct is Research Integrity Officers (RIOs). RIOs are the university and research institute employees who are specifically tasked with investigating whether research malfeasance has occurred. Asking them how much misconduct they uncover, and of what types, would be very helpful to understanding this phenomenon. Moreover, they have knowledge about the rationalizations

Implications for theory and research  111 offered, the histories of the offenders, the damage done by misconduct, and how misconduct is hidden. Longitudinal and multilevel studies We have argued that the failure to adhere to RCR is best represented by overlapping rings, each representing one of the five etiological factors including structure, culture, organization, situation, and individual. So as much potential as organizational justice has for helping to explain research misconduct, it should not be used to the exclusion of other causal variables. The importance of justice in the workplace notwithstanding, research misconduct is multidimensional and, like workplace misconduct, comprised of other factors such as personal traits and human agency.8 Research has shown that individual factors influencing perceptions of organizational justice include personality,9 independent self-construal,10 and negative affectivity.11 With respect to research misconduct and related practices, research has shown that narcissism and particularly a sense of entitlement are significantly related to RCR departures.12 We therefore recommend that future research address the interactions among these different factors. For example, under which conditions of fairness or unfairness are narcissists more likely to depart from RCR standards? How does tenure status affect the role of organizational injustice? The work of Brian Martinson and colleagues suggests that those early in their careers are more susceptible to injustices.13 How do such factors operate in relation to cultural background? Can we assume that everyone, regardless of ethnicity and national origin, approaches issues of justice in the same way? All these questions argue for prospective longitudinal and multilevel studies that consider the simultaneous effects of all these variables and their interaction effects. Organizational justice is predicated on the assumption that injustices – whether distributive, procedural, or interactional  – occur primarily within the immediate organizational environment. This has been the focus of most of the psychological studies of organizational justice. There are, however, potential sources of professional injustice that lie outside the immediate environment. For example, Patricia Keith-Spiegel and Gerald Koocher implied that Institutional Review Boards (IRBs) might create the perception of having perpetrated procedural or interactional injustice.14 And if we are to look outside the institution, researchers may find that panel members behave unfairly during the process of reviewing grant proposals. Is it possible that such potential sources outside the organization contribute to an individual’s perceptions of injustice? And does the individual’s organization suffer even in cases where it was not responsible for the perceived injustices? Future research should attempt to identify these other sources of perceived injustice with the purpose of improving real and perceived fairness, and thereby preventing unfairness-related misconduct. The studies examining the effects of the organization on RCR have employed different measures. The arguments for intellectual and academic freedom notwithstanding, it may be a better strategy to employ a valid and reliable and consistent set of measures to order to facilitate comparison of results and ensure knowledge

accretion. Active RCR researchers should discuss these issues with a view toward reaching consensus on consistent measures. The recent focus on the role of the organization, in general, and organizational justice, in particular, tends to emphasize the effects of organizational climates and practices on potential offenders. Many, if not most, of the cases of research misconduct and other serious RCR departures would not have come to light if it were not for whistleblowers. They risk position and career to make allegations, but why? Do perceived justice concerns motivate them to do the right thing? Are they bothered by individuals who are deriving benefits such as publications or grants without having made the appropriate inputs? Research on the role that the organization plays regarding RCR need not focus solely on maladaptive behaviors such as fabrication, falsification, and plagiarism, or on other detrimental research practices. The industrial/organizational (I/O) psychology literature from which the work on organizational justice derives not only identifies counterproductive work behaviors (CWBs), but also organizational citizenship behaviors (OCBs). OCBs have been defined as “those organizationally beneficial behaviors and gestures that can neither be enforced on the basis of formal role obligations nor elicited by contractual guarantees or recompense.”15 Despite the methodological sophistication of the more recent, well-funded studies of RCR departures, they have major limitations. Even well-designed cross-sectional studies cannot answer questions about what happens over the course of a scientific career. For example, are those who receive RCR training early in their careers less likely to depart from RCR standards later in the pre-tenure period? Do early perceptions of organizational injustice have only short-term effects, or do the effects endure throughout an individual’s research career? The ability to answer these and other etiological questions requires longitudinal data such as those used by Ybema and van den Bos in their study of how organizational justice influences depressive symptoms and sickness absence.16 The evolution of RCR research calls for a comprehensive longitudinal study to provide more definitive answers to looming questions. It is now time to move the methodology of scholarly crime research forward and employ strategies that will yield different and more useful data. Longitudinal studies have become a mainstay of criminological research. Projects such as the National Longitudinal Study of Adolescent Health (Add Health) have enabled researchers to explore the many correlates of delinquency, crime, and criminal justice system involvement throughout the life course. The Project on Human Development in Chicago Neighborhoods (PHDCN), a well-funded longitudinal research project, gathered in-depth data on crime- and health-related variables using a compressed longitudinal design. These and other longitudinal studies have enabled criminologists to carefully examine the developmental life course of those at risk of becoming involved in crime and other maladaptive behavior. Likewise, the study of scholarly careers, including involvement in crimes and questionable research practices, would benefit from a comprehensive longitudinal study. Imagine following a cohort of scholars-in-training from the time they enter

Implications for theory and research  113 graduate or professional school – or even before – to when they become mature full professors with established national or even international reputations. Such an enquiry would yield immensely useful information including the very basic question of why some scholars commit misconduct and some do not. Ideally, a longitudinal approach to exploring scholarly crime would involve an elaborate data collection scheme including individual, situational, organizational, and institutional information. Individual data would consist not only of sociodemographic variables, but also measures of personality, situational challenges, country and culture of origin, experiences as mentee and mentor, characteristics of employing organization, scholarly productivity, and honors and awards. Most of those who become involved in scholarly crime do so relatively early in their careers. When a 27-year-old post-doctoral fellow is caught plagiarizing a federal grant application, is there not some reason to believe this is not the first instance of academic misconduct? Was she a plagiarist as a college student? There is evidence to suggest that those who engage in professional misconduct have an early history of ethical missteps.17 Did he put bogus papers on his CV when applying for residencies? Because opportunities to deviate from both scholarly and general norms present themselves early, those constructing longitudinal surveys should consider starting the cohort in graduate or professional school or even earlier. If we agree with Donald Kornfeld that these standards of right and wrong are learned early, our point of data collection should begin earlier as well. Another advantage of a longitudinal approach is the potential to connect various levels of explanation. We know, for example, that narcissism and narcissistic traits have been connected to self-reported departures from scholarly standards. But not all narcissistic scholars steal ideas or engage in other forms of misconduct, and not all scholarly miscreants are narcissists. However, there may be a likelihood that narcissistic scholars are more prone to engage in scholarly misconduct since they feel that they are superior and above reproach. Related questions might be the manner in which narcissism interacts with organizational injustice to set the stage for a scholarly criminal act. It is also worth investigating whether narcissists are more likely to engage in misconduct regardless of situational or organizational factors. These and other questions can be explored with a multilevel approach. Social networking studies Criminologists have become more interested in social networking studies over the past couple of decades. Those working in the area of white-collar and corporate crime, for example, recognize that modern organizations consist of complex networks of individuals,18 and that social network analysis could offer new insights.19 We think this methodology lends itself to the study of scholarly misconduct. Whereas the study of white-collar and organizational deviance suggests that complex networks facilitate such illegal behavior, there is little reason to believe that scholarly misconduct relies on such networks. We have evidence that cultural norms are transmitted via social network.20 Is it possible, however, that likeminded scholars – those with the reputation of being predisposed toward scholarly

offending – are members of the same network? Social network analysis could assess the extent to which there exist invisible criminal colleges, that is, networks of unethical scholars that not only tacitly approve of members’ respective misdeeds but also resist attempts at formal social control of such conduct. These are fascinating questions that could be answered with the appropriate networking data. Another interesting application of social network analysis would be to test our contention that informal social control can be as effective as, or more effective than, formal control in reducing scholarly offenses. A test of this approach would be to have one member of the network report an instance of misconduct. After a certain period of time has elapsed, sufficient for the gossip to be transmitted, either survey or interview methods could assess the extent to which study participants knew and disapproved of the alleged misconduct. This would confirm the existence of networks that could employ prosocial gossip to help control scholarly crimes and misdemeanors.

Experimental studies

Much of the research on research misconduct and related behaviors has consisted of surveys. We have alluded to both the strengths and weaknesses of this approach, one of the weaknesses being the inability to statistically control all the important variables of interest. One way to ensure statistical control is through the use of experiments. When all variables of theoretical interest are under the control of the experimenter, she has greater confidence that the effect is the result of the imputed cause. This is why experimental research is considered the so-called gold standard of research approaches. We find it both curious and understandable that investigators exploring the causes of research misconduct have not employed laboratory experiments, particularly given that most researchers know the various strengths of an experimental approach. Further, because much of the research in which misconduct takes place involves a laboratory setting, experiments seem like an obvious, if neglected, strategy. We have cast scholarly criminals as free riders, a term for individuals who in economic games benefit at the expense of others. Games such as “tragedy of the commons” provide an opportunity for the participants to engage in selfish behavior – free riding – or some other alternative. We can envision studies in which experimental subjects are confronted with ethical dilemmas related to scholarship under various conditions. For example, for those who score higher on measures of narcissism, under what circumstances does their tendency to exploit others manifest itself?

Qualitative research on scholarly careers

Some social scientific questions cannot and should not be answered with cross-sectional survey or other forms of quantitative data. If an investigator wants to know the intricacies of multiple relationships in the workplace, this cannot easily be tapped with a survey instrument, if at all. Despite the strengths of sophisticated

surveys and interviews, they have limitations that argue for a closer, in-depth examination of the phenomenon of interest. The social sciences have seen researchers embedded in just about every societal milieu in which crime is alleged to occur. In the 1970s, Jack Douglas studied the social dynamics of nude beaches. Sociologist Elijah Anderson spent time hanging out with black men in inner-city Philadelphia in order to understand what he termed the “code of the street.”21 Christopher Dum of Kent State University spent a year living with homeless people in a residential motel in an effort to understand the challenges faced by this population.22 As a graduate student at UCLA, University of Chicago sociologist Forrest Stuart lived among street people in Los Angeles, producing a fascinating ethnography detailed in his book, Down, Out, and Under Arrest.23 In an era in which quantitative analysis of social phenomena seems to reign, there are those social scientists who defy convention and provide analyses that are up close and personal. Given such precedents, why couldn’t a social scientist embed herself or himself in a research organization for an extended period of time, observing the organization, the culture, and the personalities associated with the research enterprise? The person would be privy to many of the decisions made regarding how data were to be collected, recorded, stored, transformed and analyzed. Another qualitative approach would be to conduct in-depth interviews with a sample of convicted offenders, scholars who have been found guilty of research misconduct. Such a study was attempted under ORI support in the early 2000s, but the Office of Science and Technology Policy effectively killed it. Such an effort, perhaps with scholars who have had time to reflect on their experiences, could be quite revealing. We think that a historical look at scholarly crime could yield insights that other approaches could not. William Broad and Nicholas Wade covered some of the history of research misconduct, but their analysis focused on some of the more highly publicized and controversial cases. In the annals of science and scholarship there undoubtedly lie hidden other developments that could shed light on how scholarship has evolved and what this evolution means for contemporary scholars. Similar to the analysis of the history of white-collar crime by John Locker and Barry Godfrey, historians might be able to qualitatively trace important but neglected trends.24 What do current cases and those of a hundred years ago have in common? How are they different?
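Before turning to our conclusions, a minimal sketch may help make the social network approach described above a bit more concrete. Everything in it is hypothetical: the ties, the scholar labels, and the choice of Python with the networkx library are ours for illustration only, not data or methods from any actual study. The idea is simply to show how one might look for densely knit clusters (candidate “invisible colleges”) and gauge how far word of an allegation could plausibly travel.

```python
# Illustrative sketch only: ties, names, and measures are hypothetical.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical ties among scholars (e.g., co-authorship or regular contact)
edges = [
    ("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
    ("D", "E"), ("E", "F"), ("D", "F"), ("F", "G"),
]
G = nx.Graph(edges)

# Densely knit clusters would be candidates for "invisible colleges."
clustering = nx.average_clustering(G)
communities = [sorted(c) for c in greedy_modularity_communities(G)]

# If scholar "C" reports (or gossips about) an allegation, shortest path
# lengths give a crude proxy for how many steps word must travel to reach
# every other member of the network.
reach = nx.single_source_shortest_path_length(G, "C")

print(f"Average clustering: {clustering:.2f}")
print(f"Detected communities: {communities}")
print(f"Steps from C to each scholar: {reach}")
```

In an actual study, the edge list would come from co-authorship records, acknowledgments, or survey-reported ties, and the question of whether gossip reached and influenced members would be answered with follow-up surveys or interviews rather than path lengths alone.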

Conclusions In September of 2015, it emerged that General Motors, one of the United States’ largest automobile manufacturers, had admitted to knowingly selling cars with faulty ignition switches which were responsible for at least 124 deaths. David Uhlmann, a law professor, wrote an editorial stating that admission of guilt, a fine, and a promise to behave were not sufficient; criminal prosecution was in order.25 As part of a deferred prosecution agreement resolving the charges, G.M. admitted that it knew as early as 2004 that many of its vehicles contained defective ignition switches…. Yet G.M. did not recall the 2.6 million affected

116  Implications for theory and research vehicles until nearly two years later and instead hid the fatal safety defects to increase sales. To make amends, the company agreed to forfeit $900 million to the government and to retain an outside monitor who would oversee its safety programs to help prevent future violations.26 The problem, as Professor Uhlmann sees it, is that wrongdoers and potential wrongdoers will not be deterred from committing social harm if they are not brought up on criminal charges. If the defendants do not plead guilty or go to trial, they do not learn from mistakes. Charging and convicting corporations addresses the flawed corporate cultures and misplaced priorities that encourage criminal behavior; holding individuals accountable is the best way to deter future wrongdoing and promote law-abiding behavior. Instead, in case after case over the last decade, the Justice Department has treated the worst corporate criminals like first-time drug offenders, agreeing to dismiss or not bring charges if the companies clean up their acts.… [We need] to stop sending the mixed message that corporations can avoid criminal liability by admitting they were wrong and promising not to do it again.27 What does this have to do with scholarly misconduct? As we mentioned in Chapter 6, scholarly misconduct overlaps with white-collar crime in several important ways. The offenders are respected members of the community, many enjoy high social status, and their offenses are difficult to prosecute partly because of vague definitions and partly because the public does not want to see these offenders as bad people. Since that is the case, we ignore their offenses or, if their offenses become known and undeniable, we, the public and the social control systems, are pleased with an admission of guilt and a promise to not repeat the offenses. If we can agree that scholarly misconduct resembles white-collar crime, we can agree that social controls must be meaningful in both cases in order to prevent and reduce repetition of these harmful acts. The analysis presented in this book makes a convincing case that many of the research-related behaviors in which scholars engage should be defined as crimes and treated accordingly. These acts include not only instances of outright fraud such as the fabrication and falsification of data, but also the theft and purposeful duplication of intellectual property. Regardless of their native ability, education, and social status, scholarly offenders need to be held accountable for exploiting others, as well as for retaliating against victims of confirmed misconduct. Wrongful behavior, that which violates major social norms and causes harm to individuals and society, does not derive its seriousness solely from codification. The data and literature on seriousness that we have presented bears this out. Subjects responding to vignettes of offenses reacted to a description of a wide range of scholarly behaviors, irrespective of whether the acts are embodied in the law. It is clear that forms of scholarly misconduct are deemed more serious than some ordinary felonies.

In actual cases, victims feel trauma not because the offender violated Section 4521 of the state’s criminal code, but because of an emotional and psychological realization that, as a result of the offensive act, they are now less than whole. Their ability to trust others has been compromised. The analogy of kidnapped brainchildren is apropos; one feels the loss of something so precious that restoration is unlikely or impossible. This emotional trauma experienced by victims is exacerbated by institutional and cultural indifference to scholarly offenses. Academia does little either to punish those who commit scholarly crimes or to render aid to those who have been victimized. Indeed, we have argued that in a very real sense the world of organized scholarship has become both a witting and willing enabler of scholarly crime. Universities and publishers look the other way unless forced to act by federal regulations and incontrovertible evidence. In the absence of such incentives, they are far more likely to ignore the offense or discourage the pursuit of justice. In sum, there is much to learn. We are in the early stages of discovery about this odd, complicated, very troubling, and socially harmful behavior. Those of us who study this behavior hope that our younger and more energetic colleagues will take up these questions. The more we examine this form of harm and the more questions we answer about it, the greater our chances of reducing it and making the world a better place.

Notes 1 See Cournand, A. & Zuckerman, H. (1975). “The code of science.” In Paul A. Weiss (ed.) Knowledge in Search of Understanding: The Frensham Papers. Mt. Kisco, NY: Futura Publishing Company. 2 Schmaus, Warren (1983). “Fraud and the norms of science.” Science, Technology & Human Values, 8: 12–22. 3 Zuckerman, Harriet (1984). “Norms and deviant behavior in science.” Science, Technology & Human Values, 9: 7–13. 4 Agnew, Robert (2014). “2013 Presidential address to the American Society of Criminology: Social concern and crime: Moving beyond the assumption of simply self-interest.” Criminology, 52: 1–32. 5 Black, Donald (2011). Moral time. New York, NY: Oxford University Press. 6 Blackstone, William (1941). Commentaries on the laws of England (W. H. Brown & B. C. Gavit, Eds.). Washington, DC: Washington Law Book Co. 7 Einat, T. & Herzog, S. (2011). “A new perspective for delinquency: Culture conflict measured by seriousness perceptions.” International Journal of Offender Therapy and Comparative Criminology, 55: 1072–1095. 8 Kidder, D. L. (2005). “Is it ‘who I am’, ‘what I can get away with’, or ‘what you’ve done to me’? A multi-theory examination of employee misconduct.” Journal of Business Ethics, 57: 389–398. 9 Bernerth, J. B., Field, H. S., Giles, W. F. & Cole, M. S. (2006). “Perceived fairness in employee selection: The role of applicant personality.” Journal of Business and Psychology, 20: 545–563. 10 Brockner, J., De Cremer, D., van den Bos, K. & Chen, Y.-R. (2005). “The influence of interdependent self-construal on procedural fairness effects.” Organizational Behavior and Human Decision Processes, 96: 155–167.

118  Implications for theory and research 11 Aquino, K., Lewis, M. U. & Bradfield, M. (1999). “Justice constructs, negative affectivity, and employee deviance: A  proposed model and empirical test.” Journal of Organizational Behavior, 20: 1073–1091. 12 Davis, Mark S., Wester, Kelly L. & King, Bridgett (2008). “Narcissism, entitlement, and questionable research practices in counseling: A  pilot study.” Journal of Counseling & Development, 86: 200–210. 13 Martinson, B. C., Anderson, M. S., Crain, A. L. & de Vries, R. (2006). “Scientists’ perceptions of organizational justice and self-reported behaviors.” Journal of Empirical Research on Human Research Ethics, 1: 51–66. 14 Keith-Spiegel, Patricia  & Koocher, Gerald P. (2005). “The IRB paradox: Could the protectors also encourage deceit?” Ethics and Behavior, 15: 339–349. 15 Chiaburu, D. S. & Lim, A. S. (2008). “Manager trustworthiness or interactional justice? Predicting organizational citizenship behaviors.” Journal of Business Ethics, 83: 453–467. 16 Ybema, J. F. & van den Bos, K. (2010). “Effects of organizational justice on depressive symptoms and sickness absence: A longitudinal perspective.” Social Science & Medicine, 70: 1609–1617. 17 Yates, J & James, D. (2010). “Risk factors at medical school for subsequent professional misconduct: a multi-centre retrospective case-control study.” British Medical Journal, 340: c. 2040. 18 Lippens, R. (2001). “Rethinking organizational crime and organizational criminology.” Crime, Law & Social Change, 35: 319–331. 19 Simpson, Sally (2011). “Making sense of white-collar crime: Theory and research.” Ohio State Journal of Criminal Law, 8: 481–502. 20 Bichler, Gisela, Schoepfer, Andrea  & Bush, Stacy (2015). “White collars and black ties: Interlocking social circles of elite corporate offenders.” Journal of Contemporary Criminal Justice, 31: 279–276. 21 Anderson, Elijah (1999). Code of the Street: Decency, Violence and the Moral Life of the Inner City. New York: W. W. Norton. 22 Dum, C. (2016). Exiled in America: Life on the Margins in a Residential Motel. New York: Columbia University Press. 23 Stuart, F. (2016). Down, Out, and Under Arrest: Policing and Everyday Life in Skid Row. Chicago: University of Chicago Press. 24 See Locker, John P. & Godfrey, Barry (2006). “Ontological boundaries and temporal watersheds in the development of white-collar crime.” British Journal of Criminology, 46: 976–992. 25 Uhlmann, D. M. (2015). “Justice falls short in G.M. case.” New York Times, September 20, p. 5. 26 Ibid. 27 Ibid.

9 Preventing and controlling scholarly crime

We have made a case for treating the more serious forms of scholarly misconduct as crimes. Whether they are treated as such remains to be seen. Regardless, we think this new posture necessitates a fresh approach to prevention and control. In this chapter we lay out what that approach might look like. Research institutions and scholarly associations have proven themselves ineffective, perhaps out of unwillingness, at controlling scholarly crime. Not only do they inadvertently provide incentives for scholars to deviate from accepted standards, they too often fail to intervene and thus reinforce the criminogenic influences under which scholars work; indeed, we have seen the same pattern in other scandals that threaten reputation and funding. Their inaction sends an unspoken message to the community of scholars that the status quo is not to be disrupted by allegations of wrongdoing. This in turn nurtures resentment, skepticism and indifference. Most scholarly bodies in the social, behavioral, and life sciences have well-developed codes of ethics by which members are expected to abide. Such ethical codes admonish members not to exploit students and other scholars, and to acknowledge the contributions of those who have assisted their research. But where association members have sought relief from the inappropriate actions of others, these associations backpedal. Moreover, associations can only influence members, so would-be offenders need only avoid membership in order to sidestep any of their regulatory processes. This makes them a less-than-effective agent of social control.

and attitudes, social norms weaken over time, freeing actual and would-be offenders from the moral disapprobation of society. Do we want a society in which disinhibition leads to bolder attempts to breach social norms, particularly norms we consider fundamental to the fabric of society? One of the unaddressed harms is to the norms of fairness and reciprocity. Each instance of serious exploitation, through behaviors such as plagiarism and fabrication, chips away at that all-important element of modern social relations: Trust. Exploitation unaddressed engenders cynicism and contempt, resulting in a slow dissolving of the social glue. It therefore is incumbent upon all of us, by word and by example, to ensure that fairness in social relations, including those in the world of scholarship and science, is not the exception, but the rule.

The formal control of scholarly crime

We have been largely critical of the way scholarly crime has been handled to date. The truth is that there are countless professionals who are trying to advance the responsible conduct of research in the best ways possible. Within the current structure of organized scholarship, a number of policies and procedures can be considered that should reduce, if not eliminate, the incidence of future scholarly crime.

Such recommendations have been made before without much apparent movement in this direction.

The implementation of separate teaching and research tracks may also encourage individuals to do what they do best. At well-known criminology and criminal justice programs, such as those at the University of California at Irvine and the University of Missouri at St. Louis, there are both tenure-track research faculty and non-tenure-track teaching faculty. Those who want to do research can pursue such positions, while faculty members who prefer to focus on teaching, and who do not want the pressure of publication, can pursue that route. Separating these functions is one way to begin restructuring scholarly work so as to alleviate the pressure that indirectly leads to scholarly crime.

Addressing organizational issues

Somewhere in between individual proclivities and structural pressures to produce lie organizations, the more immediate environment in which intellectuals work. Organizations can create an atmosphere characterized by support, cooperation and fairness, or one in which actors viciously compete with one another, cutting various ethical corners in order to rise to the top. The American Society of Criminology, until 2016, lacked a code of ethics, an organizational omission that could inadvertently send the wrong message to the membership.

A number of commentators on RCR have discussed the merit of data audits.3 Data audits not only permit an organization to control the quality of data, they also send a signal to would-be offenders that someone will be checking their work. Under an audit regime it would still be possible, though highly difficult, to invent bogus data out of whole cloth, because doing so requires convincing supporting documentation. Take, for example, a social science survey. Absent any audit, such data are easily fabricated; one need only type fake values into a data entry template. One possible, though more expensive, option would be to employ outside auditors, who would be less likely to be co-opted by local organizational norms. Indeed, they may have a professional obligation to report suspected misconduct irrespective of their relationship with the alleged perpetrator or the host organization. This has worked with corporate offending, even though there is some concern among auditors about being perceived as whistleblowers or informants.4 Just as federal agencies contract with so-called "beltway bandits" – contractors that render services to federal agencies in and around Washington, DC – to oversee the peer review of grant proposals, so could they engage outside contractors to perform periodic audits of grantees' raw and transformed data.
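To make the idea of a data audit more concrete, the following is a minimal sketch, in Python, of one automated screen an auditor might run against a survey file. The file name, column layout, and the flagging rule are our own illustrative assumptions rather than part of any established audit protocol, and duplicated answer patterns are at best a weak signal that warrants follow-up against primary documentation, not proof of fabrication.

    # Illustrative sketch only: "survey_responses.csv" and the rule of
    # flagging any answer pattern shared by two or more respondents are
    # assumptions made for this example, not an audit standard.
    import csv
    from collections import Counter

    def duplicated_patterns(rows, id_field="respondent_id"):
        # Count how many respondents share each exact pattern of answers.
        # Identical answer strings across supposedly independent respondents
        # can suggest values typed into a template rather than collected.
        counts = Counter()
        for row in rows:
            answers = tuple(v for k, v in sorted(row.items()) if k != id_field)
            counts[answers] += 1
        return {pattern: n for pattern, n in counts.items() if n > 1}

    with open("survey_responses.csv", newline="") as handle:
        records = list(csv.DictReader(handle))

    flagged = duplicated_patterns(records)
    print(f"{len(flagged)} answer patterns are shared by more than one respondent")

A real audit would go further, comparing the electronic file against completed questionnaires, consent logs, and timestamps; it is precisely this demand for supporting documentation that makes wholesale invention difficult.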

Research on procedural and interactional justice has shown that notions of justice are important to those working in organizations. Fairness norms are so pervasive in society that it is not surprising that they play a substantial role in the amount of counterproductive work behavior (CWB) that occurs within organizations. Just as law enforcement agencies would benefit from implementing organizational changes to reduce the procedural injustice perceived by citizens who interact with officers, research organizations would be wise to take measures to treat all staff members – from senior faculty to lab assistants – with fairness, respect and dignity. Practically, this means giving staff a voice in what happens in the organization. Voice, as the term implies, does not mean that staff members have veto power, only that their opinions and concerns will be heard and considered.

In short, there are a number of steps organizations can take to prevent and control scholarly misconduct. One is to clearly communicate realistic expectations for employee performance. Another is to treat employees with procedural and interactional fairness. Organizations can also implement policies and procedures related to the documentation of research and the preservation of data, thereby discouraging secrecy. Organizations such as universities – as well as their subdivisions such as colleges, schools, and departments – should take a strategic approach to promoting the responsible conduct of research. This approach needs to go well beyond RCR training. It should include formal goals and associated objectives for creating a procedurally fair work environment. Including fairness, defined broadly, in an organization's mission, vision and strategic plan makes it a foundational commitment, an organizational promise that must be kept.

The complex etiology of research misconduct should also make us more realistic about its prevention and control. How much change can realistically be brought about by improving staff members' perceptions of justice in the workplace is unknown. Nevertheless, we recommend that research institutions examine their practices with the aim of promoting organizational justice. The findings from research in business and industry seem unequivocal: Perceived injustice likely contributes to maladaptive behaviors in the workplace. These behaviors can include employee theft, excessive absenteeism, malingering, and even violence. There are sufficient reasons to believe that faculty and staff members in research organizations respond to justice norms, both negatively and positively.

We think this line of inquiry has important implications for RCR education and training. Current curricula tend to address the responsibilities that individual researchers have in the traditional nine RCR areas, and it is unlikely that existing RCR courses have incorporated the findings on organizational injustice. There may not be many studies that have explored organizational justice in the RCR context, but there is a vast literature in industrial/organizational psychology that includes recommendations on how to promote organizational justice.5 Despite the disagreement on the goals of RCR training6 and the doubt cast on its effectiveness,7 we see no compelling justification for not developing new curricula, or revising existing curricula, to address issues related to organizational justice. This is particularly true for RCR training aimed at administrators and managers, those whose roles give them influence over justice concerns. Despite our reservations about the effectiveness of RCR education and training in preventing FF&P, we agree with Schoenherr that integrating research integrity within research methods courses is an early opportunity to expose future scholars to RCR.8

Another tack, just as important as education and training, is to employ the research findings on organizational justice to change organizational policies and procedures in ways that promote organizational justice. Despite the modest body of studies supporting the role of injustice in violations of RCR, the research community could implement changes to promote employee feelings of equity and justice. After more than 30 years of contributing to the literature on organizational justice, Greenberg asserted that "sufficiently well-established principles of organizational justice exist . . . to justify advancing to practice."9 There appears to be little justification not to use the extensive research findings on organizational justice to begin revisiting the policies and procedures within research organizations. If we truly believe that the greatest promise for change is at the organizational level, then the community of scholars should support a rehabilitative program for organizations. Change is unlikely to occur if we don't attempt to reform the large organizations in which cases of scholarly crime take place.

Addressing cultural issues

There is growing evidence that culture plays a role in at least some instances of scholarly misconduct. In some cases, it may be a function of the progress of science in the culture of those involved. In other cases, it may be the stresses created by individuals of one culture trying to perform in another where language and customs are unfamiliar. Regardless, culture becomes an etiological factor we must consider for the prevention and control of scholarly misconduct.

We also suggest that culture be incorporated into efforts aimed at preventing scholarly crime. Just as considering culture in the primary prevention of public health problems makes sense,10 so it is reasonable to include culture in future preventive efforts to forestall scholarly offenses. For example, since we know that foreign applicants are more likely to plagiarize in their residency applications,11 we could use software such as Turnitin to verify that the submitted text was original.

Culture need not be an immutable etiological factor. We agree with Miriam Erez and Efrat Gati that individuals can bring about cultural change.12 Recent experiments in microeconomics have shown that some individuals, long assumed to act only on self-serving motives, will actually punish others who have taken advantage of them in interpersonal exchanges in economic games. These so-called strong reciprocators will punish others even if doing so comes at a personal cost.13 So, in addition to creating fair and nurturing environments in which scholars can work, it is also possible for a handful of ethically predisposed scholars to set a tone that scholarly crime is unacceptable and will not be tolerated.

RCR training such as that offered through the Collaborative Institutional Training Initiative (CITI) may do an adequate job of acquainting trainees with the norms of research. What it does not do, however, is address the many cultural differences that exist between Western organized scholarship and the evolving norms in other societies. Future efforts to promote RCR must take account of the many cultural differences that exist among contemporary scholars coming from various countries and backgrounds.

These issues need to be frankly discussed with a view toward incorporating the insights into improved mentor-mentee relationships. Inasmuch as this involves numerous cultural differences, it will not be an easy task. We still think the community of scholars should stop avoiding this prickly issue and should make sincere efforts to increase understanding. Regardless of the amount of extant research, it is our opinion that those charged with promoting RCR should incorporate culture within their educational and training programs, from orientation through in-service opportunities. Just as RCR officials moved quickly to develop pre-service and in-service programs designed to prevent research misconduct, they now should do more to address the various implications of cultural differences for potential violations of the responsible conduct of research.

The cultural training we envision here cannot be accomplished with the canned diversity training programs routinely required of workers in large bureaucracies. Rather than simply an appreciation of differences among various ethnic groups, there must be nuanced discussion of how these cultural differences influence the enterprise of scholarship and science. How do status hierarchies in a society undergoing scientific evolution influence the way science and scholarship play out? Does fealty to a superior, for example, override prevailing standards of who deserves to be an author on a paper and in which order? Do Western structural pressures play an even greater role in an evolving scientific environment where RCR is poorly understood and where methods of prevention and control are not in place? These and other nuances need to be teased out in an in-depth dialogue about culture and RCR.

Instruction on culture also must be bi-directional. Assuredly, scholars who were raised and trained in foreign cultures need to be thoroughly exposed to Western norms of acceptable research relationships and behavior, since many of the more desirable research journals are Western. Likewise, administrators and managers who supervise foreign-born and foreign-trained scholars must become more sensitive to how culture uniquely influences the other four levels of causes. In other words, new training on cultural aspects should also include mentoring of non-U.S., non-Canadian, and non-European researchers. Bridging any cultural divide cannot be done by those standing on one side of the abyss. It requires those of the host culture to make purposeful efforts to understand and appreciate differences in culture and how these might lead to intentional or inadvertent misconduct. How do factors such as the status differences between mentor and mentee operate in non-Western cultures? How might the absence of scientific social control mechanisms in one's native country influence a new post-doc's behavior in the lab?

We don't assume that specialized cultural training for scholars will be a magic bullet that rids the sciences of intentional and unintentional misconduct. Training, as we have noted in the case of traditional RCR training, has limited effectiveness in preventing scholarly crime. But if for no other reason, such training is necessary in order to counter potential claims of ignorance of prevailing standards.

If we agree that there are countries struggling with the adoption of competent and ethical scholarship and science, we should be willing to reach out and assist them in this difficult but important transition. Sending RCR experts from countries with more advanced RCR apparatuses might be another way to assist less developed countries with their RCR challenges. The U.S. already does this in fighting diseases such as Zika and AIDS. We reach out and offer assistance with the understanding that a proactive approach may well reduce the likelihood of scholarly violations not only in the country in question, but also in the countries to which its scholars are likely to emigrate. There is no reason why we could not be similarly proactive in promoting workable RCR strategies around the world.

Addressing individual issues

There is little question that factors such as personality play a role in scholarly misconduct. If we subscribe to the notion that only higher-order levels are responsible, then we are left to explain a large number of false positives, that is, individuals who should be susceptible to structural, cultural or organizational influences but who do not offend. We know from the research literature that certain personality factors are associated with counterproductive work behavior, including ethical breaches in scholarly research. Narcissism is perhaps the best candidate as a personality style related to scholarly offending. The sense of entitlement and the tendency to exploit, both traits associated with this personality style, permit the individual to flout rules and procedures. Narcissistic Personality Disorder is a serious mental health condition that doesn't easily lend itself to treatment. But subclinical narcissism is a much different proposition, and it can and should be addressed by appropriate mental health counseling.

Here we suggest that candidates for academic and other research positions, as in other professions such as commercial aviation and law enforcement, be required to take personality tests, under the assumption that certain traits work against the ethical production of quality scholarship. Employers already examine the activity of job candidates on social media, so psychometric tests are hardly more intrusive. Maladaptive traits such as interpersonal exploitativeness and a sense of entitlement may well signal an individual whose personality is ill-suited to work in the sciences, where behaving fairly and honoring trust are of paramount importance. Psychometric measures exist to assess these and other maladaptive traits. Whether research organizations take advantage of such opportunities remains to be seen. In cases where the organization deems the individual's strengths to outweigh the liabilities, steps should be taken to refer the person for appropriate counseling. And organizations that give such individuals a chance to make contributions should be willing to terminate those who betray the trust implicit in the initial hiring decision.

Of the several purposes of criminal sanctions, one is rehabilitation. Rehabilitation rests on the premise that some offenders can and should change in such a way as to once again become law-abiding, contributing members of society.

Recently, the rehabilitative ideal has focused on the adjustment of those who are coming back to the community after a stay in prison. The emphasis on reentry shows concern not only for offenders' employment, but also for their reconnecting with family and community. Rehabilitative efforts may also help to minimize the disintegrative effects of the stigma resulting from conviction and confinement.

If rehabilitation makes sense for traditional offenders, we must at least consider the possibility of redemption for scholarly offenders, and there are several compelling reasons for doing so. First, a substantial amount of time and money goes into the education and training of scholars, whether they are trained as MDs or as PhDs. In the biomedical sciences, where physician-scientists serve extensive post-doctoral research fellowships in addition to medical school and graduate school, this investment can amount to hundreds of thousands of dollars or more. It is questionable, then, whether it is wise to write off this investment in every case of research misconduct without first taking positive steps to protect it. It is not unlikely that a number of those who engage in scholarly misbehavior can rejoin the scientific community as productive, trustworthy members, permitting all involved to realize the benefits of the time and money invested. Permanently banishing a remorseful scientist from research for a single, senseless act may, in some cases, constitute a punishment out of proportion to the original harm. Principles of equity require, and standards of compassion suggest, punishment with less permanent consequences for some respondents.

So, what might rehabilitation look like for scientists found guilty of research misconduct? One approach might be to dismantle miscreant scholars' rationalizations of their behavior. Cognitive behavior therapy, which has been shown to be effective with traditional offenders, might be employed to address rationalizations and other errors in thinking. Just as many of those convicted of committing crimes are ordered to serve a probationary period, so might wayward scientists be required to serve a similar probationary period during which their research behavior can be monitored. The ORI imposes a sanction which, while not termed probation, requires the offending scientist to work under the close supervision of a senior scientist for a specified period of time. Such a period of supervision could be paired with appropriate consequences for failure to abide by certain conditions.

While we are supportive of efforts to reclaim the professional lives of wayward scholars, we maintain that most of the more egregious cases of scholarly crime are intentional acts by individuals who know well the norms of scholarship and choose to exploit others. Such individuals can and perhaps should be given a chance to redeem themselves and demonstrate that they are willing to perform future work competently and ethically. But as with those found guilty of more conventional crimes, we should remain cautiously optimistic about the prospects for meaningful change.

Even though there are questions yet to be pursued with as-yet untried approaches, there are steps the community of scholars can take to help prevent and control scholarly offenses. Waiting for definitive answers will only ensure more offenders, more victims, and more injustice. Having made what we hope is a convincing case for treating some forms of scholarly misconduct as crime, we turn to ways in which such crime can be prevented and controlled. The recent history of criminal justice in the United States makes us painfully aware that failing to prevent crime in the first place creates an incalculable burden on the system and on society in general. While we have argued that the criminal justice net should indeed be widened, we are mindful of the hydraulic effect additional offenses place on existing resources. Thus, the bulk of our recommendations are aimed at preventing, rather than controlling, future incidents.

Addressing situational issues

Situational issues arise in the lives of everyone. They include but are not limited to health problems, the deaths of family and friends, financial problems, and relationship problems. While it could be argued that some situational issues such as financial problems are preventable, others such as family illness and death clearly are not. It could further be argued that individuals should not bring these issues to the workplace, but such a position is not only insensitive, it is also unrealistic. Situational issues do not take the employee off the hook. Even with the understanding and support of the employing organization, employees facing such issues must assume the responsibility of getting the professional help they need to resolve them. And if stressed employees fail to do their part in addressing the issue, the organization is justified in letting them go.

Most situational issues are best detected and addressed by the organizations in which individuals work. Once managers learn of professional and personal problems, they are in a position to leverage the organization's resources, such as Employee Assistance Programs (EAPs). If this is done early on, these stressors are less likely to lead to counterproductive work behavior, including instances of research misconduct.

Publishers need to take a proactive stand against scholarly crime

The American Sociological Association's Committee on Professional Ethics (COPE) represents a major stride toward promoting responsible scholarly and scientific publication. Through workshops and other initiatives COPE has attempted to educate its members about the ethical challenges of scholarly publication and how to meet them. But following the directives of COPE is not by itself enough. Publishers should make a greater effort to verify the originality of the work they publish. This can include steps as simple as Googling the proposed title and subject matter of a proposed work. Another way is to identify scholars who are likely familiar with the subject matter of proposed books. Publishers do this now by asking prospective authors to provide a list of books similar to the one proposed. However, there doesn't seem to be a check on whether the author proposing the book is aware of previous works similar to it, or whether the author is being truthful. If allegations of impropriety are made against their authors, publishers need to take an objective look at the evidence, even if it means admitting that the work they published is purloined. The experience of Professor Y demonstrated in dismaying terms that book publishers as well as journal publishers prefer to shoo away any questions of impropriety, no matter how egregious and obvious.
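As a rough illustration of how little effort a first-pass originality screen requires, the sketch below compares a proposed title against a handful of previously published descriptions using simple string similarity. The sample titles and the 0.6 cutoff are invented for this example; commercial services such as Turnitin use far more sophisticated matching, and a high overlap score is a prompt for human review, not a finding of misconduct.

    # Hypothetical example: the candidate title, the comparison list, and
    # the 0.6 threshold are all assumptions made purely for illustration.
    from difflib import SequenceMatcher

    def overlap(a, b):
        # Crude 0-to-1 similarity score between two strings.
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    proposed = "Idea theft and data fabrication in the research university"
    previously_published = [
        "Data fabrication and idea theft in the contemporary research university",
        "Collective efficacy and neighborhood crime",
    ]

    for earlier in previously_published:
        score = overlap(proposed, earlier)
        if score > 0.6:  # arbitrary screening threshold
            print(f"Possible overlap ({score:.2f}): {earlier}")

Even so modest a check would at least flag proposals that closely track an existing work, at which point an editor could ask the sorts of questions that, as we note above, currently go unasked.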

Cases of scholarly offenses should be aired publicly

For a long time the scholarly community kept alleged cases of scholarly crime under wraps. Perhaps operating under the belief that such cases were the mischief of a few bad apples, universities and scholarly associations buried their heads in the sand, as though one can't be bothered by what one can't see. Fortunately, in the 1970s and 1980s, a cadre of science journalists including William Broad and Nicholas Wade put the spotlight on cases at prestigious institutions. No longer could the community of scholars remain in denial.

There are cases in which the alleged misconduct is irrefutable, but the host institution attempts to keep a lid on it. When an allegation of plagiarism was leveled at a University of Arizona professor by a former student, the university issued a directive that no staff members were to discuss the case.14 This only reinforces our contention that those in power are not only in a position to offend, they can also protect alleged offenders from facing the moral disapprobation of the community. Worse yet is the obvious double standard: admonishing students not to plagiarize while protecting faculty who do, a posture that does nothing to promote the responsible conduct of scholarship to the literally thousands of students who look to professors as role models.

Thus, an organization can contribute to crime by covering it up. Take the case of Jerry Sandusky, the former Pennsylvania State University assistant football coach who molested boys and young men over a period of years. As the investigation proceeded, it became clear that Penn State officials at the highest levels had knowledge of the allegations against Sandusky. The Penn State debacle did not involve research misconduct, but it does illustrate the lengths to which institutional officials can and will go to suppress or play down alleged misconduct in order to preserve the institution's reputation.

Particularly in the United States, where the First Amendment of the U.S. Constitution protects a free press, journalists are tremendously important in bringing cases of alleged scholarly crime to light, especially cases that are being quashed by officials. Whereas some victims are reluctant to take on powerful offenders, and often rightfully so, investigative reporters can be relentless in uncovering facts pertaining to alleged research misconduct. Journalists now routinely report on instances of research misconduct, plagiarism, and other forms of scholarly crime. This is true not only in the United States, but also in numerous other countries that have been forced to confront this insidious problem.

Retraction Watch exemplifies the journalistic search for the truth regarding scholarly crime. It not only highlights scholarly and scientific works whose findings have been subject to retraction, it also offers those involved the opportunity to give their side of the story. Journalists who pursue alleged cases of scholarly crime need to be scrupulous in their reporting of the facts. Just as the exploitation of innocent scholars is unfair, so is the ruination of an innocent scholar's reputation as a result of a shoddy or incomplete analysis of available information. Journalists can play an important role in illuminating the problem of scholarly crime, particularly in those cases where the actors are powerful and disavow any responsibility, and the victim is vulnerable. Victims of scholarly crime, faced with budget-breaking lawsuits and career-ending threats, must be able to rely on a free press to uncover and report the truth, whatever that might be. In some cases the allegation may prove false, and the accuser is exposed as opportunistic or vindictive. In bona fide cases, however, the allegations might be borne out by evidence that even the most powerful in academe cannot suppress.

Serious cases of scholarly crime should be prosecuted

It should be clear at this point that we consider forms of misconduct such as fabricating data and plagiarism to be criminal. Wasting millions of dollars in federal research funds by purposely plagiarizing or fabricating is just as wrong as selling high-risk mortgages to borrowers ill-equipped to repay them. It is now time for "lab-coat crime" to take its place alongside other forms of white-collar offending such as price-fixing, insider trading and embezzlement.

It has been suggested that science is so complex that lay juries may find it difficult or impossible to comprehend the nuances of alleged misconduct. We understand the concern, but juries have had to grapple with difficult material for decades; DNA and other aspects of contemporary crime science entered the courtroom decades ago. We find it a bit patronizing to suggest that the average person can't understand something as simple as a figure from one published paper being inappropriately used in another. Whatever intricacies of science laypeople find perplexing, most should be able to grasp the unfairness of defrauding the federal government by exploiting the vulnerabilities of the system. Contradicting the argument that misconduct cases are too complex is the fact that several cases have been successfully prosecuted. The cases of Eric Poehlman, Stephen Breuning, and Pat Palmer demonstrate that prosecutors can put together a viable case against those who would purposely defraud the federal government by engaging in extreme breaches of trust and fairness. Regardless, few criminal cases actually go to trial, so we have little reason to believe that lab-coat criminals would be less likely than their conventional counterparts to plead guilty rather than go to trial.

Because the prosecution of scholarly crime is still relatively rare, we don't know the extent to which such a practice might serve as a deterrent. Deterrence research relies on a number of complex concepts and assumptions. If the world of scholarship and science embraces practices such as data audits, perhaps would-be offenders will have to be more concerned with those than with the consequences of a criminal conviction.

We understand that our position on scholarly misconduct as crime represents a widening of the net of a criminal law that many consider already guilty of overreach. But when we look at the literature on net-widening and the arguments against it, most of it centers on sweeping in minor offenses that cause little or no harm, committed by those who stand to lose a great deal from official processing and labeling. For example, we agree with other scholars that substance abuse should be treated as a health problem rather than a matter for the criminal justice system. Our argument for criminalizing the more serious types of scholarly crime does not necessarily imply a punitive posture. With a few possible exceptions, we do not support the imprisonment of wayward scholars. Just as confinement does little or nothing good for the traditional offender, years in a federal or state prison are unlikely to do anything other than consume taxpayer dollars that could be better spent elsewhere.

The informal social control of scholarly crime

It is our opinion that the methods of controlling scholarly crime have to date yielded disappointing results. The cases that come to the attention of universities and administrative agencies are likely a fraction of all those that occur.15 This suggests we have been unable to constrain the behavior of motivated offenders through education, training and administrative threats. The investigation of research misconduct cases consumes significant human and financial resources.16 And while there are protections for whistleblowers, those who are willing to take this step still run the risk of suffering retaliation.17 The result is a drawn-out, costly process that nets mixed results.

The legal systems – both civil and criminal – which come into play in many cases involving research misconduct appear to do little to deter future misconduct. Criminal justice responses may provide highly visible sanctions for a select few, but with the low probability of being prosecuted, let alone sanctioned, there is little reason to believe that those bent on flouting the rules of scholarship will fret about the consequences. Likewise, civil actions such as qui tam suits can result in substantial awards for the government and for the whistleblower, but again, we don't know whether such outcomes serve as meaningful object lessons for would-be perpetrators.

There has been a tremendous push behind RCR education and training over the past several decades, but the evidence that it has any appreciable effect on scholarly crime is mixed. While these efforts undoubtedly offer benefits, particularly for those unacquainted with science's moral norms who genuinely want to do competent and ethical work, we should not expect such training to prevent scholarly crime, particularly if our theory has merit that scholarly crime results from the intersection of multiple etiological factors and seldom from ignorance of ethical standards. It likely is wishful thinking that RCR training will have any impact on research misconduct.18

Such training programs, however, will continue to be popular, largely for legal and symbolic reasons and to satisfy federal and organizational administrative regulations.

Another problem with conventional methods of social control is that they come into play only after an offense has taken place; they do nothing to help the victim, save for a small measure of retribution in select cases. Scholarly offenders can do a substantial amount of harm to individuals, institutions, and the world of scholarship in general before their misdeeds are detected. After-the-fact social control, whatever it may accomplish for justice and deterrence, cannot undo many of the harms visited upon individuals and society.

We have also shown that organizations, including scholarly societies, are capable of suppressing knowledge about the misconduct of their members. The reasons organizations suppress such knowledge can include concerns about reputation, potential scandal, and even possible legal consequences.19 Their indifference or denial only emboldens offenders and permits them to continue offending. Eventually, when the crime does come to light, the organization's reputation suffers more than it would have had the organization come clean in the first place.

As we have seen, there are forms of scholarly crime that are not subject to either laws or administrative regulations. The duplication of an expressed idea, such as a book concept, can be accomplished with impunity, and so it falls through the proverbial cracks in the system. Laws and administrative regulations can be changed to include such behaviors, but that is a lengthy, cumbersome and questionable prospect.

In a sense, the failure of formal social control of scholarly crime mirrors the disappointing history of controlling more traditional crime. One need only look at how drug offenses have been pursued over the past 30 years to see that criminalizing what should have been a public health issue did little more than create the mass incarceration of generations of poor, minority males, stripping them of earning potential, family contact, and personal dignity. Confinement rarely offers any benefit other than incapacitation. Criminologists periodically imagine a new, different set of criminal justice responses, but these efforts never seem to change the landscape of criminal justice for the better.

What scholarly crime – and, we would argue, crime in general – calls for are forms of social control which are less cumbersome and more effective than traditional methods. Doug Adams and Kenneth Pimple called on the RCR field to consider informal social control, a call that seems to have gone largely unheeded.20 Specifically, they recommended face-to-face, interactive RCR training as a means of inculcating and sharing an appreciation for responsible research practices. What we don't know, however, is how effective such networking might be, given the mixed evidence for RCR training in general. Furthermore, it is unclear how effective this kind of networking would be with individuals otherwise predisposed to cutting ethical corners. Social control of misconduct must be multifaceted21 and accessible to everyone regardless of power, position or resources. We need social control that in time will work to prevent scholarly crime so we don't have to address it after the fact.

Informal social control in criminology

While criminological interest in informal social control can be traced back to the early part of the twentieth century,22 the most articulate and oft-cited statement of how it works in the social disorganization tradition is offered by Robert Sampson, Stephen Raudenbush and Felton Earls.23 Using data from the Program on Human Development in Chicago Neighborhoods – a methodologically sophisticated, well-funded longitudinal study in the field – they argued that neighborhoods higher in collective efficacy would experience less crime and social disorder. Both as extensions of the social disorganization tradition and in addition to it, a number of studies have examined the role of informal social control in preventing and controlling various social ills. It has been used to explain how people intervene in irresponsible drinking,24 child maltreatment,25 flag desecration,26 and sex offenses.27 All these studies suggest that informal social control can have a positive effect on selected forms of criminal and deviant behavior. It is unknown whether informal social control by residents of a socially disadvantaged neighborhood can translate into parallel efforts by academics operating within the social structure of organized scholarship. Still, the basic concepts are the same, the social stakes such as reputation and respect are the same, and thus the consequences are quite possibly the same.

Another tradition in criminology that is particularly relevant to informal social control is Braithwaite's reintegrative shaming theory.28 Braithwaite derived his shaming notion from the labeling perspective, which posits that formal social control frequently stigmatizes deviants and, as a result, causes them greater problems than if they had been left alone. Central to Braithwaite's theory is the notion that social control must be both public and reintegrative. The public aspect ensures that offenders openly acknowledge the wrong they have done in order to be reintegrated into the community.

The role of the individual scholar in social control

The mechanisms for achieving formal social control of scholarly crime involve systems, institutions and organizations. We rely on professional societies, federal agencies, universities and their various organizational subdivisions, as well as the amorphous scientific community, to regulate and control misconduct. It is also at these levels where we have expended most of the resources and perhaps missed many of the opportunities. As we discussed in Chapter 5, the literature on research misconduct routinely discusses the role of individuals as offenders, their misdeeds, and their motivations. Victims are also viewed in individual terms and, indeed, individuals are among the primary victims of idea theft and other offenses. Seldom, though, do analysts of RCR violations cast individuals as proactive players in preventing or controlling research misconduct and related behaviors. More often the onus is on federal agencies and research organizations to revise policies and practices in order to control misconduct. Consistent with our approach that the problems of research integrity call for a theoretical and practical overhaul, we now explore the possible role individuals can play in the informal social control of scholarly crime.

Informal social control, according to Ronald Clarke, "refers to society's attempts to induce conformity through the socialization of . . . people into the norms of society, and through people's supervision of each other's behavior, reinforced by rule making, admonition and censure."29 There are certain requirements that must be fulfilled in order for individuals to play a meaningful role in informal social control. One is that they, witnesses and victims alike, must be willing to intervene. It's one thing to be in close proximity to prospective or actual offenders and quite another to communicate information to them or about them. The willingness to intervene in crime is illustrated in the extreme by the 1960s case of Kitty Genovese, a New York woman who was sexually assaulted and murdered, reportedly in the presence of numerous witnesses. While the facts of this infamous incident have long been disputed, it spawned a series of social psychological experiments designed to understand why people do and do not intervene in emergencies. Admittedly, the Genovese case is unusual in that it involved a life-or-death situation, but it serves to punctuate the importance of being willing to step up when the opportunity arises.

As in the Genovese case, there are two possible types of informal social control. Indirect informal control consists of the willingness to contact authorities about an alleged crime. Suppose a lab assistant suspects her boss of fabricating data in a series of experiments. Instead of approaching her boss about her suspicions, she reports them to higher authorities such as the university compliance office or the federal Office of Research Integrity, and her actions lead to an investigation of the alleged misconduct. This informal social control is indirect because the witness did not confront the offender or try to remedy the case herself. Direct informal social control, on the other hand, occurs when an individual takes it upon him- or herself to address the problem. In the Genovese incident, witnesses perhaps could have armed themselves and confronted the suspect or tried to chase him away. In the case of scholarly crime, direct informal social control could take one of several forms. Once an incident occurred, a member of the scholarly community could approach the alleged offender and ask him or her for an explanation, which most likely would result in denial and even threats by the suspected offender. Under the best of circumstances, asking the suspect for an explanation could conceivably result in an admission by the perpetrator and a promise to come clean. What would be preferable, though, would be forms of indirect social control wherein individuals take it upon themselves to spread the word that scholarly misconduct is unacceptable and discoverable.

There are indications that individuals have played an important role in the social control of scholarly crime. Oftentimes instances of research misconduct have come to the attention of authorities not through time-honored structural mechanisms such as peer review or formal data audits, but by means of collaborators, lab assistants or others who have good reason to suspect the alleged perpetrator of misconduct. As we have indicated elsewhere, this often is a brave course of action, because the informant faces a number of risks including but not limited to dismissal, legal action by the accused or the institution, and loss of their own credibility.

Reputation

Reputation is the accumulation of information about individuals based on their performance as professionals or as persons. In Chapter 2 we discussed the ways academic reputations are established, as well as their importance to position, status and earnings. Informal social control of scholarly crime is unlikely to work if actual or would-be offenders are unconcerned with their reputations. Most scholars, being highly educated and integral parts of their communities, are concerned with how others, particularly those in their own profession, perceive them. For this reason, informal control could be a very effective way to control scholarly offenses.

Some who write on the topic of reputation treat it as if it were a unitary construct composed primarily of work performance.30 As Cleary and her colleagues have described it, "[s]kills of self-promotion and positioning, assertiveness, networking, and coalition building are crucial competencies that can be learned and need to be developed to earn and maintain a good reputation, particularly in a [. . .] competitive field."31 This description implies that excellence in producing academic work is sufficient for the establishment of one's reputation. We agree that these traits serve a scholar well, but we might also admire the manner in which scholars achieve reputations not only for competence in research and publishing, but also for the honest, fair and respectful treatment of colleagues.

Following the lead of Zinko and colleagues, it is our contention that reputation needs to be broken out into performance and character.32 Performance-based reputation, in line with Mertonian norms governing the technical aspects of how science is done, would consist of a mastery of technical abilities that enables the individual to produce publishable scholarship. A scholar could be a prolific author of replicable studies, and there are numerous examples in criminology. But as we well know, productivity doesn't imply integrity. Character-based reputation, in contrast, has to do with the honesty with which one approaches science and its products. Informal networks of scholars discuss the perceived honesty of others. When members raise suspicions, often it is the result of an incident such as the one in which Bonnie was involved, and such suspicions instantly gain credibility when specifics surface. In most cases the perceived victims and those who hear about the alleged victimization do nothing. But the character of the antagonist is thrown into question, which, as we will argue, can, and in many cases should, carry consequences for the offender.

Scholars have to be concerned about their reputations, particularly if they are active in writing academic papers and applying for funding. Their peers will review their submissions to journals and funding agencies. If scholars have tarnished reputations due to having allegedly engaged in scholarly crimes such as idea theft or plagiarism, this knowledge may well affect their peers' evaluations of their work. It may also affect their future job prospects.

Gossip as a mechanism for informal social control

Gossip is "the exchange of information with evaluative content about absent third parties."33 Gossip can be malicious, prosocial, or neutral. The malicious kind is what we typically think of when we hear the word gossip, and it can unduly damage reputations. Gossip can also perform a number of functions, including status enhancement,34 sharing "the misadventures of others,"35 and enforcing group norms.36

There are three important roles in gossiping.37 The gossiper is the individual who conveys the information. The listener, or respondent, is the individual with whom the gossiper is communicating. Finally, the individual who is the subject of the gossip is the target. Each of these roles is important for understanding how gossip works and thus how it can function as a means of informal social control.

The advent of social networking sites has provided innovative mechanisms for disseminating gossip. This, of course, is a mixed blessing. We can post information about others to Facebook with relative impunity. The targets can respond on their own Facebook pages by retaliating, denying the claims, or admitting to them. The outcome of gossip via social media can be useful in terms of spreading the word that the target has misbehaved in some way. Or the outcome can be negative, merely hurtful, and not at all in line with social control if the information spread is false.

There is compelling scholarly evidence that gossip can and does lead to real harm. It has been linked to bullying and cyberbullying,38 youth violence,39 and female aggression.40 Indeed, gossip can and does have negative consequences beyond what we find in research studies. Gossip about individuals has been blamed for a variety of undesirable outcomes, including suicide and murder. Nor is the harm gossip potentially represents borne solely by the target. Research shows that those who engage in negative gossip are more likely to be disliked by others.41 Moreover, those who gossip in a negative fashion are not only liked less, they are also seen as less powerful. This suggests that gossip employed as a mechanism for informal social control must be used judiciously, based on verifiable information, and shared only among honest and trusted colleagues.

Gossip is a relatively unexplored mechanism for social control. One reason for this has been the pejorative connotation associated with gossip. It is associated with malicious information shared behind the backs of those targeted. It is often assumed to be untrue, even when it is true. It is further assumed to be engaged in solely for the purpose of hurting someone.

But gossip can also serve a prosocial function. The form of gossip we are particularly interested in is prosocial in nature, defined as "the sharing of negative evaluative information about a target in a way that protects others from antisocial or exploitative behavior."42 Prosocial gossip alerts potential victims to the existence and past behavior of someone likely to reoffend. The good news is that those whose reputations have been tarnished by prosocial gossip are more likely to desist from the antisocial behavior.43

Let's explore how prosocial gossip might work in controlling scholarly crime. Imagine there is a prominent scholar against whom allegations of idea theft have been made. When one of his alleged victims complained to the editor of a journal that had published the purloined work, she received no satisfaction. She then appealed to the scholarly society to which they both belonged. In the meantime, the story was related to a handful of sympathetic fellow scholars, who relayed it to members of their respective informal networks. One person in the network, a prominent and well-respected scholar, indicated to others that the offending individual would not get a job at his institution. So now the gossip, which at first was little more than hearsay shared among a few friends, has not only gained credibility with the latest link, but has also created real consequences for the offender. Incidentally, in the course of writing this book, of which our colleagues are aware, we have received numerous reports of scholarly misconduct. In discussing these cases with other colleagues, we have discovered that a lot of people already know about them. In other words, gossip, including prosocial gossip, spreads and affects how members of the scholarly community feel about the offender.

There is another way gossip can prevent scholarly crime. Once it becomes known that prosocial gossipers, particularly some of the more prominent and respected members of the scholarly community, will notify their trusted colleagues about instances of alleged scholarly crime, the rank-and-file members will in time learn that such a network exists. This special brand of general deterrence will help prevent some would-be offenders from engaging in scholarly crime. When information about alleged offenders is shared among trusted colleagues, it is possible that some of those privy to the information have contemplated scholarly crime at one time or another. Hearing gossip may well cause such individuals to self-evaluate, that is, to reflect on the scholarly crimes they've considered, and discourage them from engaging in such behavior.44

In this way, we believe that prosocial gossip can be employed to control the antisocial behavior of exploitative scholars. And it offers a number of advantages over more traditional forms of social control. For one, it is not necessary to make a formal report to institutional or funding authorities. There is virtually no cost associated with pursuing prosocial gossip as a strategy; one need only communicate with others who are interested in the information and willing to spread it for prosocial purposes. Because it is not enmeshed in bureaucratic machinery, it can operate as quickly as the network will spread the gossip.

Prosocial gossip will work best when there are people who will step up and initiate the transfer of information. Such people are what economist Herbert Gintis and his colleagues term strong reciprocators (SRs), individuals who are willing to punish exploiters even if doing so comes at a personal cost.45

After examining the behavior of participants in economic games, these economists found that not all participants behave according to the traditional homo economicus model. Instead of simply pursuing their own self-interest, many punished other participants who behaved selfishly and unfairly. Strong reciprocators are important to the promotion of compliance.

In the parlance of traditional criminology, gossip can also be understood in terms of general deterrence. General deterrence operates when the threat of a punishment thwarts those who might be thinking about offending. If would-be scholarly offenders know that there is an extensive network of prosocial gossipers, including prominent and powerful individuals with untarnished reputations, they may be less likely to perpetrate an offense.

Gossip as a method of social control is not without its limitations. Inasmuch as it occurs after an alleged instance of misconduct, it does nothing about the original misbehavior. The victim gets no reparation or other form of justice. It can, however, possibly prevent repeat offenses and, given what we know about scholarly misconduct, repeat offenders constitute a significant portion of scholarly offenders.

Gossip must be prosocial in order to be an effective method of social control. It is conceivable that in an environment in which prosocial gossip is promoted, there will be spiteful individuals who initiate and spread malicious gossip. We know this already takes place. Just as it is reprehensible to make false allegations in the case of traditional crimes, negative gossip based on untrue allegations of scholarly crime could have devastating effects on the careers of its targets. Those who spread malicious gossip should be subject to the same informal social control as those who engage in scholarly misconduct. In a large network of scholars, however, we have faith that most of the time the well-intentioned will prevail.

We feel strongly that to work as effectively and efficiently as possible, gossip as a strategy for social control needs to be promoted. For example, should a professional organization such as the American Society of Criminology agree to support the concept of prosocial gossip, it could alert the membership that they should consider using it to prevent violations of the Code of Ethics. Doing so would notify the membership that this option is not only available but is also supported by the organization as a means to foster ethical practice.

Yet again, we see that relative power plays a significant role in offending and in responding to offending. In a study of prosocial gossip as social control, Brandon Vaidyanathan and colleagues studied ethical violations in scientific workplaces and found, as we have, that formal accusations are often not pursued because the observers of these violations are extremely reluctant to report wrongdoing.46 Using interview data from 251 physicists and biologists working in prestigious and non-prestigious research institutes and universities in the U.S., U.K., and India, the authors found that those observing research misconduct turned to gossip as a means to warn others of the transgressions. While prosocial gossip is somewhat effective, these authors found, its effectiveness is greater when the transgressors are of higher status.

The status, integrity and power of the individuals who become aware of a scholarly offense through such reciprocal sharing may be more important than simply the number of people eventually notified. Certain individuals in any field wield substantial influence with their peers. If powerful and respected members of a discipline are aware of scholarly offenses, their word will be taken very seriously by those they tell; moreover, because they are powerful and respected, they can throw roadblocks in the career path of the offender.

There are three points at which gossip can have an impact on scholarly crime, and these parallel the three types of prevention in public health and criminology: primary, secondary, and tertiary. Primary prevention is concerned with keeping a phenomenon from occurring in the first place. With secondary prevention, the condition has occurred, and efforts are directed at minimizing it. Tertiary prevention, the third type, focuses on more drastic measures to contain the problem once it is established. With regard to gossip, primary prevention would occur when the community of scholars knows that prosocial gossip is prevalent, powerful, and could potentially damage the reputation and career of questionable members of the discipline. Ideally the would-be perpetrator would consider the risks of getting caught as well as the extent to which his or her career could be damaged by prosocial gossip among respected members of the field. Primary prevention occurs when the individual refrains from engaging in scholarly offenses because the costs are deemed too high.

Personal trust is a necessary precondition for gossip to serve as a mechanism for social control. It is through communication among scholars who trust one another that gossip will have a significant impact on exploitative behavior such as idea theft, plagiarism and data fraud. If we can't trust other scholars to conduct their work competently and fairly, perhaps colleagues with a history of integrity can trust one another with prosocial gossip intended to punish actual offenders and deter would-be offenders.

Elsewhere in this book we have cast scholarly offenders as free-riders, a term from game theory for those who don't play by the rules of reciprocity and who exploit others. They do so by gaining an outcome – in the case of scholarly crime, a dataset or publication – to which they are not entitled by conventional standards of fairness. Dunbar maintains that gossip keeps free-riders from taking advantage of others.47 He further asserts that gossip serves an important evolutionary function by permitting groups of individuals to cooperate, thereby permitting civilized society to flourish.

Conclusions

Reputation-guarding and gossip represent potentially promising tools in the fight against scholarly crime. They are capable of reducing the harm caused by even the most prominent offenders, whose actions are protected by their professional reputation and by powerful organizations. They permit potential victims to consider whether they want to work with known offenders, and to withdraw if they consider the risk of exploitation too great. And, as Baumeister, Zhang and Vohs have noted, "Gossip is cheap, easy, efficient, and apparently rather effective."48

It may be possible to use reputation and prosocial gossip to prevent forms of socially maladaptive behavior outside of the academy. Now that research has shown its benefits, there might be applications to worrisome conduct such as bullying and campus sexual assault. In other words, collective efficacy needn't be restricted to disadvantaged neighborhoods in large cities. The concept of a network of people who care enough about the welfare of one another to share protective information should apply to a range of situations, including academe. The work on collective efficacy suggests that gossip is an important tool for social control in communities. And it might be so for the community of scholars. We concur with Ferrier and Ludwig, who call for more research on how gossip works to promote collective efficacy.49 Now that social psychological experiments have shown its effectiveness in laboratory settings, we should explore how and why it works in field settings.

Placing greater responsibility for the social control of scholarly crime in the hands of scholars themselves may take us closer to an earlier era in which scientists and scholars operated within a polite society, conducting their work without complex regulations and heavy administrative oversight. Despite some of the departures that occurred during that era, it was understood that scholars would do their own work and share it openly and honestly when it was ready. True, there are aspects of science and scholarship that are forever changed by time and technology; as we have shown, technology has allowed more scholarly misconduct to take place. But advanced technology may also be recruited to control scholarly misconduct by multiplying the channels for sharing information. It may be possible to encourage individuals to do the right thing despite structural, cultural, and organizational influences.

We hope that criminologists will explore the possibility that prosocial gossip can reduce more traditional types of crime. The more crime is controlled through informal mechanisms such as gossip, the less reliant we will be on expensive, ineffective, formal systems of social control. The main difference between controlling scholarly offenses and ordinary criminal offenses through gossip is reputation. Scholars, as presumably upstanding members of a discipline as well as respected members of their communities, must be concerned about maintaining a reputation beyond reproach. This would also be true, though less so, for white-collar, political, and corporate actors who rely on their reputations to live well and accomplish their goals. It is undoubtedly far less true of ordinary street criminals, who may in fact want a reputation as a bad actor. In this way, reputation matters for all members of society: most seek to establish and maintain a solid reputation as trustworthy and honest people, while a far smaller segment seeks the opposite reputation. While prosocial gossip offers promise for controlling at least some scholarly crime, we should consider it only one strategy, albeit a unique one, among many.


Notes

1 Associated Press (1989). "Fraud in medical research tied to lax rules." New York Times, February 14. Available at: www.nytimes.com/1989/02/14/science/fraud-in-medical-research-tied-to-lax-rules.html (accessed 12-31-2017).
2 Moody, G. (2015). "Large-scale peer-review fraud leads to retraction of 64 scientific papers." Techdirt, August 24, 2015.
3 Farthing, Michael J. G. (2014). "Research misconduct: A grand global challenge for the twenty-first century." Journal of Gastroenterology and Hepatology, 29: 422–427.
4 Larsson, Bengt (2005). "Patrolling the corporation – the auditor's duty to report crime in Sweden." International Journal of the Sociology of Law, 33: 53–70.
5 See Greenberg, Jerald (2004). "Stress fairness to fare no stress: Managing workplace stress by promoting organizational justice." Organizational Dynamics, 33: 352–365.
6 Kalichman, Michael W. & Plemmons, Dena K. (2007). "Reported goals for responsible conduct of research courses." Academic Medicine, 82: 846–852.
7 Anderson, Melissa S., Horn, A. S., Risbey, K. R., Ronning, E. A., de Vries, Raymond & Martinson, Brian C. (2007). "What do mentoring and training in the responsible conduct of research have to do with scientists' misbehavior? Findings from a national survey of NIH-funded scientists." Academic Medicine, 82: 853–860; Antes, A. L., Wang, X., Mumford, M. D., Brown, R. P., Connolly, S. & Devenport, L. D. (2010). "Evaluating the effects that existing instruction on responsible conduct of research has on ethical decision making." Academic Medicine, 85: 519–526.
8 Schoenherr, Jordan R. (2015). "Scientific integrity in research methods." Frontiers in Psychology, 6: doi: 10.3389/fpsyg.2015.01562.
9 Greenberg, Jerald (2009). "Applying organizational justice: Questionable claims and promising suggestions." Industrial and Organizational Psychology, 2: 230–241.
10 Dusseldorp, Elise, Velderman, Mariska K., Paulussen, Theo W. G. M., Junger, Marianne, van Nieuwenhuijzen, Maroesjka & Reijneveld, Sijmen A. (2014). "Targets for primary prevention: Cultural, social and intrapersonal factors associated with co-occurring health-related behaviours." Psychology & Health, 29: 598–611.
11 Segal, S., Gelfand, B. J., Hurwitz, S., Berkowitz, L., Ashley, S. W., Nadel, E. S. & Katz, J. T. (2010). "Plagiarism in residency application essays." Annals of Internal Medicine, 153: 112–120.
12 Erez, Miriam & Gati, Efrat (2004). "A dynamic, multi-level model of culture: From the micro level of the individual to the macro level of a global culture." Applied Psychology: An International Review, 53: 583–598.
13 Gintis, Herbert, Bowles, S., Boyd, R. & Fehr, Ernst (2003). "Explaining altruistic behavior in humans." Evolution and Human Behavior, 24: 153–172.
14 Alaimo, C. A. (2015). Arizona Daily Star, September 19, 2015.
15 Titus, Sandra L., Wells, James A. & Rhoades, Lawrence J. (2008). "Repairing research integrity." Nature, 453: 980–982.
16 Michalek, A. M., Hutson, A. D., Wicher, C. P. & Trump, D. L. (2010). "The costs and underappreciated consequences of research misconduct: A case study." PLOS Medicine, 7: 1–3.
17 Redman, Barbara K., Templin, Thomas N. & Merz, Jon F. (2006). "Research misconduct among clinical research staff." Science and Engineering Ethics, 12: 481–489.
18 Kornfeld, Donald S. (2012). "Perspective: Research misconduct: The search for a remedy." Academic Medicine, 87: 877–882; Kornfeld, Donald S. (2013). "Integrity training: Misconduct's source." Science, 340: 1403–1404.
19 Klein, Jennifer L. & Tolson, Danielle (2015). "Wrangling rumors of corruption: Institutional neutralization of the Jerry Sandusky scandal at Penn State University." Journal of Human Behavior in the Social Environment, 25: 477–486.
20 Adams, Douglas & Pimple, Kenneth D. (2005). "Research misconduct and crime: Lessons from criminal science on preventing misconduct and promoting integrity." Accountability in Research, 12: 225–240.
21 Fanelli, Daniele (2015). "We need more research on causes and consequences, as well as on solutions." Addiction, 110: 11–12.
22 Ross, Edward A. (1901/1929). Social Control: A Survey of the Foundations of Order. New York: Macmillan.
23 Sampson, Robert, Raudenbush, Stephen & Earls, Felton (1997). "Neighborhoods and violent crime: A multilevel study of collective efficacy." Science, 277: 918–924.
24 Collins, Michael D. & Frey, James H. (1992). "Drunken driving and informal social control: The case of peer intervention." Deviant Behavior, 13: 73–87; Dietze, Paul, Ferris, Jason & Room, Robin (2013). "Who suggests drinking less? Demographic and national differences in informal social controls on drinking." Journal of Studies on Alcohol and Drugs, 74: 859–866.
25 Emery, Clifton R., Trung, Hai Nguyen & Wu, Shali (2015). "Neighborhood informal social control and child maltreatment: A comparison of protective and punitive approaches." Child Abuse & Neglect, 41: 158–169.
26 Welch, Michael & Bryan, Jennifer L. (1998). "Reactions to flag desecration in American society: Exploring the contours of formal and informal social control." American Journal of Criminal Justice, 22: 151–168.
27 Mustaine, Elaine E. & Tewksbury, Richard (2011). "Assessing informal social control against the highly stigmatized." Deviant Behavior, 32: 944–960.
28 Braithwaite, John (1989). Crime, Shame and Reintegration. Cambridge, UK: Cambridge University Press.
29 Clarke, Ronald V. (1997). Situational Crime Prevention: Successful Case Studies. 2nd Edition. Guilderland, NY: Harrow and Heston.
30 Blickle, Gerhard, Schneider, Paula B., Liu, Yongmei & Ferris, Gerald R. (2011). "A predictive investigation of reputation as mediator of political-skill/career-success relationship." Journal of Applied Social Psychology, 41: 3026–3048.
31 Cleary, Michelle, Mackey, Sandra, Hunt, Glenn E., Jackson, Debra, Thompson, David R. & Walter, Garry (2012). "Reputations: A critical yet neglected area of scholarly enquiry." Journal of Advanced Nursing, 68: 2137–2139.
32 Zinko, Robert, Ferris, Gerald R., Humphrey, Stephen E., Meyer, Christopher J. & Aime, Federico (2012). "Personal reputation in organizations: Two-study constructive replication and extension of antecedents and consequences." Journal of Occupational & Organizational Psychology, 85: 156–180.
33 Beersma, Bianca & Van Kleef, Gerben A. (2012). "Why people gossip: An empirical analysis of social motives, antecedents, and consequences." Journal of Applied Social Psychology, 42: 2640–2670.
34 McAndrew, F. T., Bell, E. K. & Garcia, C. M. (2007). "Who do we tell and whom do we tell on? Gossip as a strategy for status enhancement." Journal of Applied Social Psychology, 37: 1562–1577.
35 Baumeister, Roy F., Zhang, Liqing & Vohs, Kathleen D. (2004). "Gossip as cultural learning." Review of General Psychology, 8: 111–121.
36 Grosser, Travis J., Lopez-Kidwell, Virginie, Labianca, Giuseppe & Ellwardt, Lea (2012). "Hearing it through the grapevine: Positive and negative workplace gossip." Organizational Dynamics, 41: 52–61.
37 Michelson, Grant, van Iterson, Ad & Waddington, Kathryn (2010). "Gossip in organizations: Contexts, consequences, and controversies." Group & Organization Management, 35: 371–390.
38 Anthony, Bruno J., Wessler, Stephen L. & Sebian, Joyce K. (2010). "Commentary: Guiding a public health approach to bullying." Journal of Pediatric Psychology, 35: 1113–1115; Pelfrey, Jr., William V. & Weber, Nicole (2014). "Talking smack and the telephone game: Conceptualizing cyberbullying with middle and high school youth." Journal of Youth Studies, 17: 397–414.
39 Zimmerman, Mark A., Morrel-Samuels, Susan, Wong, Naima, Tarver, Darian, Rabbiah, Deana & White, Sharrice (2004). "Guns, gangs, and gossip: An analysis of student essays on youth violence." Journal of Early Adolescence, 24: 385–411.
40 McAndrew, F. T. (2014). "The 'sword of a woman': Gossip and female aggression." Aggression and Violent Behavior, 19: 196–199.
41 Farley, Sally D. (2011). "Is gossip power? The inverse relationships between gossip, power, and likability." European Journal of Social Psychology, 41: 574–579.
42 Feinberg, M., Willer, Robb, Stellar, Jennifer & Keltner, Dacher (2012). "The virtues of gossip: Reputational information sharing as prosocial behavior." Journal of Personality and Social Psychology, 102: 1015–1030.
43 Feinberg, Matthew, Willer, Robb & Schultz, M. (2014). "Gossip and ostracism promote cooperation in groups." Psychological Science, 25: 656–664.
44 Martinescu, Elena, Janssen, Onne & Nijstad, Bernard A. (2014). "Tell me the gossip: The self-evaluative function of receiving gossip about others." Personality and Social Psychology Bulletin, 40: 1668–1680.
45 Gintis, Herbert, Bowles, S., Boyd, R. & Fehr, Ernst (2003). "Explaining altruistic behavior in humans." Evolution and Human Behavior, 24: 153–172.
46 Vaidyanathan, Brandon, Khalsa, Simranjit & Ecklund, Elaine Howard (2016). "Gossip as social control: Informal sanctions on ethical violations in scientific workplaces." Social Problems, 63: 554–572.
47 Dunbar, R. I. M. (2004). "Gossip in evolutionary perspective." Review of General Psychology, 8: 100–110.
48 Baumeister, Roy F., Zhang, Liqing & Vohs, Kathleen D. (2004). "Gossip as cultural learning." Review of General Psychology, 8: 111–121.
49 Ferrier, Megan & Ludwig, Jens (2011). "Crime policy and informal social control." Criminology & Public Policy, 10: 1029–1036.

Afterword: Against all odds, a code is born
Mark S. Davis

The incident that in large part served as catalyst for this analysis – the duplication of Bonnie's book – led to a range of emotional states: disbelief, anger, frustration, depression, and despair. Many of us who were close to Bonnie and learned of the incident firsthand shared some of these emotions. As is often true with other forms of victimization, the experience eventually moved Bonnie toward advocacy. Advocacy may or may not bring about personal healing or positive change, but in the aftermath of victimization it can serve as a channel for emotional energy that might otherwise be expended negatively, even counterproductively. It is also an attempt to give meaning to a selfish, senseless act that ostensibly has none.

We noted earlier that the American Society of Criminology (ASC), a scholarly organization to which we both belong, lacked a formal code of ethics until 2016. This is curious for several reasons. One is the age of the ASC: it was founded in 1941. American scholars who identified as criminologists convened before the 1940s, but it was not until then that their efforts coalesced into the formal organization the ASC is today. In time those efforts included a mission statement, a statement of purpose, a board of directors, and a roster of officers including president, vice president, and executive counselors. As of this writing the ASC, which has its headquarters in Columbus, Ohio, with several full-time staff members, boasts approximately 3,400 members from the United States and numerous foreign countries. Many of these members travel great distances to attend the annual meeting held in November of each year.

Despite the impressive evolution of the ASC over the years, its progress did not include the adoption of a statement of ethics. Particularly glaring was the fact that almost every other scholarly association representing the social and behavioral sciences had such a code. This included the American Sociological Association, the Academy of Criminal Justice Sciences, the American Political Science Association, the American Psychological Association, and most other learned societies based in the U.S. Indeed, virtually every group of professionals – from agriculture to real estate to wildlife stewardship – has a code of ethics.1 What should have been a collective embarrassment to the community of criminologists went largely unnoticed and unaddressed. What was it that criminologists were missing? Why was there insufficient concern about professional conduct to establish their own code of ethics?

This is not to suggest that the ASC membership had not tried to implement an ethics code. According to the ASC archives, there were several unsuccessful attempts at pushing through a code.2 These attempts included the presentation of a draft code to the ASC Executive Board. Over the span of three decades, the idea of a code was struck down time and time again. The naysayers always had their reasons for not adopting a code of ethics: the code is too vague; the code must not have an enforcement mechanism (enforcement was frightening to some who resisted adoption); the ASC might be sued; criminologists monitor themselves and therefore do not need a code of ethics.3 Each time the result was the same. The code of ethics was voted down. Regardless, the absence of a code of ethics for the ASC made its own statement about criminology's professional image, and that statement was not flattering.

Enter Bonnie Berry. She was and remains a dedicated and respected member of the ASC. She regularly organizes panels and presents papers at sessions during the annual meetings. She instituted and for many years headed up the ASC's mentoring program, an initiative whereby students could be matched with mentors whose interests and expertise dovetailed with their own. In 2008 she was awarded the ASC's Herbert Bloch Award, which "recognizes outstanding service contributions to the American Society of Criminology and to the professional interests of criminology." In 2013, she received the Mentor of Mentors Award, an award designed especially for her on a one-time basis. Through this history of service to the ASC, Bonnie had managed to develop an extensive network of members with whom she had interacted by way of work groups, committees, and less formal contact.

Now the ASC had someone who believed she had been victimized and who, more importantly, was dumbfounded that any professional organization would lack a code of professional standards. Every colleague with whom Bonnie shared her story was horrified, and the supporting evidence strongly indicated that a grievous harm had taken place; yet there were no channels through which to right a wrong such as the one she suffered. A number of ASC members shared with her their own alleged victimization experiences, some of which were at the hands of the most prominent names in the field. Once she had decided to take on the challenge of pushing for an ASC code of ethics, she was well-positioned to trade on her professional reputation and network to undertake a new attempt to make the code a reality.

The earlier attempts to adopt a code included basing the draft on the American Sociological Association's code of ethics, a code that had been in place for decades. Bonnie and those she consulted agreed that it made little sense to start the code from scratch. Inasmuch as Bonnie was also a longtime ASA member, she contacted the ASA's executive director, Sally Hillsman, to see if it was possible to once again use their code of ethics as a template. Not only did Bonnie get the ASA's official blessing to borrow the code, she also received a good deal of useful information about how the ASA handled alleged instances of misconduct.

Bonnie's effort to push for the adoption of an ASC code of ethics was not without some speed bumps. Despite the many members who expressed their support to Bonnie, there was a handful who tried to subvert the effort. One prominent criminologist, a former president of the ASC, emailed a number of his colleagues in an attempt to sink any support for the proposed code. This simply reinforced for us that some criminologists were more interested in protecting the status quo than in promoting positive change. His efforts might have had the desired effect had it not been for Bonnie's own counterattack, consisting of calls and emails to those who knew her, trusted her motives, and agreed that a code of ethics was not only long overdue but, more importantly, critical to the future integrity and credibility of the ASC and the discipline.

The first formal hurdle, which turned out not to be a hurdle, was getting the ethics code approved by the ASC Board. As before, any proposed ethics code had to have board approval before it would be submitted to the membership for a vote. This meant that any draft submitted to the board had to be airtight, or very nearly so. In 2016, three-quarters of a century after the founding of the American Society of Criminology, its Board approved the adoption of the code of ethics. A few months later the code was voted on and passed by the membership. Some of us who were involved expected to hear the Hallelujah Chorus reverberating in the distance. It only took 75 years.

As a result of her efforts to push for an ASC ethics code, Bonnie received the Coramae Richey Mann "Inconvenient Woman of the Year Award" in 2016 from ASC's Division on Women and Crime. "Inconvenient" may well be an understatement in the eyes of those who saw Bonnie's efforts as an attempt to upset the status quo. Regardless, the award capped a several-year campaign to nudge her colleagues to do the right thing.

I should note that the individual whose book was similar to Bonnie's is neither a criminologist nor an ASC member. So there is no reason to believe that an ASC code of ethics will have any impact whatsoever on the person's thinking or future behavior. But her actions, as unfair and troubling as we consider them to be, had the unintended consequence of spurring Bonnie and others to action, indirectly leading to the development and adoption of a code of ethics by the ASC membership. Generally, when we speak of collateral consequences we are referring to unanticipated, negative outcomes of a particular course of action. In this case, the code was one potentially beneficial outcome. I consider this book another one.4

The experience of winning approval of a code of ethics after decades of indifference and resistance is, in our opinion, a case study in informal social control. In Chapter 9, we made an argument for innovative, less formal methods to control scholarly crime, largely because more formal methods to date have fallen short in preventing and controlling it. Just as scholarly networks can be used to transmit socially beneficial gossip for the primary, secondary, and tertiary prevention of social problems, so too can they be used to marshal support for organizational change. We think this reinforces our point that less formal means can achieve better results. And in addition to the approval of the code, I would like to believe that Bonnie's efforts generated a new sensitivity to and appreciation for ethical concerns among the ASC membership. If I'm correct, this will prompt criminologists to informally discuss questionable scholarly behavior with their colleagues and to share personal stories of alleged victimization. Who knows? Maybe the existence of a code will even cause some criminologists to think twice before exploiting others.

The ASC's code of ethics will not prevent all scholarly crime within the ranks of member criminologists. There will continue to be those bad apples in the barrel. In time, we may even discover that the known cases comprise just the tip of the iceberg. Shit happens. What the code of ethics does, however, is signal to the community of criminologists, and even to scholars outside the discipline, that despite occasional or even persistent pockets of unethical or exploitative behavior, the world of scholars strives to be competent and honest and fair. And after all, isn't that what we all want?

Notes

1 Komić, D., Marušić, S. L. & Marušić, A. (2015). "Research integrity and research ethics in professional codes of ethics: Survey of terminology used by professional organizations across research disciplines." PLoS One, 10: e0133662. doi:10.1371/journal.pone.0133662
2 The documentation for these attempts is available at www.asc41.com/Board_Minutes/boardmin.html
3 Scholarly associations need not play cop, prosecutor, and/or judge. Some suggestions for other roles such organizations can play in promoting ethical behavior are discussed in Anderson, Melissa S. & Shultz, Joseph B. (2003). "The role of scientific associations in promoting research integrity and deterring research misconduct: Commentary on 'Challenges in studying the effects of scientific societies on research integrity' (Levine and Iutcovich)." Science and Engineering Ethics, 9: 269–272. Nick Steneck has suggested a role for these organizations that includes conducting needs assessments of members, developing useful informational resources such as guidelines, and promoting integrity (see Steneck, N. H. (2003). "The role of professional societies in promoting integrity in research." American Journal of Health Behavior, 27 [Suppl 3]: S239–S247). We agree that these activities are important, but we also feel that they are unlikely to reduce instances of misconduct.
4 I trust the reader will understand if we don't formally acknowledge the antagonist in this story for inspiring our project.

Appendix: Others' stories of scholarly misconduct
Bonnie Berry

When I discovered that my book had been duplicated, I solicited the stories of others in three professional publications. I in turn received a number of replies, some anonymous and some from people who voluntarily identified themselves. I promised that I would keep the identities of the story-tellers secret, and thus these stories are numbered 1–10, in the order in which they arrived, as anonymous accounts.

I received a number of emails in response to my account of scientific copying, many of them expressing shock, anger, and disgust. One response, from a highly respected sociology professor, was that my case was the most egregious instance of misconduct he had ever known in his entire career. Besides saying that unethical academic behavior does not get enough attention, and noting that he had been a victim of academic theft himself, he wrote that he was "appalled to hear that someone copied and published one of your books. That must be one of the academic crimes of the century, at least in sociology and/or criminology."

Story 1

The first story comes from a well-known sociology professor, highly placed at a prestigious university. He had co-authored a paper with one of his graduate students, which he allowed a colleague to read pre-publication. The colleague reworded the paper and published it as his own work. Being a student and looking for a job, she was worried about retaliation and certainly did not want to ruffle any feathers that might hinder her future career. Moreover, the colleague who copied the paper has a reputation for being very nasty. Strangely but not atypically, the colleague who stole the work is rather powerful and has a lot of publications. He did not need another publication, and certainly had no need to steal others' work.

This communication has two bonus stories. The story-teller told of a case of infringement that most sociologists do not know about. Emile Durkheim, a founding father of Sociology, is known worldwide for the development of the sociological theory of anomie and innovative concepts such as social alienation. His famous book Suicide is a copy of Il Suicidio, written by the Italian scholar Enrico Morselli. As with my case, Durkheim borrowed ideas rather than appropriating the wording verbatim.

The second bonus story refers to a case that occurred at this professor's university, in which another professor misused a graduate student's term paper, which later appeared as a chapter in that professor's book. The student complained to the department head and was brushed off. She went to the dean, who told her that the university "had no rules against faculty plagiarizing, only against students doing so." The student went after the professor with a lawsuit, which was mooted when the professor died. The professor who told me this story introduced a resolution to the Faculty Senate affirming that the norms against plagiarism apply to faculty as well as students. The resolution passed despite opposition from the university administration. Why would the dean and the administration be opposed to such a resolution, or to a student questioning the theft of her work, I asked the story-teller. The probable answer: the publisher of the professor's book was the press of the professor's own university, and the university probably feared a lawsuit.

Story 2

The second story also comes from a respected professor, at an Ivy League university. Another author re-published not one but three articles that the story-teller had previously written and had let the copying author read as unpublished manuscripts. One of the articles was published in one of Sociology's most respected journals, American Sociological Review. Some of the copied work, as in the ASR piece, was merely similar to the original; some was identical to the original in structure, in the variables used, and in the argument. Faculty at the offending author's university told the story-teller that multiple people had complained about repeated misconduct by this professor. Two victims promised to file complaints if the story-teller would also file. The story-teller filed his complaint but the other victims didn't. The university's examining committee concluded that no action was warranted because the copying author's article used none of the original wording. The story-teller's conclusion: "If someone is a smart plagiarist, there will be no consequences."

Story 3

Story 3 comes from a graduate student who wrote a paper for the university newspaper. The newspaper did not publish the paper but did disseminate it on the web. His work was lifted, some of it word-for-word (a clear case of plagiarism), by a journalist for a major national newspaper. He consulted professors and fellow students, who told him it could be a coincidence, which seems unlikely given that part of the copied work was verbatim. I read both pieces and found the overlap remarkable.

I advised him to contact his university's legal department as well as the newspaper that published his work under another's name. He didn't, partly because his professors and colleagues recommended against it and partly because he was warned by a fellow scholar who works in the same field (sociology of sport) not to pursue the case. The fellow scholar told the student of the major media story about sociologist Elijah Anderson, whose work had appeared in similar form, re-written by a fellow sociologist in Anderson's then-department. The case was very, very ugly and is recounted in this book.

Story 4

This story is unique in that the story-teller is the one accused of copying someone else's work. He is a very famous sociologist, now retired but still active and well-regarded. He wrote a review of a book that had earlier been reviewed by a fellow sociologist; his review was similar to the earlier review in its points of comparison and summary. The scholar who wrote the original review complained to the story-teller, and the story-teller said he could see why the original reviewer thought he had been ripped off, as the reviews were very similar. However, as the story-teller told me, he may have subconsciously made similar comments and written a similar review, as he and the original reviewer study the same topics and have known each other moderately well for some time. Having said that, he found my case, as laid out in the American Sociological Association newsletter, Footnotes, suspicious.

Story 5

Story 5 comes from two assistant professors who wrote a review of a book that examined the topic in which they specialize. Their review was intended for publication in a sociology journal. In the course of reading the book for review, they "were surprised by the similarities" between this book and an earlier article that they had published in a sociology journal one year prior to the publication date of the book. "A number of the arguments, points, and citations that we had made appeared in [the 2010] book, but our article was not mentioned or cited." They asked for assurance that I keep their identity top secret since they are both, or were at the time of writing, untenured professors. But they did want to let me know "privately and off the record" that they had concerns about being ripped off. Their review never appeared in Contexts, an American Sociological Association publication, where it had been scheduled to appear.

Story 6

Story-teller 6 relates that, over the course of her career, she and her colleagues have had their original ideas "stolen." She interprets her experiences as having happened in two ways: "(1) the result of discussing an idea with colleagues to get some feedback and (2) an article that is under review gets replicated/ripped off by a reviewer." She regrets, as all serious scholars do, that we have to be watchful about discussing topics with colleagues because these colleagues sometimes take our research questions and run with them as though our ideas are their own. This dilemma of how to prevent the theft of our research when we also need outside interpretation of our works-in-progress is a significant problem. If we cannot discuss our ideas with others, our science quickly comes to a halt. Yet if we bounce our questions and findings off of unscrupulous others, we may find our work appearing under someone else's name.

This story-teller has

assisted colleagues and graduate students in laying the groundwork for a project (e.g., engaging in the background work or writing and [administering] grants) and when it comes time to write the papers, they have quickly done so on a few occasions without me, or done so with me but expressed that the article (resulting from my grant which they did not work on) was their idea.

In short, these experiences led her to realize that "discussing ideas with colleagues means you are taking a chance on losing that idea."

Story 7

This story comes from an important criminologist who relates an incident as told to him by a friend. The friend received a review copy of a new book with the same title as his own book. The book was "virtually a word-for-word copy of his book." Furious, he contacted a copyright attorney, who discouraged the aggrieved author from bringing a suit against the other author, explaining that these cases drag on forever and don't result in much satisfaction. Instead, the lawyer suggested that the victim write a firm letter to the imposter-author. The offending author apologized, admitted what she had done, and the two of them reached a settlement. The story-teller felt that "the rise of the Internet both makes it easier to plagiarize someone's work as well as to track down plagiarism" and expressed a strong wish to determine the scale of such violations.

Story 8

This story is about a new PhD on the job market who scheduled a university job interview. One of the faculty members at the potential employer asked to see a copy of a paper that the story-teller had co-written with a well-known sociologist. The story-teller sent the paper, only to learn that the person requesting it published a very similar paper soon afterward. The story-teller was discomfited not only by the theft but also by "the deception involved in obtaining my unpublished work."

Story 9

This story comes from a very well-known professor of law whose work, part of an award-winning book published by a prestigious university press, was stolen by a law student at a university different from the story-teller's. The professor discovered the theft by comparing his work with that of the student. About "a dozen printed pages of verbatim text" were stolen. "The student author had simply lifted a large block of my article and repackaged it as part of his article – without ever once citing to my piece." The professor contacted the dean of the student's law school; the dean apologized and removed a couple of credits from the student's record, but the student received his law degree anyway. The law review that had published the imposter paper printed a written apology and alerted readers to the deception. The student phoned the professor "to apologize for his failure to make proper attribution [but] he never admitted that he simply lifted the stuff."

The story doesn't end there. The student landed a job at a law firm but was fired for sloppy and derivative work. He sued the law firm, claiming that he was fired because he had AIDS. Hollywood made a blockbuster movie about this case.

Story 10

The story-teller here is a very well-respected professor of sociology, employed at a preeminent university, with many important books to his name. He read of my "victimization" (his word) and was "shocked that anything that extreme would occur." He went on to say that his shock is misplaced since he has "no illusions about the ethics of academics. . . . In fact, their lack of idealism and morality is something that often disturbs me in my daily life in the academic community." His own story goes back to his graduate student days when,

I nearly transferred to another university primarily to distance myself from a faculty member who was, in my judgment, systematically trying to steal ideas from me and (believe it or not) even published two papers that I wrote in his own name and without acknowledgment to me.

Index

Adams, D. 71, 98, 131
Agnew, R. 97–98, 109
anomie theory 2
Apple computers 10
Benson, M. 95–96
Ben-Yahuda, N. 94, 100–101
Bernier, G. 58
Black, D. 110
Braithwaite, J. 132
Broad, W. xiii, 2, 115
Clarke, R. 133
Clinard, M. 47, 80–81
Cohen, L. 99
Cooney, M. 102
counterproductive work behavior 49, 112, 121, 125
Cressey, D. 70
Davis, M. S. 31, 57–58, 67, 69–70, 75, 97
Dresser, R. 30, 73, 83
Durkheim, E. 2, 90
EPA violation of safe drinking water 12
external funding 16, 20–21
fabrication 30, 32–33
"failed science" 84
falsification 30, 32–33
Felson, M. 99–100
Friedman, P. J. 77
General Strain Theory 97–98
Google 3–5, 14
gossip as control 135–137
Hindelang, M. 97
history of scientific misconduct 3
impact factor 23
informal social control theory 133
international scholars 27
Internet 2–4
Keranen, L. 79, 84
Kornfeld, D. 38, 69, 113
legal moralism 90
Luckenbill, D. F. 95, 99
managed copying xiv
Martinson, B. 34, 50, 98, 111
Masel, J. 74
Matza, D. 95–96
Merton, R. K. 2, 19, 24, 94, 97, 108, 134
Messner, S. 62
Meyer, W. 58
Miller, K. 95, 99
Mills, C. W. 100
moral boundaries 100
moral time theory 101
narcissism 69, 125
net widening 88, 130
norms of science 19
Office of Scientific Integrity 34–36
online education 22
open access journals 23–24
opportunity theory 98–99
organizational justice 49
peer review 21–22
Phillips, S. 102
Pimple, K. 71, 98, 131
plagiarism 30–32
publications, importance of 19–20, 25
publish-or-perish 45–48
questionable research practices 34
rational choice theory 98–99
reintegrative shaming theory 132
Research Integrity Officers (RIOs) 110–111
resistance to professional ethics violations control xv, 29, 73–75, 90
retractions 5
Retraction Watch 38, 129
routine activities theory 98–99
Schmaus, W. 76, 84, 108
scholarly misconduct, definition of 29–30
Sellin, T. 57
seriousness of scientific offenses 79, 89
street crime, definition of 29, 90
substantial similarity xiii
Sutherland, E. 70, 80, 82, 88, 94
Sykes, G. 95–96
Takata corporation 11
techniques of neutralization 95–97
Trump, Donald 13–14
Trump, Melania 13
Tuskegee experiment 3
Vaugh, D. 56
Von Hirsh, A. 83
Wade, N. xiii, 2, 115
whistleblowers 77
white-collar crime 80, 82, 96
white-collar crime, definition 29, 90
Wikipedia 4
Zuckerman, H. 94, 108